Dec 02 18:14:49 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 02 18:14:49 crc restorecon[4683]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 18:14:49 crc restorecon[4683]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc 
restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc 
restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 
18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 
crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 
18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 18:14:49 crc 
restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc 
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc 
restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 
crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc 
restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:49 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 18:14:50 crc restorecon[4683]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc 
restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 18:14:50 crc restorecon[4683]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 18:14:50 crc restorecon[4683]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 02 18:14:50 crc kubenswrapper[4878]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 18:14:50 crc kubenswrapper[4878]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 02 18:14:50 crc kubenswrapper[4878]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 18:14:50 crc kubenswrapper[4878]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 02 18:14:50 crc kubenswrapper[4878]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 02 18:14:50 crc kubenswrapper[4878]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.711945 4878 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717127 4878 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717163 4878 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717175 4878 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717184 4878 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717194 4878 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717204 4878 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717214 4878 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717222 4878 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717230 4878 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717243 4878 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717277 4878 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717285 4878 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717293 4878 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717301 4878 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717309 4878 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717317 4878 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717324 4878 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717332 4878 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717357 4878 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717365 4878 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717373 4878 feature_gate.go:330] unrecognized feature gate: Example
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717380 4878 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717388 4878 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717395 4878 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717404 4878 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717411 4878 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717419 4878 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717427 4878 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717434 4878 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717442 4878 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717451 4878 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717458 4878 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717466 4878 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717473 4878 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717482 4878 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717490 4878 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717498 4878 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717506 4878 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717514 4878 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717522 4878 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717531 4878 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717539 4878 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717547 4878 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717555 4878 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717567 4878 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717577 4878 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717586 4878 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717594 4878 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717604 4878 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717613 4878 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717620 4878 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717628 4878 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717636 4878 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717644 4878 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717653 4878 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717660 4878 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717668 4878 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717678 4878 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717685 4878 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717693 4878 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717700 4878 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717708 4878 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717715 4878 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717723 4878 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717730 4878 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717740 4878 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717750 4878 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717758 4878 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717766 4878 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717776 4878 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.717785 4878 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718105 4878 flags.go:64] FLAG: --address="0.0.0.0"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718120 4878 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718134 4878 flags.go:64] FLAG: --anonymous-auth="true"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718145 4878 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718156 4878 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718165 4878 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718177 4878 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718188 4878 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718198 4878 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718207 4878 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718217 4878 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718226 4878 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718241 4878 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718277 4878 flags.go:64] FLAG: --cgroup-root=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718286 4878 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718295 4878 flags.go:64] FLAG: --client-ca-file=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718304 4878 flags.go:64] FLAG: --cloud-config=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718313 4878 flags.go:64] FLAG: --cloud-provider=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718321 4878 flags.go:64] FLAG: --cluster-dns="[]"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718334 4878 flags.go:64] FLAG: --cluster-domain=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718343 4878 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718353 4878 flags.go:64] FLAG: --config-dir=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718362 4878 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718371 4878 flags.go:64] FLAG: --container-log-max-files="5"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718382 4878 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718391 4878 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718400 4878 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718409 4878 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718419 4878 flags.go:64] FLAG: --contention-profiling="false"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718427 4878 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718439 4878 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718449 4878 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718459 4878 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718469 4878 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718478 4878 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718487 4878 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718496 4878 flags.go:64] FLAG: --enable-load-reader="false"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718506 4878 flags.go:64] FLAG: --enable-server="true"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718515 4878 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718526 4878 flags.go:64] FLAG: --event-burst="100"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718535 4878 flags.go:64] FLAG: --event-qps="50"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718544 4878 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718553 4878 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718562 4878 flags.go:64] FLAG: --eviction-hard=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718572 4878 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718581 4878 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718590 4878 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718599 4878 flags.go:64] FLAG: --eviction-soft=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718608 4878 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718619 4878 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718628 4878 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718638 4878 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718647 4878 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718657 4878 flags.go:64] FLAG: --fail-swap-on="true"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718666 4878 flags.go:64] FLAG: --feature-gates=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718686 4878 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718695 4878 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718704 4878 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718714 4878 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718723 4878 flags.go:64] FLAG: --healthz-port="10248"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718732 4878 flags.go:64] FLAG: --help="false"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718740 4878 flags.go:64] FLAG: --hostname-override=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718749 4878 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718759 4878 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718768 4878 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718777 4878 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718786 4878 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718794 4878 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718804 4878 flags.go:64] FLAG: --image-service-endpoint=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718812 4878 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718821 4878 flags.go:64] FLAG: --kube-api-burst="100"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718830 4878 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718838 4878 flags.go:64] FLAG: --kube-api-qps="50"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718848 4878 flags.go:64] FLAG: --kube-reserved=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718857 4878 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718865 4878 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718874 4878 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718883 4878 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718892 4878 flags.go:64] FLAG: --lock-file=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718901 4878 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718910 4878 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718919 4878 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718941 4878 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718949 4878 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718959 4878 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718967 4878 flags.go:64] FLAG: --logging-format="text"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718976 4878 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718986 4878 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.718994 4878 flags.go:64] FLAG: --manifest-url=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719003 4878 flags.go:64] FLAG: --manifest-url-header=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719015 4878 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719025 4878 flags.go:64] FLAG: --max-open-files="1000000"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719035 4878 flags.go:64] FLAG: --max-pods="110"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719045 4878 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719054 4878 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719064 4878 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719073 4878 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719082 4878 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719091 4878 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719100 4878 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719119 4878 flags.go:64] FLAG: --node-status-max-images="50"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719128 4878 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719137 4878 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719146 4878 flags.go:64] FLAG: --pod-cidr=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719155 4878 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719170 4878 flags.go:64] FLAG: --pod-manifest-path=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719179 4878 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719188 4878 flags.go:64] FLAG: --pods-per-core="0"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719197 4878 flags.go:64] FLAG: --port="10250"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719205 4878 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719214 4878 flags.go:64] FLAG: --provider-id=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719223 4878 flags.go:64] FLAG: --qos-reserved=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719232 4878 flags.go:64] FLAG: --read-only-port="10255"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719268 4878 flags.go:64] FLAG: --register-node="true"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719278 4878 flags.go:64] FLAG: --register-schedulable="true"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719287 4878 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719301 4878 flags.go:64] FLAG: --registry-burst="10"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719310 4878 flags.go:64] FLAG: --registry-qps="5"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719319 4878 flags.go:64] FLAG: --reserved-cpus=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719328 4878 flags.go:64] FLAG: --reserved-memory=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719339 4878 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719348 4878 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719358 4878 flags.go:64] FLAG: --rotate-certificates="false"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719367 4878 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719376 4878 flags.go:64] FLAG: --runonce="false"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719385 4878 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719394 4878 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719405 4878 flags.go:64] FLAG: --seccomp-default="false"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719414 4878 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719423 4878 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719432 4878 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719441 4878 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719450 4878 flags.go:64] FLAG: --storage-driver-password="root"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719459 4878 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719467 4878 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719476 4878 flags.go:64] FLAG: --storage-driver-user="root"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719485 4878 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719494 4878 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719503 4878 flags.go:64] FLAG: --system-cgroups=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719512 4878 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719525 4878 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719534 4878 flags.go:64] FLAG: --tls-cert-file=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719543 4878 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719555 4878 flags.go:64] FLAG: --tls-min-version=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719563 4878 flags.go:64] FLAG: --tls-private-key-file=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719572 4878 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719581 4878 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719590 4878 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719598 4878 flags.go:64] FLAG: --v="2"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719609 4878 flags.go:64] FLAG: --version="false"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719620 4878 flags.go:64] FLAG: --vmodule=""
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719630 4878 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.719640 4878 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.719856 4878 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.719868 4878 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.719877 4878 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.719885 4878 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.719896 4878 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.719905 4878 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.719914 4878 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.719922 4878 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.719930 4878 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.719939 4878 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.719949 4878 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.719960 4878 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.719970 4878 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.719978 4878 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.719987 4878 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.719995 4878 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720004 4878 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720012 4878 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720020 4878 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720028 4878 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720036 4878 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720043 4878 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720051 4878 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720059 4878 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720066 4878 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720076 4878 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720086 4878 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720095 4878 feature_gate.go:330] unrecognized feature gate: Example
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720103 4878 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720110 4878 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720118 4878 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720126 4878 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720133 4878 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720141 4878 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720150 4878 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720161 4878 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720169 4878 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720188 4878 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720198 4878 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720207 4878 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720217 4878 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720225 4878 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720239 4878 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720268 4878 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720277 4878 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720285 4878 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720299 4878 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720307 4878 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720315 4878 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720323 4878 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720331 4878 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720339 4878 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720346 4878 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720354 4878 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720362 4878 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720371 4878 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720379 4878 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720386 4878 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720394 4878 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720401 4878 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720441 4878 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720450 4878 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720458 4878 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720466 4878 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720474 4878 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720481 4878 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720489 4878 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720497 4878 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720505 4878 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720516 4878 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.720523 4878 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.720536 4878 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.734358 4878 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.734425 4878 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734526 4878 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734539
4878 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734546 4878 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734551 4878 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734555 4878 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734561 4878 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734565 4878 feature_gate.go:330] unrecognized feature gate: Example Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734569 4878 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734573 4878 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734577 4878 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734581 4878 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734584 4878 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734588 4878 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734593 4878 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734597 4878 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734600 4878 
feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734604 4878 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734609 4878 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734614 4878 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734618 4878 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734624 4878 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734631 4878 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734640 4878 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734645 4878 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734653 4878 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734659 4878 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734664 4878 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734670 4878 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734675 4878 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734681 4878 
feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734685 4878 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734691 4878 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734694 4878 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734699 4878 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734703 4878 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734707 4878 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734711 4878 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734715 4878 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734719 4878 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734723 4878 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734727 4878 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734731 4878 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734734 4878 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734738 4878 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode 
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734742 4878 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734746 4878 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734749 4878 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734754 4878 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734759 4878 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734764 4878 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734768 4878 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734772 4878 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734778 4878 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734781 4878 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734785 4878 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734789 4878 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734793 4878 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734796 4878 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734800 4878 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734803 4878 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734807 4878 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734810 4878 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734814 4878 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734817 4878 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734821 4878 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734825 4878 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734828 4878 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734834 4878 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734838 4878 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734844 4878 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.734849 4878 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.734859 4878 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735032 4878 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735041 4878 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735046 4878 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735050 4878 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735055 4878 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735059 4878 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735063 4878 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735067 4878 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735071 4878 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735074 4878 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735077 4878 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735081 4878 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735085 4878 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735090 4878 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735093 4878 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735097 4878 feature_gate.go:330] unrecognized feature gate: Example
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735100 4878 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735104 4878 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735108 4878 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735112 4878 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735115 4878 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735119 4878 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735122 4878 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735126 4878 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735130 4878 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735136 4878 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735140 4878 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735143 4878 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735147 4878 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735150 4878 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735154 4878 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735157 4878 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735161 4878 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735164 4878 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735167 4878 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735171 4878 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735174 4878 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735179 4878 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735183 4878 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735186 4878 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735190 4878 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735193 4878 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735197 4878 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735200 4878 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735204 4878 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735210 4878 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735216 4878 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735221 4878 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735225 4878 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735229 4878 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735233 4878 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735241 4878 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735245 4878 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735268 4878 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735274 4878 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735279 4878 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735283 4878 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735288 4878 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735292 4878 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735296 4878 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735300 4878 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735304 4878 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735308 4878 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735313 4878 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735317 4878 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735321 4878 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735325 4878 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735329 4878 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735333 4878 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735337 4878 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.735342 4878 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.735347 4878 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.735850 4878 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.739099 4878 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.739195 4878 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.739789 4878 server.go:997] "Starting client certificate rotation"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.739812 4878 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.740103 4878 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-10 02:02:45.392945858 +0000 UTC
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.740274 4878 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 919h47m54.652714583s for next certificate rotation
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.750229 4878 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.752482 4878 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.774839 4878 log.go:25] "Validated CRI v1 runtime API"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.802849 4878 log.go:25] "Validated CRI v1 image API"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.805614 4878 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.812079 4878 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-02-18-10-32-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.812144 4878 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}]
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.844374 4878 manager.go:217] Machine: {Timestamp:2025-12-02 18:14:50.83913269 +0000 UTC m=+0.528751611 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:9efa06d4-630a-45f6-aefd-96e578b112dc BootID:eec7cc2e-918f-4f16-92ea-f02d5b5d5466 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:6c:c9:dc Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:6c:c9:dc Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:fc:ef:df Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:a5:0b:40 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:62:96:49 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:a8:9f:08 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:7e:17:73:65:6b:2f Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:92:c1:d3:4a:17:4a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.845004 4878 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.845225 4878 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.845614 4878 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.845867 4878 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.845907 4878 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.846429 4878 topology_manager.go:138] "Creating topology manager with none policy"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.846455 4878 container_manager_linux.go:303] "Creating device plugin manager"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.846615 4878 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.846643 4878 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.846835 4878 state_mem.go:36] "Initialized new in-memory state store"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.847297 4878 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.848445 4878 kubelet.go:418] "Attempting to sync node with API server"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.848474 4878 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.848502 4878 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.848521 4878 kubelet.go:324] "Adding apiserver pod source"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.848536 4878 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.860181 4878 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.860804 4878 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.861701 4878 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused
Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.861721 4878 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused
Dec 02 18:14:50 crc kubenswrapper[4878]: E1202 18:14:50.861847 4878 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.159:6443: connect: connection refused" logger="UnhandledError"
Dec 02 18:14:50 crc kubenswrapper[4878]: E1202 18:14:50.861886 4878 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.159:6443: connect: connection refused" logger="UnhandledError"
Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.864463 4878 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 02 18:14:50 
crc kubenswrapper[4878]: I1202 18:14:50.865408 4878 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.865462 4878 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.865482 4878 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.865500 4878 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.865531 4878 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.865550 4878 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.865568 4878 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.865592 4878 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.865610 4878 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.865626 4878 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.865645 4878 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.865660 4878 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.865945 4878 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.866766 4878 server.go:1280] "Started kubelet" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 
18:14:50.867115 4878 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.867176 4878 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.867302 4878 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.868218 4878 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 02 18:14:50 crc systemd[1]: Started Kubernetes Kubelet. Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.870358 4878 server.go:460] "Adding debug handlers to kubelet server" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.871665 4878 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.871962 4878 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.872549 4878 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.872583 4878 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.871999 4878 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 13:24:40.762214437 +0000 UTC Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.872760 4878 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 02 18:14:50 crc kubenswrapper[4878]: E1202 18:14:50.871316 4878 event.go:368] "Unable to write event (may retry after 
sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.159:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d78a7bf7e4478 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 18:14:50.866697336 +0000 UTC m=+0.556316267,LastTimestamp:2025-12-02 18:14:50.866697336 +0000 UTC m=+0.556316267,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 02 18:14:50 crc kubenswrapper[4878]: E1202 18:14:50.873088 4878 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 18:14:50 crc kubenswrapper[4878]: E1202 18:14:50.873525 4878 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" interval="200ms" Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.877089 4878 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 02 18:14:50 crc kubenswrapper[4878]: E1202 18:14:50.877202 4878 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.159:6443: connect: connection refused" logger="UnhandledError" Dec 02 18:14:50 crc 
kubenswrapper[4878]: I1202 18:14:50.878224 4878 factory.go:153] Registering CRI-O factory Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.878485 4878 factory.go:221] Registration of the crio container factory successfully Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.878752 4878 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.878911 4878 factory.go:55] Registering systemd factory Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.879074 4878 factory.go:221] Registration of the systemd container factory successfully Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.879305 4878 factory.go:103] Registering Raw factory Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.879489 4878 manager.go:1196] Started watching for new ooms in manager Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.880754 4878 manager.go:319] Starting recovery of all containers Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893569 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893621 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893640 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893657 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893672 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893685 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893699 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893713 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893730 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893744 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893758 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893771 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893785 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893801 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893818 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893832 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893846 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893859 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893876 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893889 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893902 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893914 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893929 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893943 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893956 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893969 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.893992 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" 
seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894007 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894021 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894034 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894050 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894062 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894081 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894095 4878 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894108 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894121 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894134 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894147 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894163 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894175 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894189 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894202 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894215 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894228 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894268 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894289 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894311 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894328 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894343 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894362 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894376 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894390 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" 
seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894409 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894423 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894439 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894453 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894467 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894479 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 
18:14:50.894492 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894505 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894517 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894532 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894549 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894563 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894581 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894593 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894607 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894621 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894670 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894687 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894704 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894722 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894738 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894754 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894769 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894784 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894800 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" 
seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894814 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894831 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894846 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894861 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894876 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894890 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894923 
4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894937 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894949 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894963 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894975 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.894989 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895004 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895020 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895033 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895048 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895060 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895076 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895090 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895110 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895124 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895137 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895151 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895165 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895178 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" 
seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895193 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895207 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895236 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895293 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895315 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895335 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 
18:14:50.895351 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895365 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895380 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895395 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895409 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895423 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895437 4878 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895450 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895463 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895477 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895490 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895502 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895517 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895531 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895544 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895557 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.895572 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.897954 4878 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898021 4878 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898058 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898089 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898123 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898151 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898178 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898206 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898233 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898305 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898334 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898370 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898399 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898427 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" 
seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898458 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898488 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898514 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898547 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898647 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898716 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898750 4878 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898785 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898815 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898844 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898874 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898903 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898931 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898962 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.898993 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899021 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899051 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899079 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899110 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899145 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899173 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899202 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899232 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899331 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899363 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899391 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899421 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899449 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899482 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899515 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899545 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899574 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899603 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899633 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899660 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899693 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899721 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899748 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899780 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899809 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899836 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899865 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899893 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" 
seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899922 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899951 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.899982 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900012 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900044 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900074 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 02 
18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900105 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900135 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900169 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900198 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900225 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900296 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 
18:14:50.900326 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900357 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900388 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900415 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900455 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900482 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900515 4878 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900543 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900571 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900599 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900629 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900659 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900691 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900719 4878 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900745 4878 reconstruct.go:97] "Volume reconstruction finished" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.900767 4878 reconciler.go:26] "Reconciler: start to sync state" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.907750 4878 manager.go:324] Recovery completed Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.926230 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.931127 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.931170 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.931181 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.932724 4878 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.932740 4878 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.932761 4878 state_mem.go:36] "Initialized new in-memory state store" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.933726 4878 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.936258 4878 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.936397 4878 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 02 18:14:50 crc kubenswrapper[4878]: I1202 18:14:50.936496 4878 kubelet.go:2335] "Starting kubelet main sync loop" Dec 02 18:14:50 crc kubenswrapper[4878]: E1202 18:14:50.936613 4878 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 02 18:14:50 crc kubenswrapper[4878]: W1202 18:14:50.938002 4878 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 02 18:14:50 crc kubenswrapper[4878]: E1202 18:14:50.938055 4878 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.159:6443: connect: connection refused" logger="UnhandledError" Dec 02 18:14:50 crc kubenswrapper[4878]: E1202 18:14:50.973350 4878 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 18:14:51 crc kubenswrapper[4878]: E1202 18:14:51.036787 4878 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 02 18:14:51 crc kubenswrapper[4878]: E1202 18:14:51.073552 4878 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 18:14:51 crc kubenswrapper[4878]: E1202 18:14:51.074524 4878 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" interval="400ms" Dec 02 18:14:51 crc kubenswrapper[4878]: E1202 18:14:51.174582 4878 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 18:14:51 crc kubenswrapper[4878]: E1202 18:14:51.237359 4878 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 02 18:14:51 crc kubenswrapper[4878]: E1202 18:14:51.275725 4878 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 18:14:51 crc kubenswrapper[4878]: E1202 18:14:51.376900 4878 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 18:14:51 crc kubenswrapper[4878]: E1202 18:14:51.476321 4878 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" interval="800ms" Dec 02 18:14:51 crc kubenswrapper[4878]: E1202 18:14:51.477289 4878 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 18:14:51 crc kubenswrapper[4878]: E1202 18:14:51.578336 4878 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 18:14:51 crc kubenswrapper[4878]: E1202 18:14:51.638627 4878 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 02 18:14:51 crc kubenswrapper[4878]: E1202 18:14:51.679355 4878 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 18:14:51 crc kubenswrapper[4878]: E1202 
18:14:51.779894 4878 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 18:14:51 crc kubenswrapper[4878]: I1202 18:14:51.869080 4878 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 02 18:14:51 crc kubenswrapper[4878]: I1202 18:14:51.873227 4878 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 06:07:34.737232312 +0000 UTC Dec 02 18:14:51 crc kubenswrapper[4878]: I1202 18:14:51.873370 4878 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 659h52m42.863867664s for next certificate rotation Dec 02 18:14:51 crc kubenswrapper[4878]: I1202 18:14:51.878161 4878 policy_none.go:49] "None policy: Start" Dec 02 18:14:51 crc kubenswrapper[4878]: I1202 18:14:51.879731 4878 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 02 18:14:51 crc kubenswrapper[4878]: I1202 18:14:51.879803 4878 state_mem.go:35] "Initializing new in-memory state store" Dec 02 18:14:51 crc kubenswrapper[4878]: E1202 18:14:51.881128 4878 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 18:14:51 crc kubenswrapper[4878]: I1202 18:14:51.951093 4878 manager.go:334] "Starting Device Plugin manager" Dec 02 18:14:51 crc kubenswrapper[4878]: I1202 18:14:51.951454 4878 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 02 18:14:51 crc kubenswrapper[4878]: I1202 18:14:51.951500 4878 server.go:79] "Starting device plugin registration server" Dec 02 18:14:51 crc kubenswrapper[4878]: I1202 18:14:51.952288 4878 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 02 18:14:51 crc 
kubenswrapper[4878]: I1202 18:14:51.952326 4878 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 02 18:14:51 crc kubenswrapper[4878]: I1202 18:14:51.952540 4878 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 02 18:14:51 crc kubenswrapper[4878]: I1202 18:14:51.952702 4878 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 02 18:14:51 crc kubenswrapper[4878]: I1202 18:14:51.952712 4878 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 02 18:14:51 crc kubenswrapper[4878]: E1202 18:14:51.963153 4878 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 02 18:14:51 crc kubenswrapper[4878]: W1202 18:14:51.979371 4878 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 02 18:14:51 crc kubenswrapper[4878]: E1202 18:14:51.979473 4878 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.159:6443: connect: connection refused" logger="UnhandledError" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.053094 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.054721 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.054836 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.054891 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.054955 4878 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 18:14:52 crc kubenswrapper[4878]: E1202 18:14:52.055416 4878 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.159:6443: connect: connection refused" node="crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.256423 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.257609 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.257644 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.257654 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.257677 4878 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 18:14:52 crc kubenswrapper[4878]: E1202 18:14:52.258173 4878 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.159:6443: connect: connection refused" node="crc" Dec 02 18:14:52 crc kubenswrapper[4878]: E1202 18:14:52.277725 4878 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection 
refused" interval="1.6s" Dec 02 18:14:52 crc kubenswrapper[4878]: W1202 18:14:52.294173 4878 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 02 18:14:52 crc kubenswrapper[4878]: E1202 18:14:52.294310 4878 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.159:6443: connect: connection refused" logger="UnhandledError" Dec 02 18:14:52 crc kubenswrapper[4878]: W1202 18:14:52.300557 4878 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 02 18:14:52 crc kubenswrapper[4878]: E1202 18:14:52.300596 4878 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.159:6443: connect: connection refused" logger="UnhandledError" Dec 02 18:14:52 crc kubenswrapper[4878]: W1202 18:14:52.404839 4878 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 02 18:14:52 crc kubenswrapper[4878]: E1202 18:14:52.405170 4878 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: 
failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.159:6443: connect: connection refused" logger="UnhandledError" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.439411 4878 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.439534 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.440828 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.440984 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.441055 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.441243 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.441419 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.441468 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.442703 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.442761 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.442780 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.442703 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.442864 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.442882 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.443073 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.443264 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.443309 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.444024 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.444057 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.444088 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.444367 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.444546 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.444666 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.444876 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.445007 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.445052 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.446012 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.446038 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.446047 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.446344 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.446452 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.446531 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.446734 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.446862 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.446902 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.447840 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.447873 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.447887 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.448218 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.448254 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.448265 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.448412 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.448433 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.449331 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.449467 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.449556 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.521123 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.521341 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.521449 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.521542 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.521633 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.521745 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.521826 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.521931 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.522010 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.522097 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.522190 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.522303 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.522395 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.522484 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.522576 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.624312 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.624384 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.624417 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.624448 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.624476 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.624500 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.624525 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.624552 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.624577 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 
18:14:52.624600 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.624628 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.624654 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.624674 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.624694 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.624710 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.625160 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.625222 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.625287 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.625314 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.625347 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.625430 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.625450 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.625510 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.625485 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.625545 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.625570 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.625592 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.625599 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.625633 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.625765 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.659126 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.660628 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 
18:14:52.660708 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.660728 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.660788 4878 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 18:14:52 crc kubenswrapper[4878]: E1202 18:14:52.661667 4878 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.159:6443: connect: connection refused" node="crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.774429 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.792414 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.799616 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.812983 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.818436 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 18:14:52 crc kubenswrapper[4878]: W1202 18:14:52.841989 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-cc122eeb299937c69348606c5a9ac7458016ef6bb78496e5cc01c6b092389362 WatchSource:0}: Error finding container cc122eeb299937c69348606c5a9ac7458016ef6bb78496e5cc01c6b092389362: Status 404 returned error can't find the container with id cc122eeb299937c69348606c5a9ac7458016ef6bb78496e5cc01c6b092389362 Dec 02 18:14:52 crc kubenswrapper[4878]: W1202 18:14:52.847439 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-f2e293b7b196522f7ed9da2f35bc9521e5b3c69e17f4233525608682c064de87 WatchSource:0}: Error finding container f2e293b7b196522f7ed9da2f35bc9521e5b3c69e17f4233525608682c064de87: Status 404 returned error can't find the container with id f2e293b7b196522f7ed9da2f35bc9521e5b3c69e17f4233525608682c064de87 Dec 02 18:14:52 crc kubenswrapper[4878]: W1202 18:14:52.850520 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-79861a5380306ace232d270c6ea7a5926b68695e0466f401df7b07f2baec0ea9 WatchSource:0}: Error finding container 79861a5380306ace232d270c6ea7a5926b68695e0466f401df7b07f2baec0ea9: Status 404 returned error can't find the container with id 79861a5380306ace232d270c6ea7a5926b68695e0466f401df7b07f2baec0ea9 Dec 02 18:14:52 crc kubenswrapper[4878]: W1202 18:14:52.852349 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-c55f99eff1ffc35d07b9d956dc1444536e6921f9e4a8928a7ee41ded098b2ef1 
WatchSource:0}: Error finding container c55f99eff1ffc35d07b9d956dc1444536e6921f9e4a8928a7ee41ded098b2ef1: Status 404 returned error can't find the container with id c55f99eff1ffc35d07b9d956dc1444536e6921f9e4a8928a7ee41ded098b2ef1 Dec 02 18:14:52 crc kubenswrapper[4878]: W1202 18:14:52.853634 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-ffaad4ad1c772538c3b2f7e71f7558c1d56770cfab82b6620718e3f9f362177a WatchSource:0}: Error finding container ffaad4ad1c772538c3b2f7e71f7558c1d56770cfab82b6620718e3f9f362177a: Status 404 returned error can't find the container with id ffaad4ad1c772538c3b2f7e71f7558c1d56770cfab82b6620718e3f9f362177a Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.868555 4878 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.943451 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f2e293b7b196522f7ed9da2f35bc9521e5b3c69e17f4233525608682c064de87"} Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.945301 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cc122eeb299937c69348606c5a9ac7458016ef6bb78496e5cc01c6b092389362"} Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.947678 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ffaad4ad1c772538c3b2f7e71f7558c1d56770cfab82b6620718e3f9f362177a"} Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.948756 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c55f99eff1ffc35d07b9d956dc1444536e6921f9e4a8928a7ee41ded098b2ef1"} Dec 02 18:14:52 crc kubenswrapper[4878]: I1202 18:14:52.949943 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"79861a5380306ace232d270c6ea7a5926b68695e0466f401df7b07f2baec0ea9"} Dec 02 18:14:53 crc kubenswrapper[4878]: I1202 18:14:53.462851 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:53 crc kubenswrapper[4878]: I1202 18:14:53.465481 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:53 crc kubenswrapper[4878]: I1202 18:14:53.465537 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:53 crc kubenswrapper[4878]: I1202 18:14:53.465554 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:53 crc kubenswrapper[4878]: I1202 18:14:53.465593 4878 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 18:14:53 crc kubenswrapper[4878]: E1202 18:14:53.466053 4878 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.159:6443: connect: connection refused" node="crc" Dec 02 18:14:53 crc kubenswrapper[4878]: I1202 18:14:53.868475 4878 csi_plugin.go:884] Failed to contact API server 
when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 02 18:14:53 crc kubenswrapper[4878]: E1202 18:14:53.879039 4878 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" interval="3.2s" Dec 02 18:14:54 crc kubenswrapper[4878]: W1202 18:14:54.213106 4878 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 02 18:14:54 crc kubenswrapper[4878]: E1202 18:14:54.213284 4878 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.159:6443: connect: connection refused" logger="UnhandledError" Dec 02 18:14:54 crc kubenswrapper[4878]: W1202 18:14:54.448684 4878 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 02 18:14:54 crc kubenswrapper[4878]: E1202 18:14:54.448777 4878 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.159:6443: connect: connection refused" logger="UnhandledError" Dec 02 18:14:54 crc 
kubenswrapper[4878]: W1202 18:14:54.761227 4878 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 02 18:14:54 crc kubenswrapper[4878]: E1202 18:14:54.761727 4878 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.159:6443: connect: connection refused" logger="UnhandledError" Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.869178 4878 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.956986 4878 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="974294eb7de37152669af70564facc3363b144bbafbae6cc870ade3c14d7bb77" exitCode=0 Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.957141 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.957448 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"974294eb7de37152669af70564facc3363b144bbafbae6cc870ade3c14d7bb77"} Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.959029 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:54 crc kubenswrapper[4878]: 
I1202 18:14:54.959084 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.959103 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.962083 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.962138 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb"} Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.962081 4878 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb" exitCode=0 Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.969467 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.969531 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.969545 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.971495 4878 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5" exitCode=0 Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.971566 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5"} Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.971655 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.973022 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.973058 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.973069 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.974634 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05"} Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.974672 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418"} Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.976331 4878 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9" exitCode=0 Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.976359 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9"} Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.976434 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.977284 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.977305 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.977315 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.979762 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.980978 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.981002 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:54 crc kubenswrapper[4878]: I1202 18:14:54.981016 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:55 crc kubenswrapper[4878]: I1202 18:14:55.066749 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:55 crc kubenswrapper[4878]: I1202 18:14:55.068273 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:55 crc kubenswrapper[4878]: I1202 18:14:55.068334 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 18:14:55 crc kubenswrapper[4878]: I1202 18:14:55.068350 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:55 crc kubenswrapper[4878]: I1202 18:14:55.068398 4878 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 18:14:55 crc kubenswrapper[4878]: E1202 18:14:55.069133 4878 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.159:6443: connect: connection refused" node="crc" Dec 02 18:14:55 crc kubenswrapper[4878]: W1202 18:14:55.317802 4878 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 02 18:14:55 crc kubenswrapper[4878]: E1202 18:14:55.317890 4878 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.159:6443: connect: connection refused" logger="UnhandledError" Dec 02 18:14:55 crc kubenswrapper[4878]: I1202 18:14:55.868648 4878 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 02 18:14:55 crc kubenswrapper[4878]: I1202 18:14:55.982115 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab"} Dec 02 18:14:55 crc kubenswrapper[4878]: I1202 
18:14:55.982168 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc"} Dec 02 18:14:55 crc kubenswrapper[4878]: I1202 18:14:55.982181 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003"} Dec 02 18:14:55 crc kubenswrapper[4878]: I1202 18:14:55.982192 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864"} Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.005103 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca"} Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.005165 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31"} Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.005198 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.006345 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.006375 4878 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.006385 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.007817 4878 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64" exitCode=0 Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.007884 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64"} Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.007990 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.008598 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.008618 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.008627 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.011501 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.011474 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"846bca8513947813249de7f7ae6bab2da477bde17446a4b3714334f57895f121"} Dec 
02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.012414 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.012453 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.012472 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.015385 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"14dd0228ee9bc7129dabf56c5d37c7ac0580486a2708da30d6e0bc61c255abfd"} Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.015439 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"180a0fe0f8e58dd223d1fb79b13e121175003f1658276dfb8182293dc67cfd93"} Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.015453 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3c85ca57d995e9ffb93b2985064dfaacb2e40fbfd91a43facef1bf864b1d5d08"} Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.015561 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.016316 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.016343 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 18:14:56 crc kubenswrapper[4878]: I1202 18:14:56.016353 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.020483 4878 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac" exitCode=0 Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.020585 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac"} Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.020659 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.021789 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.021816 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.021828 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.027656 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.028447 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.028534 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.028553 4878 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.028539 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba"} Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.028760 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.029066 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.029118 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.029136 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.029996 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.030070 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.030016 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.030093 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.030130 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:57 
crc kubenswrapper[4878]: I1202 18:14:57.030154 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.030194 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.030272 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.030306 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:57 crc kubenswrapper[4878]: I1202 18:14:57.634972 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.035817 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7"} Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.035880 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62"} Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.035897 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb"} Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.035910 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64"} Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.035942 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.035974 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.037210 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.037287 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.037302 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.037229 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.037373 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.037390 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.269528 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.271134 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.271182 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.271194 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.271251 4878 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.589039 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.589299 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.591363 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.591418 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:58 crc kubenswrapper[4878]: I1202 18:14:58.591437 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:59 crc kubenswrapper[4878]: I1202 18:14:59.043818 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15"} Dec 02 18:14:59 crc kubenswrapper[4878]: I1202 18:14:59.043937 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:59 crc kubenswrapper[4878]: I1202 18:14:59.043974 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:14:59 crc kubenswrapper[4878]: I1202 18:14:59.045335 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 02 18:14:59 crc kubenswrapper[4878]: I1202 18:14:59.045365 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:59 crc kubenswrapper[4878]: I1202 18:14:59.045376 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:59 crc kubenswrapper[4878]: I1202 18:14:59.046367 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:14:59 crc kubenswrapper[4878]: I1202 18:14:59.046404 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:14:59 crc kubenswrapper[4878]: I1202 18:14:59.046418 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:14:59 crc kubenswrapper[4878]: I1202 18:14:59.060028 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:15:00 crc kubenswrapper[4878]: I1202 18:15:00.046935 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:15:00 crc kubenswrapper[4878]: I1202 18:15:00.046944 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:15:00 crc kubenswrapper[4878]: I1202 18:15:00.048184 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:00 crc kubenswrapper[4878]: I1202 18:15:00.048220 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:00 crc kubenswrapper[4878]: I1202 18:15:00.048234 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:00 crc kubenswrapper[4878]: I1202 18:15:00.048417 4878 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:00 crc kubenswrapper[4878]: I1202 18:15:00.048460 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:00 crc kubenswrapper[4878]: I1202 18:15:00.048478 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:00 crc kubenswrapper[4878]: I1202 18:15:00.068347 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:15:00 crc kubenswrapper[4878]: I1202 18:15:00.583988 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 18:15:00 crc kubenswrapper[4878]: I1202 18:15:00.584229 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:15:00 crc kubenswrapper[4878]: I1202 18:15:00.585866 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:00 crc kubenswrapper[4878]: I1202 18:15:00.586790 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:00 crc kubenswrapper[4878]: I1202 18:15:00.586992 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:01 crc kubenswrapper[4878]: I1202 18:15:01.050030 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:15:01 crc kubenswrapper[4878]: I1202 18:15:01.051015 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:01 crc kubenswrapper[4878]: I1202 18:15:01.051066 4878 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:01 crc kubenswrapper[4878]: I1202 18:15:01.051085 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:01 crc kubenswrapper[4878]: E1202 18:15:01.963276 4878 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 02 18:15:03 crc kubenswrapper[4878]: I1202 18:15:03.017591 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 02 18:15:03 crc kubenswrapper[4878]: I1202 18:15:03.017889 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:15:03 crc kubenswrapper[4878]: I1202 18:15:03.019669 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:03 crc kubenswrapper[4878]: I1202 18:15:03.019707 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:03 crc kubenswrapper[4878]: I1202 18:15:03.019758 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:03 crc kubenswrapper[4878]: I1202 18:15:03.584138 4878 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 02 18:15:03 crc kubenswrapper[4878]: I1202 18:15:03.584286 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 18:15:03 crc kubenswrapper[4878]: I1202 18:15:03.711181 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 18:15:03 crc kubenswrapper[4878]: I1202 18:15:03.711463 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:15:03 crc kubenswrapper[4878]: I1202 18:15:03.712916 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:03 crc kubenswrapper[4878]: I1202 18:15:03.712958 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:03 crc kubenswrapper[4878]: I1202 18:15:03.712975 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:03 crc kubenswrapper[4878]: I1202 18:15:03.830327 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 02 18:15:03 crc kubenswrapper[4878]: I1202 18:15:03.830524 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:15:03 crc kubenswrapper[4878]: I1202 18:15:03.834805 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:03 crc kubenswrapper[4878]: I1202 18:15:03.834875 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:03 crc kubenswrapper[4878]: I1202 18:15:03.834893 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:04 crc kubenswrapper[4878]: I1202 18:15:04.477912 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
Dec 02 18:15:04 crc kubenswrapper[4878]: I1202 18:15:04.478086 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:15:04 crc kubenswrapper[4878]: I1202 18:15:04.479637 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:04 crc kubenswrapper[4878]: I1202 18:15:04.479733 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:04 crc kubenswrapper[4878]: I1202 18:15:04.479762 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:04 crc kubenswrapper[4878]: I1202 18:15:04.486483 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 18:15:05 crc kubenswrapper[4878]: I1202 18:15:05.060895 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:15:05 crc kubenswrapper[4878]: I1202 18:15:05.062567 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:05 crc kubenswrapper[4878]: I1202 18:15:05.062608 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:05 crc kubenswrapper[4878]: I1202 18:15:05.062618 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:05 crc kubenswrapper[4878]: I1202 18:15:05.068307 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 18:15:06 crc kubenswrapper[4878]: I1202 18:15:06.064454 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:15:06 crc kubenswrapper[4878]: I1202 18:15:06.065731 
4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:06 crc kubenswrapper[4878]: I1202 18:15:06.065796 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:06 crc kubenswrapper[4878]: I1202 18:15:06.065820 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:06 crc kubenswrapper[4878]: I1202 18:15:06.869622 4878 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 02 18:15:07 crc kubenswrapper[4878]: E1202 18:15:07.080615 4878 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Dec 02 18:15:08 crc kubenswrapper[4878]: I1202 18:15:08.003218 4878 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 02 18:15:08 crc kubenswrapper[4878]: I1202 18:15:08.003336 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 02 18:15:08 crc kubenswrapper[4878]: I1202 18:15:08.020445 4878 patch_prober.go:28] interesting 
pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Dec 02 18:15:08 crc kubenswrapper[4878]: I1202 18:15:08.020526 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Dec 02 18:15:10 crc kubenswrapper[4878]: I1202 18:15:10.078359 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 18:15:10 crc kubenswrapper[4878]: I1202 18:15:10.078532 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 18:15:10 crc kubenswrapper[4878]: I1202 18:15:10.079578 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 18:15:10 crc kubenswrapper[4878]: I1202 18:15:10.079615 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 18:15:10 crc kubenswrapper[4878]: I1202 18:15:10.079625 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 18:15:10 crc kubenswrapper[4878]: I1202 18:15:10.087596 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 18:15:11 crc kubenswrapper[4878]: I1202 18:15:11.078667 4878 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 02 18:15:11 crc kubenswrapper[4878]: I1202 18:15:11.078743 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 18:15:11 crc kubenswrapper[4878]: I1202 18:15:11.080141 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 18:15:11 crc kubenswrapper[4878]: I1202 18:15:11.080222 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 18:15:11 crc kubenswrapper[4878]: I1202 18:15:11.080460 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 18:15:11 crc kubenswrapper[4878]: E1202 18:15:11.963423 4878 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.009820 4878 trace.go:236] Trace[1446562519]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 18:14:59.815) (total time: 13194ms):
Dec 02 18:15:13 crc kubenswrapper[4878]: Trace[1446562519]: ---"Objects listed" error: 13194ms (18:15:13.009)
Dec 02 18:15:13 crc kubenswrapper[4878]: Trace[1446562519]: [13.19432756s] [13.19432756s] END
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.009852 4878 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.010575 4878 trace.go:236] Trace[294053289]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 18:14:59.690) (total time: 13320ms):
Dec 02 18:15:13 crc kubenswrapper[4878]: Trace[294053289]: ---"Objects listed" error: 13320ms (18:15:13.010)
Dec 02 18:15:13 crc kubenswrapper[4878]: Trace[294053289]: [13.320182733s] [13.320182733s] END
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.010617 4878 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.012775 4878 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.013641 4878 trace.go:236] Trace[1942565866]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 18:15:00.963) (total time: 12050ms):
Dec 02 18:15:13 crc kubenswrapper[4878]: Trace[1942565866]: ---"Objects listed" error: 12050ms (18:15:13.013)
Dec 02 18:15:13 crc kubenswrapper[4878]: Trace[1942565866]: [12.050563428s] [12.050563428s] END
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.013660 4878 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 02 18:15:13 crc kubenswrapper[4878]: E1202 18:15:13.013999 4878 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.015921 4878 trace.go:236] Trace[194521284]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 18:14:59.201) (total time: 13813ms):
Dec 02 18:15:13 crc kubenswrapper[4878]: Trace[194521284]: ---"Objects listed" error: 13813ms (18:15:13.015)
Dec 02 18:15:13 crc kubenswrapper[4878]: Trace[194521284]: [13.813872412s] [13.813872412s] END
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.015965 4878 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.038830 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.040110 4878 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42396->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.040157 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42396->192.168.126.11:17697: read: connection reset by peer"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.040351 4878 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.040371 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.052621 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.450289 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.455925 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.866996 4878 apiserver.go:52] "Watching apiserver"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.871065 4878 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.871621 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-etcd/etcd-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"]
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.872037 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.872135 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 18:15:13 crc kubenswrapper[4878]: E1202 18:15:13.872310 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.872586 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.872609 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.872682 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 18:15:13 crc kubenswrapper[4878]: E1202 18:15:13.872737 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 18:15:13 crc kubenswrapper[4878]: E1202 18:15:13.872729 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.872598 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.873739 4878 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.877564 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.878001 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.878674 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.880525 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.880769 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.881072 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.883521 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.883753 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.885640 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919100 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919156 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919183 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919207 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919274 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919303 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919324 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919344 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919365 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919385 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919407 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919429 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919450 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919502 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919504 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919524 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919591 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919618 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919641 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919662 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919685 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919646 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919909 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919708 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.919981 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.920033 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.920086 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.920276 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.920079 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922331 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922360 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922387 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922411 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922436 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922460 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922482 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922503 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922526 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922547 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922568 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922588 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922614 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922634 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922657 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922688 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922709 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922728 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922748 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922786 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922807 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922826 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922854 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922878 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922910 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922931 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922953 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922975 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922997 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923019 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923044 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923073 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923100 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923124 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923148 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923171 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923194 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923216 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923269 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923294 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923315 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923341 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923393 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 02 18:15:13 crc kubenswrapper[4878]:
I1202 18:15:13.923416 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923439 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923462 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923483 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923507 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923534 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923557 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923583 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.920457 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.920486 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.920570 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.920637 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.920773 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.920969 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.920979 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.921117 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.921173 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.921353 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.921398 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.921550 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.921645 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.921752 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.921944 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.921977 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922043 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922134 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922202 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.922139 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923318 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923576 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923562 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923815 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.924048 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.924141 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.924698 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.924823 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.924924 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.925277 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.926065 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.926068 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.926306 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.926687 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.926776 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.927013 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.927033 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.927156 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.927432 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.927469 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.923607 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.927553 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.927728 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.927761 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.927781 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.927834 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.927866 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.927905 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.927938 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.927971 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.927981 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928003 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928038 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928069 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928100 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928135 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928201 4878 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928564 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928596 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928597 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928624 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928654 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928688 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928712 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928735 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928759 4878 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928786 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928808 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928831 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928837 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928906 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928931 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928956 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.928979 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929001 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929026 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929047 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929072 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929130 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929160 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929184 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 
18:15:13.929207 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929262 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929285 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929310 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929334 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929356 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929382 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929404 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929424 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929449 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929476 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " 
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929497 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929522 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929543 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929564 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929604 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929639 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929666 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929690 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929711 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929734 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929755 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " 
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929776 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929799 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929820 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929841 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929865 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929888 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929911 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929935 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929957 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929977 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930099 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930124 4878 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930145 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930208 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930233 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930279 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930304 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930327 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930349 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930371 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930397 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930420 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930443 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930507 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930531 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930556 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930579 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930600 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " 
Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930622 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930644 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930667 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930692 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930714 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930738 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930761 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930782 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930805 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930826 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930848 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: 
\"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930869 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930890 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930912 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930936 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930957 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930979 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.931001 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.931023 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.931045 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.931067 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.931090 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.931112 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.931135 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.931158 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.931182 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.931212 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929045 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929573 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929701 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929720 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.929809 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930024 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930040 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930060 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930260 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930355 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930358 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930467 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930631 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930677 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.930907 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.931494 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.931585 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.931748 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.932149 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.932469 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.932836 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.932858 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.932966 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.933164 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.933365 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.933460 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.933974 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.934320 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.933912 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.934912 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.934944 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.934971 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.934996 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935023 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935047 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935071 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935118 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935152 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935178 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935206 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935228 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935298 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935329 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935352 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935374 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935400 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935425 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935450 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 
18:15:13.935475 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935499 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935568 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935587 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935600 4878 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935614 4878 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 
18:15:13.935628 4878 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935644 4878 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935656 4878 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935670 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935682 4878 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935694 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935710 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935721 4878 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935732 4878 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935744 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935757 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935769 4878 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935781 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935793 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935806 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935792 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935823 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935907 4878 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.935943 4878 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936076 4878 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936109 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod 
"fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936118 4878 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936155 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936188 4878 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936217 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936281 4878 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936315 4878 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936347 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936379 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936411 4878 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936440 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936470 4878 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936502 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936543 4878 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936570 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on 
node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936584 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936596 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936625 4878 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936650 4878 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936675 4878 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936700 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936728 4878 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936760 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936788 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936817 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936847 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936879 4878 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936909 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936937 4878 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node 
\"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936964 4878 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936982 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.936990 4878 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937028 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937045 4878 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937058 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937071 4878 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937083 4878 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937098 4878 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937111 4878 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937124 4878 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937137 4878 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937149 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937161 4878 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node 
\"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937173 4878 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937184 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937197 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937210 4878 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937222 4878 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937250 4878 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937264 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937277 4878 reconciler_common.go:293] "Volume detached for 
volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937288 4878 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937299 4878 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937313 4878 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937325 4878 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937337 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937393 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937405 4878 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937508 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937574 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.937648 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.938106 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.938186 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.939061 4878 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.940727 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 18:15:13 crc kubenswrapper[4878]: E1202 18:15:13.940952 4878 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 18:15:13 crc kubenswrapper[4878]: E1202 18:15:13.941047 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:14.441005097 +0000 UTC m=+24.130624018 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.941991 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.942035 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.942073 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.942445 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.942573 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.942787 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.943058 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.943095 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.943310 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.943412 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.943658 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.943851 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.944172 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.944627 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.945048 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.945079 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.945057 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.945747 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.946147 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.946213 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.946363 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.946575 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.946676 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.947638 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.947659 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.947893 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.947988 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:13 crc kubenswrapper[4878]: E1202 18:15:13.948046 4878 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.948086 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.950901 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: E1202 18:15:13.951130 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:14.450750896 +0000 UTC m=+24.140369857 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 18:15:13 crc kubenswrapper[4878]: E1202 18:15:13.965399 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:15:14.465370401 +0000 UTC m=+24.154989292 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:15:13 crc kubenswrapper[4878]: E1202 18:15:13.967179 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 18:15:13 crc kubenswrapper[4878]: E1202 18:15:13.967209 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 18:15:13 crc kubenswrapper[4878]: E1202 18:15:13.967227 4878 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:13 crc kubenswrapper[4878]: E1202 18:15:13.967344 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:14.467320114 +0000 UTC m=+24.156938995 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.971423 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.971957 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.973030 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.973302 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.973604 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.973994 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.974190 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.974561 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 18:15:13 crc kubenswrapper[4878]: E1202 18:15:13.974714 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 18:15:13 crc kubenswrapper[4878]: E1202 18:15:13.974737 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 18:15:13 crc kubenswrapper[4878]: E1202 18:15:13.974747 4878 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:13 crc kubenswrapper[4878]: E1202 18:15:13.974789 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:14.474775481 +0000 UTC m=+24.164394362 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.974795 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.975060 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.975767 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.975834 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.976094 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.976732 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.977396 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.977428 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.977662 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.978315 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.978482 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.978492 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.978785 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.978969 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.979116 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.979446 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.979754 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.980119 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.980345 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.980635 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.980943 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.981014 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.981192 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.981288 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.987012 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.987621 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.988059 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.988414 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.988488 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.988977 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.989131 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.989308 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.989575 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.989751 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.989930 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.985177 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.990455 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.990749 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.992551 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.992649 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.992833 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.992985 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.994022 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.994088 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.994538 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.994630 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.994920 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.995144 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.995361 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.995614 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.996129 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.996337 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.996610 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.998334 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.998850 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:13 crc kubenswrapper[4878]: I1202 18:15:13.999657 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.000880 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.001153 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.001862 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.002165 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.002223 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.002436 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.003110 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.007796 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.013340 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.013783 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.013529 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.013550 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.013858 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.014120 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.016065 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.016069 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.016109 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.017040 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.017039 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.023216 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.033520 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038157 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038200 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038291 4878 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038310 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038324 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038338 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: 
I1202 18:15:14.038354 4878 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038368 4878 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038381 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038393 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038410 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038423 4878 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038374 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038473 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038444 4878 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038593 4878 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038605 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" 
DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038618 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038632 4878 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038644 4878 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038655 4878 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038666 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038677 4878 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038692 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038705 4878 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038718 4878 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038756 4878 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038788 4878 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038796 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038808 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038830 4878 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038848 4878 reconciler_common.go:293] "Volume detached for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038907 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038979 4878 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.038998 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039016 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039035 4878 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039080 4878 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039098 4878 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039118 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039135 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039152 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039169 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039187 4878 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039205 4878 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039221 4878 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039311 4878 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039332 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039350 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039399 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039417 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039434 4878 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039451 4878 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039467 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039484 4878 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039501 4878 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039518 4878 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039534 4878 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039553 4878 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039571 4878 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: 
I1202 18:15:14.039587 4878 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039604 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039621 4878 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039637 4878 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039655 4878 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039672 4878 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039689 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039707 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039724 4878 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039729 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039741 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039791 4878 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039812 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039832 4878 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039851 4878 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039868 4878 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039886 4878 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039901 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039917 4878 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039933 4878 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039949 4878 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039964 4878 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039981 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.039998 4878 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040014 4878 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040031 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040049 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040066 4878 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040082 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040098 4878 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040114 4878 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040130 4878 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040145 4878 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040162 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040179 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040197 4878 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 
18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040213 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040229 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040295 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040312 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040329 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040346 4878 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040363 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc 
kubenswrapper[4878]: I1202 18:15:14.040379 4878 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040396 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040412 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040427 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040443 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040460 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040666 4878 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040681 4878 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040697 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040712 4878 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040728 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040744 4878 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040759 4878 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040775 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040791 4878 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.040807 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.045846 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.047314 4878 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.047456 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.052912 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.058684 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.069656 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.084504 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.091488 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.100631 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.102339 4878 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba" exitCode=255 Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.102447 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba"} Dec 02 18:15:14 crc kubenswrapper[4878]: E1202 18:15:14.123787 4878 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Dec 02 18:15:14 crc kubenswrapper[4878]: E1202 18:15:14.126014 4878 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.130473 4878 scope.go:117] "RemoveContainer" containerID="9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.131186 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.132645 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.142156 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc 
kubenswrapper[4878]: I1202 18:15:14.142475 4878 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.142554 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.169036 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126
.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.187337 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.200443 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.208433 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.212738 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.221633 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.222044 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:14 crc kubenswrapper[4878]: W1202 18:15:14.227817 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-47119f8d92d3a6d3d70cbcfc0c1bd8ebb7c2ad7808267f4a4919ed9a2c528e6d WatchSource:0}: Error finding container 47119f8d92d3a6d3d70cbcfc0c1bd8ebb7c2ad7808267f4a4919ed9a2c528e6d: Status 404 returned error can't find the container with id 47119f8d92d3a6d3d70cbcfc0c1bd8ebb7c2ad7808267f4a4919ed9a2c528e6d Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.236221 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:14 crc kubenswrapper[4878]: W1202 18:15:14.242353 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-9ac32c1926ebd3b8d74a14c6d86aec6f158785a1a468f87890b99c79a6eabe02 WatchSource:0}: Error finding container 9ac32c1926ebd3b8d74a14c6d86aec6f158785a1a468f87890b99c79a6eabe02: Status 404 returned error can't find the container with id 9ac32c1926ebd3b8d74a14c6d86aec6f158785a1a468f87890b99c79a6eabe02 Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.250179 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.267115 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.294073 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.444437 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:14 crc kubenswrapper[4878]: E1202 18:15:14.444627 4878 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 18:15:14 crc kubenswrapper[4878]: E1202 18:15:14.444697 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:15.444677344 +0000 UTC m=+25.134296225 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.545681 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.545809 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:14 crc kubenswrapper[4878]: E1202 18:15:14.545985 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:15:15.545913804 +0000 UTC m=+25.235532695 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:15:14 crc kubenswrapper[4878]: E1202 18:15:14.546005 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 18:15:14 crc kubenswrapper[4878]: E1202 18:15:14.546120 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.546121 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.546190 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:14 crc kubenswrapper[4878]: E1202 18:15:14.546145 4878 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:14 crc kubenswrapper[4878]: E1202 18:15:14.546335 4878 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 18:15:14 crc kubenswrapper[4878]: E1202 18:15:14.546352 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 18:15:14 crc kubenswrapper[4878]: E1202 18:15:14.546372 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 18:15:14 crc kubenswrapper[4878]: E1202 18:15:14.546382 4878 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:14 crc kubenswrapper[4878]: E1202 18:15:14.546429 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:15.546381639 +0000 UTC m=+25.236000520 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:14 crc kubenswrapper[4878]: E1202 18:15:14.546460 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:15.546447831 +0000 UTC m=+25.236066912 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:14 crc kubenswrapper[4878]: E1202 18:15:14.546504 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:15.546491272 +0000 UTC m=+25.236110403 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.942932 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.943616 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.944705 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.945463 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.946157 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.947951 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.948725 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.949852 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.950615 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.951830 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.952482 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.953879 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.954495 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.955085 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.958974 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.959650 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.961268 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.961770 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.962602 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.963701 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.964253 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.965495 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.966030 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" 
path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.967813 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.968406 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.969124 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.970578 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.971199 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.972622 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.973213 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.974125 4878 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" 
podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.974303 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.976214 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.977640 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.978182 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.980157 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.981142 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.982405 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.983269 4878 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.984628 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.985222 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.986557 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.987852 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.989200 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.989980 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.991046 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.991816 4878 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.993262 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.993883 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.994968 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.995712 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.996882 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.997627 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 02 18:15:14 crc kubenswrapper[4878]: I1202 18:15:14.998259 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.008203 4878 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-p9jvp"] Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.008485 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-npvcg"] Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.008731 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.009076 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p9jvp" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.022157 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.022284 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.022539 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.022560 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.026983 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.027014 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.027039 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 
18:15:15.027057 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.058781 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.103685 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.108088 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff"} Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.108146 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9ac32c1926ebd3b8d74a14c6d86aec6f158785a1a468f87890b99c79a6eabe02"} Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.110126 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083"} Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.110321 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5"} Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.110336 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"47119f8d92d3a6d3d70cbcfc0c1bd8ebb7c2ad7808267f4a4919ed9a2c528e6d"} Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.111345 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3726f422e096e2f7eeb3b39aad6b1cdb87e0193b6821c13434d536f0aba96277"} Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.113690 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.115392 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1"} Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.116187 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.163128 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/11a20f23-e2bd-4df6-a47f-73b37f11cd8e-hosts-file\") pod \"node-resolver-p9jvp\" (UID: \"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\") " pod="openshift-dns/node-resolver-p9jvp" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.163196 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/723bfeea-9234-4d2a-8492-747dc974d044-rootfs\") pod \"machine-config-daemon-npvcg\" (UID: \"723bfeea-9234-4d2a-8492-747dc974d044\") " pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.163312 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/723bfeea-9234-4d2a-8492-747dc974d044-mcd-auth-proxy-config\") pod \"machine-config-daemon-npvcg\" (UID: \"723bfeea-9234-4d2a-8492-747dc974d044\") " pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.163388 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28kvf\" (UniqueName: \"kubernetes.io/projected/11a20f23-e2bd-4df6-a47f-73b37f11cd8e-kube-api-access-28kvf\") pod \"node-resolver-p9jvp\" (UID: \"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\") " pod="openshift-dns/node-resolver-p9jvp" Dec 02 18:15:15 
crc kubenswrapper[4878]: I1202 18:15:15.163560 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/723bfeea-9234-4d2a-8492-747dc974d044-proxy-tls\") pod \"machine-config-daemon-npvcg\" (UID: \"723bfeea-9234-4d2a-8492-747dc974d044\") " pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.163627 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d95kq\" (UniqueName: \"kubernetes.io/projected/723bfeea-9234-4d2a-8492-747dc974d044-kube-api-access-d95kq\") pod \"machine-config-daemon-npvcg\" (UID: \"723bfeea-9234-4d2a-8492-747dc974d044\") " pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.167360 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.189633 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.232460 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.255703 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c
026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\
\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.264262 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/723bfeea-9234-4d2a-8492-747dc974d044-proxy-tls\") pod \"machine-config-daemon-npvcg\" (UID: \"723bfeea-9234-4d2a-8492-747dc974d044\") " pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:15:15 crc 
kubenswrapper[4878]: I1202 18:15:15.264301 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d95kq\" (UniqueName: \"kubernetes.io/projected/723bfeea-9234-4d2a-8492-747dc974d044-kube-api-access-d95kq\") pod \"machine-config-daemon-npvcg\" (UID: \"723bfeea-9234-4d2a-8492-747dc974d044\") " pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.264365 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/11a20f23-e2bd-4df6-a47f-73b37f11cd8e-hosts-file\") pod \"node-resolver-p9jvp\" (UID: \"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\") " pod="openshift-dns/node-resolver-p9jvp" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.264380 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/723bfeea-9234-4d2a-8492-747dc974d044-rootfs\") pod \"machine-config-daemon-npvcg\" (UID: \"723bfeea-9234-4d2a-8492-747dc974d044\") " pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.264406 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/723bfeea-9234-4d2a-8492-747dc974d044-mcd-auth-proxy-config\") pod \"machine-config-daemon-npvcg\" (UID: \"723bfeea-9234-4d2a-8492-747dc974d044\") " pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.264421 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28kvf\" (UniqueName: \"kubernetes.io/projected/11a20f23-e2bd-4df6-a47f-73b37f11cd8e-kube-api-access-28kvf\") pod \"node-resolver-p9jvp\" (UID: \"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\") " pod="openshift-dns/node-resolver-p9jvp" Dec 02 18:15:15 
crc kubenswrapper[4878]: I1202 18:15:15.265146 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/723bfeea-9234-4d2a-8492-747dc974d044-rootfs\") pod \"machine-config-daemon-npvcg\" (UID: \"723bfeea-9234-4d2a-8492-747dc974d044\") " pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.265726 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/11a20f23-e2bd-4df6-a47f-73b37f11cd8e-hosts-file\") pod \"node-resolver-p9jvp\" (UID: \"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\") " pod="openshift-dns/node-resolver-p9jvp" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.266532 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/723bfeea-9234-4d2a-8492-747dc974d044-mcd-auth-proxy-config\") pod \"machine-config-daemon-npvcg\" (UID: \"723bfeea-9234-4d2a-8492-747dc974d044\") " pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.271264 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/723bfeea-9234-4d2a-8492-747dc974d044-proxy-tls\") pod \"machine-config-daemon-npvcg\" (UID: \"723bfeea-9234-4d2a-8492-747dc974d044\") " pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.271686 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.297389 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.297884 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d95kq\" (UniqueName: \"kubernetes.io/projected/723bfeea-9234-4d2a-8492-747dc974d044-kube-api-access-d95kq\") pod \"machine-config-daemon-npvcg\" (UID: \"723bfeea-9234-4d2a-8492-747dc974d044\") " pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.297981 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28kvf\" (UniqueName: 
\"kubernetes.io/projected/11a20f23-e2bd-4df6-a47f-73b37f11cd8e-kube-api-access-28kvf\") pod \"node-resolver-p9jvp\" (UID: \"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\") " pod="openshift-dns/node-resolver-p9jvp" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.311953 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.327479 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.335784 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.342094 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-p9jvp" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.356338 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.385948 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.397909 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.420341 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-fnpmk"] Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.421176 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.423496 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.424280 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.424441 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.424652 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.424811 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.433122 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.443090 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-6cm9t"] Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.443888 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.451340 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.451694 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.459945 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.467176 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:15 crc kubenswrapper[4878]: E1202 18:15:15.467379 4878 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 18:15:15 crc kubenswrapper[4878]: E1202 18:15:15.467465 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:17.46744135 +0000 UTC m=+27.157060241 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.472152 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.485827 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.498900 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.510393 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.521864 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.535078 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.566767 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.568775 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.568915 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-cnibin\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.569013 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.569071 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c256c29c-e637-409f-a7b8-42db159198d6-cnibin\") pod \"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: E1202 18:15:15.569135 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:15:17.569083452 +0000 UTC m=+27.258702343 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.569225 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c256c29c-e637-409f-a7b8-42db159198d6-cni-binary-copy\") pod \"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: E1202 18:15:15.569306 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 18:15:15 crc kubenswrapper[4878]: E1202 18:15:15.569366 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 18:15:15 crc kubenswrapper[4878]: E1202 18:15:15.569385 4878 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:15 crc kubenswrapper[4878]: E1202 18:15:15.569475 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" 
failed. No retries permitted until 2025-12-02 18:15:17.569454384 +0000 UTC m=+27.259073485 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.569461 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c256c29c-e637-409f-a7b8-42db159198d6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.569557 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-host-run-k8s-cni-cncf-io\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.569624 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-host-var-lib-cni-multus\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.569678 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsv9h\" (UniqueName: 
\"kubernetes.io/projected/e79a8cec-20ba-4862-ba25-7de014466668-kube-api-access-wsv9h\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.569712 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.569765 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-system-cni-dir\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.569802 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-host-var-lib-cni-bin\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.569852 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-etc-kubernetes\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.569882 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-os-release\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: E1202 18:15:15.569938 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 18:15:15 crc kubenswrapper[4878]: E1202 18:15:15.569993 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 18:15:15 crc kubenswrapper[4878]: E1202 18:15:15.570009 4878 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:15 crc kubenswrapper[4878]: E1202 18:15:15.570072 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:17.570050833 +0000 UTC m=+27.259669714 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.569961 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-hostroot\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.570188 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e79a8cec-20ba-4862-ba25-7de014466668-cni-binary-copy\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.570227 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-multus-socket-dir-parent\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.570303 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-host-var-lib-kubelet\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc 
kubenswrapper[4878]: I1202 18:15:15.570363 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:15 crc kubenswrapper[4878]: E1202 18:15:15.570446 4878 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.570461 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmkz9\" (UniqueName: \"kubernetes.io/projected/c256c29c-e637-409f-a7b8-42db159198d6-kube-api-access-xmkz9\") pod \"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: E1202 18:15:15.570512 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:17.570492677 +0000 UTC m=+27.260111758 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.570534 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-multus-cni-dir\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.570563 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-multus-conf-dir\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.570599 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-host-run-multus-certs\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.570627 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c256c29c-e637-409f-a7b8-42db159198d6-system-cni-dir\") pod \"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.570649 4878 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c256c29c-e637-409f-a7b8-42db159198d6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.570699 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c256c29c-e637-409f-a7b8-42db159198d6-os-release\") pod \"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.570762 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-host-run-netns\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.570786 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e79a8cec-20ba-4862-ba25-7de014466668-multus-daemon-config\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.597426 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.628607 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-1
2-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.653771 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.670900 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671390 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-multus-conf-dir\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671429 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-host-run-multus-certs\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671447 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c256c29c-e637-409f-a7b8-42db159198d6-system-cni-dir\") pod \"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671467 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c256c29c-e637-409f-a7b8-42db159198d6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671488 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c256c29c-e637-409f-a7b8-42db159198d6-os-release\") pod \"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671503 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-host-run-netns\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671515 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" 
(UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-host-run-multus-certs\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671520 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e79a8cec-20ba-4862-ba25-7de014466668-multus-daemon-config\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671601 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c256c29c-e637-409f-a7b8-42db159198d6-cnibin\") pod \"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671623 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c256c29c-e637-409f-a7b8-42db159198d6-cni-binary-copy\") pod \"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671646 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c256c29c-e637-409f-a7b8-42db159198d6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671669 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-cnibin\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671694 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-host-run-k8s-cni-cncf-io\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671710 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-host-var-lib-cni-multus\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671717 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c256c29c-e637-409f-a7b8-42db159198d6-system-cni-dir\") pod \"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671733 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsv9h\" (UniqueName: \"kubernetes.io/projected/e79a8cec-20ba-4862-ba25-7de014466668-kube-api-access-wsv9h\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671896 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-system-cni-dir\") pod 
\"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671912 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c256c29c-e637-409f-a7b8-42db159198d6-os-release\") pod \"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671929 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-host-var-lib-cni-bin\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671988 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-etc-kubernetes\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672013 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-os-release\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672067 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-etc-kubernetes\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 
18:15:15.672101 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e79a8cec-20ba-4862-ba25-7de014466668-cni-binary-copy\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672119 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-host-var-lib-cni-bin\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672113 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-host-run-k8s-cni-cncf-io\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672139 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-cnibin\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672167 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c256c29c-e637-409f-a7b8-42db159198d6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672202 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-os-release\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672153 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-host-var-lib-cni-multus\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672157 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-multus-socket-dir-parent\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672118 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c256c29c-e637-409f-a7b8-42db159198d6-cnibin\") pod \"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672102 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-host-run-netns\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672273 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-hostroot\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " 
pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672293 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-host-var-lib-kubelet\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672305 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-hostroot\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672313 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-multus-socket-dir-parent\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672334 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e79a8cec-20ba-4862-ba25-7de014466668-multus-daemon-config\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.671516 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-multus-conf-dir\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672345 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-host-var-lib-kubelet\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672321 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmkz9\" (UniqueName: \"kubernetes.io/projected/c256c29c-e637-409f-a7b8-42db159198d6-kube-api-access-xmkz9\") pod \"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672407 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-multus-cni-dir\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672470 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-system-cni-dir\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672611 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e79a8cec-20ba-4862-ba25-7de014466668-multus-cni-dir\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.672863 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c256c29c-e637-409f-a7b8-42db159198d6-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.673023 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c256c29c-e637-409f-a7b8-42db159198d6-cni-binary-copy\") pod \"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.673427 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e79a8cec-20ba-4862-ba25-7de014466668-cni-binary-copy\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.696649 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.697339 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmkz9\" (UniqueName: \"kubernetes.io/projected/c256c29c-e637-409f-a7b8-42db159198d6-kube-api-access-xmkz9\") pod \"multus-additional-cni-plugins-fnpmk\" (UID: \"c256c29c-e637-409f-a7b8-42db159198d6\") " pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.697733 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsv9h\" (UniqueName: \"kubernetes.io/projected/e79a8cec-20ba-4862-ba25-7de014466668-kube-api-access-wsv9h\") pod \"multus-6cm9t\" (UID: \"e79a8cec-20ba-4862-ba25-7de014466668\") " pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.708960 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.719942 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.728706 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.738806 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.744369 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.753288 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: W1202 18:15:15.756924 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc256c29c_e637_409f_a7b8_42db159198d6.slice/crio-069cc3de7f6a65fdf1d272a9c56b626ce2ff649a629e36744ed17545d2636d6c WatchSource:0}: Error finding container 069cc3de7f6a65fdf1d272a9c56b626ce2ff649a629e36744ed17545d2636d6c: Status 404 returned error can't find the container with id 069cc3de7f6a65fdf1d272a9c56b626ce2ff649a629e36744ed17545d2636d6c Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.771352 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.771693 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6cm9t" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.786193 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.797946 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.802663 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x5jzn"] Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.804745 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.809986 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.810179 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.810293 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.810357 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.810428 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.810307 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.810535 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.822164 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.836850 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.852933 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:15Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.871155 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:15Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.888420 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:15Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.908003 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:15Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.924930 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:15Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.937543 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.937574 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.937609 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:15 crc kubenswrapper[4878]: E1202 18:15:15.937675 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:15 crc kubenswrapper[4878]: E1202 18:15:15.937745 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:15 crc kubenswrapper[4878]: E1202 18:15:15.937790 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.938612 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:15Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.951781 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:15Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.971937 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:15Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.974522 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzdfp\" (UniqueName: \"kubernetes.io/projected/d160cfa4-9e2a-429d-b760-0cac6d467b9a-kube-api-access-fzdfp\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.974548 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-systemd-units\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.974566 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-node-log\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.974591 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d160cfa4-9e2a-429d-b760-0cac6d467b9a-ovnkube-script-lib\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.974605 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d160cfa4-9e2a-429d-b760-0cac6d467b9a-env-overrides\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.974619 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-run-openvswitch\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.974633 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d160cfa4-9e2a-429d-b760-0cac6d467b9a-ovn-node-metrics-cert\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.974648 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-etc-openvswitch\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.974667 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-log-socket\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.974690 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-kubelet\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.974706 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-var-lib-openvswitch\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.974721 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.974743 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-run-ovn\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc kubenswrapper[4878]: 
I1202 18:15:15.974756 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-run-netns\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.974769 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-cni-netd\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.974783 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d160cfa4-9e2a-429d-b760-0cac6d467b9a-ovnkube-config\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.974800 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-run-systemd\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.974813 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-cni-bin\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc 
kubenswrapper[4878]: I1202 18:15:15.974826 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-run-ovn-kubernetes\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.974841 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-slash\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:15 crc kubenswrapper[4878]: I1202 18:15:15.991422 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:15Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.012748 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:16Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.030165 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:16Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.046199 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:16Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076347 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-run-ovn-kubernetes\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc 
kubenswrapper[4878]: I1202 18:15:16.076398 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-slash\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076422 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzdfp\" (UniqueName: \"kubernetes.io/projected/d160cfa4-9e2a-429d-b760-0cac6d467b9a-kube-api-access-fzdfp\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076444 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-systemd-units\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076461 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-node-log\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076486 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d160cfa4-9e2a-429d-b760-0cac6d467b9a-ovnkube-script-lib\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076506 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d160cfa4-9e2a-429d-b760-0cac6d467b9a-env-overrides\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076524 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-etc-openvswitch\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076542 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-run-openvswitch\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076562 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d160cfa4-9e2a-429d-b760-0cac6d467b9a-ovn-node-metrics-cert\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076590 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-log-socket\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076619 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-kubelet\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076638 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-var-lib-openvswitch\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076658 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076692 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-run-ovn\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076706 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-run-netns\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076720 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-cni-netd\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076735 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d160cfa4-9e2a-429d-b760-0cac6d467b9a-ovnkube-config\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076750 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-cni-bin\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076766 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-run-systemd\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076824 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-run-systemd\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076869 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.076900 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-slash\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.077162 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-run-ovn\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.077209 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-systemd-units\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.077184 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-log-socket\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.077267 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-kubelet\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc 
kubenswrapper[4878]: I1202 18:15:16.077276 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-node-log\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.077297 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-var-lib-openvswitch\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.077324 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.077351 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-cni-netd\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.077373 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-run-netns\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.077545 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-etc-openvswitch\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.077671 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-run-openvswitch\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.077800 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-cni-bin\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.077938 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d160cfa4-9e2a-429d-b760-0cac6d467b9a-ovnkube-script-lib\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.078160 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d160cfa4-9e2a-429d-b760-0cac6d467b9a-ovnkube-config\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.078491 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/d160cfa4-9e2a-429d-b760-0cac6d467b9a-env-overrides\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.082458 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d160cfa4-9e2a-429d-b760-0cac6d467b9a-ovn-node-metrics-cert\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.105534 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzdfp\" (UniqueName: \"kubernetes.io/projected/d160cfa4-9e2a-429d-b760-0cac6d467b9a-kube-api-access-fzdfp\") pod \"ovnkube-node-x5jzn\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.119625 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" event={"ID":"c256c29c-e637-409f-a7b8-42db159198d6","Type":"ContainerStarted","Data":"069cc3de7f6a65fdf1d272a9c56b626ce2ff649a629e36744ed17545d2636d6c"} Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.120351 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p9jvp" event={"ID":"11a20f23-e2bd-4df6-a47f-73b37f11cd8e","Type":"ContainerStarted","Data":"0d0322ab15fc14aab1bfb83f1e6e3a10d9216dabc225af9785ce48a379442573"} Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.120382 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.124885 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"b7709f4d969504362d28f3f061837bbc41e477c91ac9e2a7abacbce612a1aa73"} Dec 02 18:15:16 crc kubenswrapper[4878]: I1202 18:15:16.124910 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"50991735ccf8349b9a9abc68d4100e0dd2ab6f1bf52fc71e838dd2fccc8eb482"} Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.129465 4878 generic.go:334] "Generic (PLEG): container finished" podID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerID="25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5" exitCode=0 Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.129554 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerDied","Data":"25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5"} Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.130026 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerStarted","Data":"9ea70ad1f9a7692edc4a16dfbffd8396cc7a1c689f7232e16d9834aa0675949a"} Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.132064 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945"} Dec 02 18:15:17 
crc kubenswrapper[4878]: I1202 18:15:17.133491 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6cm9t" event={"ID":"e79a8cec-20ba-4862-ba25-7de014466668","Type":"ContainerStarted","Data":"e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7"} Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.133599 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6cm9t" event={"ID":"e79a8cec-20ba-4862-ba25-7de014466668","Type":"ContainerStarted","Data":"f64e9e0c4457c72ddbda292555bd4c7cee71cb61e8b191b3eefe016ac676c7fe"} Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.135552 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" event={"ID":"c256c29c-e637-409f-a7b8-42db159198d6","Type":"ContainerStarted","Data":"ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b"} Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.137089 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p9jvp" event={"ID":"11a20f23-e2bd-4df6-a47f-73b37f11cd8e","Type":"ContainerStarted","Data":"887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34"} Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.150772 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.168945 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.183214 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.201085 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.217314 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.233136 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.248015 4878 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.263756 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.276912 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.298335 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.315811 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.336642 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.352459 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.384473 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.408217 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.426522 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.445032 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.460912 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.481156 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.493787 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:17 crc kubenswrapper[4878]: E1202 18:15:17.493925 4878 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 18:15:17 crc kubenswrapper[4878]: E1202 18:15:17.494004 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-02 18:15:21.493979627 +0000 UTC m=+31.183598508 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.494138 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.510688 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\
\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.524197 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.541501 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.577443 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.594265 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.594392 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.594430 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.594477 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:17 crc kubenswrapper[4878]: E1202 18:15:17.594577 4878 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 18:15:17 crc kubenswrapper[4878]: E1202 18:15:17.594643 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:21.594624698 +0000 UTC m=+31.284243579 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 18:15:17 crc kubenswrapper[4878]: E1202 18:15:17.594718 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:15:21.59470728 +0000 UTC m=+31.284326161 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:15:17 crc kubenswrapper[4878]: E1202 18:15:17.594814 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 18:15:17 crc kubenswrapper[4878]: E1202 18:15:17.594841 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 18:15:17 crc kubenswrapper[4878]: E1202 18:15:17.594857 4878 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:17 crc kubenswrapper[4878]: E1202 18:15:17.594914 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:21.594903627 +0000 UTC m=+31.284522508 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:17 crc kubenswrapper[4878]: E1202 18:15:17.595106 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 18:15:17 crc kubenswrapper[4878]: E1202 18:15:17.595166 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 18:15:17 crc kubenswrapper[4878]: E1202 18:15:17.595187 4878 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:17 crc kubenswrapper[4878]: E1202 18:15:17.595309 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:21.595270749 +0000 UTC m=+31.284889790 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.608974 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kub
e\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.670216 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.696675 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.751685 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:17Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.936677 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:17 crc kubenswrapper[4878]: E1202 18:15:17.936907 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.936938 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:17 crc kubenswrapper[4878]: E1202 18:15:17.937102 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:17 crc kubenswrapper[4878]: I1202 18:15:17.936982 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:17 crc kubenswrapper[4878]: E1202 18:15:17.937283 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.142805 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6"} Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.144700 4878 generic.go:334] "Generic (PLEG): container finished" podID="c256c29c-e637-409f-a7b8-42db159198d6" containerID="ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b" exitCode=0 Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.144895 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" event={"ID":"c256c29c-e637-409f-a7b8-42db159198d6","Type":"ContainerDied","Data":"ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b"} Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.152642 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerStarted","Data":"61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d"} Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.152702 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerStarted","Data":"90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076"} Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.152721 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerStarted","Data":"6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e"} Dec 02 18:15:18 crc 
kubenswrapper[4878]: I1202 18:15:18.152732 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerStarted","Data":"f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6"} Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.152743 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerStarted","Data":"cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1"} Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.152752 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerStarted","Data":"eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024"} Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.162367 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.187064 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.221127 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.236740 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.259737 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.279923 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.293482 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.311918 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.329101 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.356251 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.370148 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.381169 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.392953 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.407792 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.420628 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.436582 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.457736 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.474542 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.501852 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.520948 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.536152 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.548891 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.563055 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.579015 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.596597 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.609198 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.627281 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:18 crc kubenswrapper[4878]: I1202 18:15:18.642359 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:18Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.157627 4878 generic.go:334] "Generic (PLEG): container finished" podID="c256c29c-e637-409f-a7b8-42db159198d6" 
containerID="955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451" exitCode=0 Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.157742 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" event={"ID":"c256c29c-e637-409f-a7b8-42db159198d6","Type":"ContainerDied","Data":"955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451"} Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.172377 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.189447 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.206939 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.232142 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.246878 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.262384 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.276210 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-6ktxv"] Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.276610 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-6ktxv" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.278891 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.278980 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.279265 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.279372 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.285748 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.303776 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.318082 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.328726 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.344548 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.357334 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.370871 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.384434 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.396072 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.408227 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.415081 4878 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.416358 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvsq6\" (UniqueName: \"kubernetes.io/projected/5653c799-2a0f-4f9e-b719-ffb2642d1207-kube-api-access-hvsq6\") pod \"node-ca-6ktxv\" (UID: \"5653c799-2a0f-4f9e-b719-ffb2642d1207\") " pod="openshift-image-registry/node-ca-6ktxv" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.416399 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5653c799-2a0f-4f9e-b719-ffb2642d1207-serviceca\") pod \"node-ca-6ktxv\" (UID: \"5653c799-2a0f-4f9e-b719-ffb2642d1207\") " pod="openshift-image-registry/node-ca-6ktxv" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.416520 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5653c799-2a0f-4f9e-b719-ffb2642d1207-host\") pod \"node-ca-6ktxv\" (UID: \"5653c799-2a0f-4f9e-b719-ffb2642d1207\") " pod="openshift-image-registry/node-ca-6ktxv" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.418178 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.418259 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.418277 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.418378 4878 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.421230 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.440261 4878 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.440663 4878 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.442090 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.442142 4878 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.442156 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.442175 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.442188 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:19Z","lastTransitionTime":"2025-12-02T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.456209 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: E1202 18:15:19.481819 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.490157 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.490212 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.490224 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.490257 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.490271 4878 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:19Z","lastTransitionTime":"2025-12-02T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.494635 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: E1202 18:15:19.512168 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.516818 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.516858 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.516869 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.516886 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.516899 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:19Z","lastTransitionTime":"2025-12-02T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.517153 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvsq6\" (UniqueName: \"kubernetes.io/projected/5653c799-2a0f-4f9e-b719-ffb2642d1207-kube-api-access-hvsq6\") pod \"node-ca-6ktxv\" (UID: \"5653c799-2a0f-4f9e-b719-ffb2642d1207\") " pod="openshift-image-registry/node-ca-6ktxv" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.517195 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5653c799-2a0f-4f9e-b719-ffb2642d1207-serviceca\") pod \"node-ca-6ktxv\" (UID: \"5653c799-2a0f-4f9e-b719-ffb2642d1207\") " pod="openshift-image-registry/node-ca-6ktxv" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.517248 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5653c799-2a0f-4f9e-b719-ffb2642d1207-host\") pod \"node-ca-6ktxv\" (UID: \"5653c799-2a0f-4f9e-b719-ffb2642d1207\") " pod="openshift-image-registry/node-ca-6ktxv" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.517304 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5653c799-2a0f-4f9e-b719-ffb2642d1207-host\") pod \"node-ca-6ktxv\" (UID: \"5653c799-2a0f-4f9e-b719-ffb2642d1207\") " pod="openshift-image-registry/node-ca-6ktxv" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.518425 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5653c799-2a0f-4f9e-b719-ffb2642d1207-serviceca\") pod \"node-ca-6ktxv\" (UID: \"5653c799-2a0f-4f9e-b719-ffb2642d1207\") " pod="openshift-image-registry/node-ca-6ktxv" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.527099 4878 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: E1202 18:15:19.533085 4878 kubelet_node_status.go:585] "Error updating node status, will 
retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.535750 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvsq6\" (UniqueName: \"kubernetes.io/projected/5653c799-2a0f-4f9e-b719-ffb2642d1207-kube-api-access-hvsq6\") pod \"node-ca-6ktxv\" (UID: \"5653c799-2a0f-4f9e-b719-ffb2642d1207\") " pod="openshift-image-registry/node-ca-6ktxv" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.543134 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.543183 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.543193 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.543209 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.543219 4878 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:19Z","lastTransitionTime":"2025-12-02T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.545546 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.558540 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: E1202 18:15:19.562464 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.565567 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.565605 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.565616 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.565635 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.565646 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:19Z","lastTransitionTime":"2025-12-02T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.572940 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: E1202 18:15:19.577196 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: E1202 18:15:19.577376 4878 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.579124 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.579165 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.579173 
4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.579194 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.579204 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:19Z","lastTransitionTime":"2025-12-02T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.588316 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa3
8811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.594060 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-6ktxv" Dec 02 18:15:19 crc kubenswrapper[4878]: W1202 18:15:19.609806 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5653c799_2a0f_4f9e_b719_ffb2642d1207.slice/crio-aaf3a0493c0f30f2c31de4a5e3a53cb0f81a596dfe69e5a5bd57a10647af08a4 WatchSource:0}: Error finding container aaf3a0493c0f30f2c31de4a5e3a53cb0f81a596dfe69e5a5bd57a10647af08a4: Status 404 returned error can't find the container with id aaf3a0493c0f30f2c31de4a5e3a53cb0f81a596dfe69e5a5bd57a10647af08a4 Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.616029 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.643257 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702
f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d
34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.658313 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.675886 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.681874 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 
18:15:19.681923 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.681941 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.681960 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.681975 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:19Z","lastTransitionTime":"2025-12-02T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.690396 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:19Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.785275 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.785315 4878 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.785326 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.785351 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.785363 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:19Z","lastTransitionTime":"2025-12-02T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.888055 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.888126 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.888434 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.888480 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.888501 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:19Z","lastTransitionTime":"2025-12-02T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.937084 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.937129 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:19 crc kubenswrapper[4878]: E1202 18:15:19.937286 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.937298 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:19 crc kubenswrapper[4878]: E1202 18:15:19.938056 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:19 crc kubenswrapper[4878]: E1202 18:15:19.937962 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.991915 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.991970 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.991982 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.992004 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:19 crc kubenswrapper[4878]: I1202 18:15:19.992015 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:19Z","lastTransitionTime":"2025-12-02T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.095219 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.095280 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.095290 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.095316 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.095326 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:20Z","lastTransitionTime":"2025-12-02T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.163085 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6ktxv" event={"ID":"5653c799-2a0f-4f9e-b719-ffb2642d1207","Type":"ContainerStarted","Data":"8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488"} Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.163146 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6ktxv" event={"ID":"5653c799-2a0f-4f9e-b719-ffb2642d1207","Type":"ContainerStarted","Data":"aaf3a0493c0f30f2c31de4a5e3a53cb0f81a596dfe69e5a5bd57a10647af08a4"} Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.165582 4878 generic.go:334] "Generic (PLEG): container finished" podID="c256c29c-e637-409f-a7b8-42db159198d6" containerID="b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32" exitCode=0 Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.165640 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" event={"ID":"c256c29c-e637-409f-a7b8-42db159198d6","Type":"ContainerDied","Data":"b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32"} Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.179397 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.197632 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.197682 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.197691 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.197707 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.197717 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:20Z","lastTransitionTime":"2025-12-02T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.198568 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.214474 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.233687 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.256456 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.272406 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.287711 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.301513 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.301512 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.301561 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.301572 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.301589 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.301598 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:20Z","lastTransitionTime":"2025-12-02T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.325440 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.340270 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.359317 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.387584 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.404083 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.404129 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.404177 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.404197 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.404209 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:20Z","lastTransitionTime":"2025-12-02T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.407629 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.422425 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.434373 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.460502 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18
:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.476023 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.500425 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.509223 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.509300 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.509312 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.509331 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.509341 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:20Z","lastTransitionTime":"2025-12-02T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.516852 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.531011 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.542589 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.556549 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.568873 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.580556 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.593140 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.603683 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.612351 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.612402 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.612416 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.612438 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.612452 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:20Z","lastTransitionTime":"2025-12-02T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.617134 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 
18:15:20.630271 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 
18:15:20.642934 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.657094 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.715414 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.715456 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.715468 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.715484 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.715496 4878 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:20Z","lastTransitionTime":"2025-12-02T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.818441 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.818548 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.818572 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.818605 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.818625 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:20Z","lastTransitionTime":"2025-12-02T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.921224 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.921290 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.921299 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.921315 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.921325 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:20Z","lastTransitionTime":"2025-12-02T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.961563 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd5151
43612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.977082 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:20 crc kubenswrapper[4878]: I1202 18:15:20.996680 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.024017 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.024089 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.024107 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.024131 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.024146 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:21Z","lastTransitionTime":"2025-12-02T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.024320 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c91ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.042395 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.059328 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 
18:15:21.074106 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 
18:15:21.088090 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.104419 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.126746 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.126786 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.126797 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.126813 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.126826 4878 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:21Z","lastTransitionTime":"2025-12-02T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.129687 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.144255 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.162185 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.171095 4878 generic.go:334] "Generic (PLEG): container finished" podID="c256c29c-e637-409f-a7b8-42db159198d6" containerID="36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea" exitCode=0 Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.171138 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" event={"ID":"c256c29c-e637-409f-a7b8-42db159198d6","Type":"ContainerDied","Data":"36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea"} Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.175878 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerStarted","Data":"c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67"} Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.177609 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.197487 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.210599 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.228514 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.231623 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.231667 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.231676 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.231691 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.231700 4878 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:21Z","lastTransitionTime":"2025-12-02T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.248919 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.264717 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.278897 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.292504 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.303722 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.321138 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.335827 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.336667 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.336704 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.336713 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.336729 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.336740 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:21Z","lastTransitionTime":"2025-12-02T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.349985 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd5151
43612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.366471 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.382043 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.396562 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.416993 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.439605 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.440968 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.441007 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.441018 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.441036 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.441050 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:21Z","lastTransitionTime":"2025-12-02T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.461419 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.537711 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:21 crc kubenswrapper[4878]: E1202 18:15:21.537894 4878 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 18:15:21 crc kubenswrapper[4878]: E1202 18:15:21.537972 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 
nodeName:}" failed. No retries permitted until 2025-12-02 18:15:29.537952902 +0000 UTC m=+39.227571783 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.546223 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.546529 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.546545 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.546564 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.546576 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:21Z","lastTransitionTime":"2025-12-02T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:21 crc kubenswrapper[4878]: E1202 18:15:21.639204 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 18:15:29.639177391 +0000 UTC m=+39.328796272 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.639056 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.639599 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.639647 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.639676 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" 
(UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:21 crc kubenswrapper[4878]: E1202 18:15:21.639834 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 18:15:21 crc kubenswrapper[4878]: E1202 18:15:21.639859 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 18:15:21 crc kubenswrapper[4878]: E1202 18:15:21.639861 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 18:15:21 crc kubenswrapper[4878]: E1202 18:15:21.639917 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 18:15:21 crc kubenswrapper[4878]: E1202 18:15:21.639912 4878 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 18:15:21 crc kubenswrapper[4878]: E1202 18:15:21.640058 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:29.640021328 +0000 UTC m=+39.329640209 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 18:15:21 crc kubenswrapper[4878]: E1202 18:15:21.639930 4878 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:21 crc kubenswrapper[4878]: E1202 18:15:21.639872 4878 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:21 crc kubenswrapper[4878]: E1202 18:15:21.640186 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:29.640156892 +0000 UTC m=+39.329775773 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:21 crc kubenswrapper[4878]: E1202 18:15:21.640274 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:29.640248495 +0000 UTC m=+39.329867386 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.649502 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.649560 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.649575 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.649599 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.649689 4878 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:21Z","lastTransitionTime":"2025-12-02T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.752563 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.752622 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.752635 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.752660 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.752676 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:21Z","lastTransitionTime":"2025-12-02T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.855981 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.856062 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.856085 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.856135 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.856179 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:21Z","lastTransitionTime":"2025-12-02T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.937553 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:21 crc kubenswrapper[4878]: E1202 18:15:21.937764 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.938503 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.938574 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:21 crc kubenswrapper[4878]: E1202 18:15:21.938661 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:21 crc kubenswrapper[4878]: E1202 18:15:21.938747 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.960279 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.960342 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.960362 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.960389 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:21 crc kubenswrapper[4878]: I1202 18:15:21.960406 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:21Z","lastTransitionTime":"2025-12-02T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.062356 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.062394 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.062402 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.062416 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.062426 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:22Z","lastTransitionTime":"2025-12-02T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.165025 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.165088 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.165103 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.165124 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.165138 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:22Z","lastTransitionTime":"2025-12-02T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.184116 4878 generic.go:334] "Generic (PLEG): container finished" podID="c256c29c-e637-409f-a7b8-42db159198d6" containerID="19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849" exitCode=0 Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.184209 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" event={"ID":"c256c29c-e637-409f-a7b8-42db159198d6","Type":"ContainerDied","Data":"19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849"} Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.200804 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.215720 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.238971 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.260338 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.268943 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.269005 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.269023 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:22 crc 
kubenswrapper[4878]: I1202 18:15:22.269048 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.269062 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:22Z","lastTransitionTime":"2025-12-02T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.274193 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c91ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.284504 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.303134 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.318843 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.332419 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.346936 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.362170 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.372039 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.372092 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.372102 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.372123 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.372133 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:22Z","lastTransitionTime":"2025-12-02T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.377947 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.398674 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.421106 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.437610 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.475087 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.475158 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.475198 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.475287 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.475338 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:22Z","lastTransitionTime":"2025-12-02T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.578320 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.578368 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.578379 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.578406 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.578419 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:22Z","lastTransitionTime":"2025-12-02T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.680817 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.680863 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.680873 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.680889 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.680900 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:22Z","lastTransitionTime":"2025-12-02T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.783860 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.783918 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.783928 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.783944 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.783954 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:22Z","lastTransitionTime":"2025-12-02T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.887086 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.887132 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.887143 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.887158 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.887168 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:22Z","lastTransitionTime":"2025-12-02T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.990488 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.990544 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.990557 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.990574 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:22 crc kubenswrapper[4878]: I1202 18:15:22.990587 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:22Z","lastTransitionTime":"2025-12-02T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.093967 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.094022 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.094037 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.094059 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.094072 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:23Z","lastTransitionTime":"2025-12-02T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.191500 4878 generic.go:334] "Generic (PLEG): container finished" podID="c256c29c-e637-409f-a7b8-42db159198d6" containerID="c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a" exitCode=0 Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.191574 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" event={"ID":"c256c29c-e637-409f-a7b8-42db159198d6","Type":"ContainerDied","Data":"c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a"} Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.196404 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.196462 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.196478 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.196499 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.196515 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:23Z","lastTransitionTime":"2025-12-02T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.198679 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerStarted","Data":"51b5bf1b7fad01925b51836ad8a58114876a73fbbdae525317cf4383120f3891"} Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.199821 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.199915 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.209621 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.229371 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.229485 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.229893 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.253327 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.271614 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.299183 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.301407 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.301452 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.301463 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.301481 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.301491 4878 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:23Z","lastTransitionTime":"2025-12-02T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.315686 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.331533 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.345656 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.363741 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.381031 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.396096 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.403490 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.403572 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.403588 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.403613 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.403629 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:23Z","lastTransitionTime":"2025-12-02T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.408818 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.421078 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.433061 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.450572 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.470504 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.483727 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.499348 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.506831 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.506875 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.506888 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.506906 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.506918 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:23Z","lastTransitionTime":"2025-12-02T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.514674 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.530734 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.544282 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.556459 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.570131 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.585021 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.609203 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.609265 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.609280 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.609297 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.609311 4878 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:23Z","lastTransitionTime":"2025-12-02T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.613680 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.666441 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.685452 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b5bf1b7fad01925b51836ad8a58114876a73fbbdae525317cf4383120f3891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.703532 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.712151 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.712208 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.712218 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.712249 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.712261 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:23Z","lastTransitionTime":"2025-12-02T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.722891 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.738316 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:23Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.815496 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.815553 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.815566 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.815583 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.815596 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:23Z","lastTransitionTime":"2025-12-02T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.919283 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.919313 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.919322 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.919354 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.919365 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:23Z","lastTransitionTime":"2025-12-02T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.936873 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.936982 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:23 crc kubenswrapper[4878]: E1202 18:15:23.937022 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:23 crc kubenswrapper[4878]: E1202 18:15:23.937163 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:23 crc kubenswrapper[4878]: I1202 18:15:23.937292 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:23 crc kubenswrapper[4878]: E1202 18:15:23.937441 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.022036 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.022100 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.022115 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.022135 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.022149 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:24Z","lastTransitionTime":"2025-12-02T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.124517 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.124605 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.124619 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.124690 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.124710 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:24Z","lastTransitionTime":"2025-12-02T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.209164 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" event={"ID":"c256c29c-e637-409f-a7b8-42db159198d6","Type":"ContainerStarted","Data":"e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953"} Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.209349 4878 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.227980 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.228027 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.228036 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.228054 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.228064 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:24Z","lastTransitionTime":"2025-12-02T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.230298 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:24Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.250632 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:24Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.272633 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:24Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.287388 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:24Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.310122 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b5bf1b7fad01925b51836ad8a58114876a73fbbdae525317cf4383120f3891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:24Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.326842 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:24Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.330586 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.330648 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.330660 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.330688 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.330701 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:24Z","lastTransitionTime":"2025-12-02T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.340950 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:24Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.352612 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:24Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.367953 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:24Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.390875 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:24Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.404393 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:24Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.417830 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:24Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.429435 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:24Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.433049 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.433106 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.433121 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.433145 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.433163 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:24Z","lastTransitionTime":"2025-12-02T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.446902 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:24Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.463662 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:24Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.536759 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.536820 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.536840 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.536869 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.536887 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:24Z","lastTransitionTime":"2025-12-02T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.639742 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.639779 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.639789 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.639806 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.639817 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:24Z","lastTransitionTime":"2025-12-02T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.743454 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.743528 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.743541 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.743566 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.743580 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:24Z","lastTransitionTime":"2025-12-02T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.846434 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.846467 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.846478 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.846495 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.846505 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:24Z","lastTransitionTime":"2025-12-02T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.959781 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.959849 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.959875 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.959908 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:24 crc kubenswrapper[4878]: I1202 18:15:24.959931 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:24Z","lastTransitionTime":"2025-12-02T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.062597 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.062652 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.062672 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.062705 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.062722 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:25Z","lastTransitionTime":"2025-12-02T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.165612 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.165670 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.165684 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.165704 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.165717 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:25Z","lastTransitionTime":"2025-12-02T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.212675 4878 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.268773 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.268821 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.268833 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.268851 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.268865 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:25Z","lastTransitionTime":"2025-12-02T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.372108 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.372175 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.372191 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.372215 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.372231 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:25Z","lastTransitionTime":"2025-12-02T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.475152 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.475210 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.475227 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.475264 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.475279 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:25Z","lastTransitionTime":"2025-12-02T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.578044 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.578096 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.578109 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.578129 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.578142 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:25Z","lastTransitionTime":"2025-12-02T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.680966 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.681010 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.681022 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.681037 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.681049 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:25Z","lastTransitionTime":"2025-12-02T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.784273 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.784303 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.784312 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.784330 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.784340 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:25Z","lastTransitionTime":"2025-12-02T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.887156 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.887197 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.887210 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.887225 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.887249 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:25Z","lastTransitionTime":"2025-12-02T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.937731 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.937777 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.937908 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:25 crc kubenswrapper[4878]: E1202 18:15:25.938026 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:25 crc kubenswrapper[4878]: E1202 18:15:25.938202 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:25 crc kubenswrapper[4878]: E1202 18:15:25.938463 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.989652 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.989945 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.990036 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.990128 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:25 crc kubenswrapper[4878]: I1202 18:15:25.990209 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:25Z","lastTransitionTime":"2025-12-02T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.092767 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.092830 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.092847 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.092870 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.092886 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:26Z","lastTransitionTime":"2025-12-02T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.198977 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.199032 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.199043 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.199059 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.199072 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:26Z","lastTransitionTime":"2025-12-02T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.217692 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovnkube-controller/0.log" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.222417 4878 generic.go:334] "Generic (PLEG): container finished" podID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerID="51b5bf1b7fad01925b51836ad8a58114876a73fbbdae525317cf4383120f3891" exitCode=1 Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.222463 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerDied","Data":"51b5bf1b7fad01925b51836ad8a58114876a73fbbdae525317cf4383120f3891"} Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.223309 4878 scope.go:117] "RemoveContainer" containerID="51b5bf1b7fad01925b51836ad8a58114876a73fbbdae525317cf4383120f3891" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.244387 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:26Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.258859 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:26Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.285777 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6af
f51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:26Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.302450 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.302502 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.302518 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.302538 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.302552 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:26Z","lastTransitionTime":"2025-12-02T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.315873 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:26Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.340205 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:26Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.361987 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:26Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.377264 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:26Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.403291 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:26Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.405419 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.405471 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.405482 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 
18:15:26.405505 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.405517 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:26Z","lastTransitionTime":"2025-12-02T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.419337 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:26Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.442323 4878 
status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f1
0be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:26Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.457677 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d
608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:26Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.479703 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51b5bf1b7fad01925b51836ad8a58114876a73fbbdae525317cf4383120f3891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51b5bf1b7fad01925b51836ad8a58114876a73fbbdae525317cf4383120f3891\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:25Z\\\",\\\"message\\\":\\\") from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 18:15:25.753679 6189 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:15:25.754076 6189 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:15:25.754381 6189 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 18:15:25.754416 6189 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 18:15:25.754490 6189 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 18:15:25.754521 6189 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 18:15:25.754541 6189 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 18:15:25.754546 6189 factory.go:656] Stopping watch factory\\\\nI1202 18:15:25.754570 6189 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 18:15:25.754726 6189 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":
\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\
"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:26Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.493795 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:26Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.508831 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 
18:15:26.508888 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.508903 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.508922 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.508934 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:26Z","lastTransitionTime":"2025-12-02T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.510149 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:26Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.527404 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:26Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.611804 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.611834 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.611847 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.611860 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.611869 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:26Z","lastTransitionTime":"2025-12-02T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.714487 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.714521 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.714535 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.714551 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.714561 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:26Z","lastTransitionTime":"2025-12-02T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.817580 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.817631 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.817644 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.817667 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.817681 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:26Z","lastTransitionTime":"2025-12-02T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.920095 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.920166 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.920180 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.920203 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:26 crc kubenswrapper[4878]: I1202 18:15:26.920218 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:26Z","lastTransitionTime":"2025-12-02T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.023290 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.023364 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.023378 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.023405 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.023419 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:27Z","lastTransitionTime":"2025-12-02T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.126561 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.126600 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.126609 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.126622 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.126646 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:27Z","lastTransitionTime":"2025-12-02T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.229557 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.229618 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.229635 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.229660 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.229678 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:27Z","lastTransitionTime":"2025-12-02T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.231871 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovnkube-controller/0.log" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.235974 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerStarted","Data":"1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289"} Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.236150 4878 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.259073 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.285915 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.317583 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51b5bf1b7fad01925b51836ad8a58114876a73fbbdae525317cf4383120f3891\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:25Z\\\",\\\"message\\\":\\\") from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 18:15:25.753679 6189 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:15:25.754076 6189 
reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:15:25.754381 6189 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 18:15:25.754416 6189 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 18:15:25.754490 6189 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 18:15:25.754521 6189 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 18:15:25.754541 6189 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 18:15:25.754546 6189 factory.go:656] Stopping watch factory\\\\nI1202 18:15:25.754570 6189 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 18:15:25.754726 6189 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountP
ath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.333171 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.333283 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.333313 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.333349 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.333374 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:27Z","lastTransitionTime":"2025-12-02T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.353663 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.372881 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.391741 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.412374 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.433452 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.436080 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.436149 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.436175 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.436206 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.436230 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:27Z","lastTransitionTime":"2025-12-02T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.451664 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.466495 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.482840 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.507438 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6af
f51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.527701 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.539316 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:27 crc 
kubenswrapper[4878]: I1202 18:15:27.539377 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.539389 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.539429 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.539442 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:27Z","lastTransitionTime":"2025-12-02T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.548943 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.567904 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.641446 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.642507 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.642548 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.642560 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.642577 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.642593 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:27Z","lastTransitionTime":"2025-12-02T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.664207 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.682209 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.700282 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.731211 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51b5bf1b7fad01925b51836ad8a58114876a73fbbdae525317cf4383120f3891\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:25Z\\\",\\\"message\\\":\\\") from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 18:15:25.753679 6189 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:15:25.754076 6189 
reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:15:25.754381 6189 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 18:15:25.754416 6189 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 18:15:25.754490 6189 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 18:15:25.754521 6189 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 18:15:25.754541 6189 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 18:15:25.754546 6189 factory.go:656] Stopping watch factory\\\\nI1202 18:15:25.754570 6189 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 18:15:25.754726 6189 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountP
ath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.745343 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.745384 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.745397 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.745433 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.745445 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:27Z","lastTransitionTime":"2025-12-02T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.765430 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.785823 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec3
0733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.798958 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.810265 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.826366 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T1
8:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.841556 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.847607 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.847669 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.847694 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:27 crc 
kubenswrapper[4878]: I1202 18:15:27.847762 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.847789 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:27Z","lastTransitionTime":"2025-12-02T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.857598 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c91ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.869637 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.886929 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235
a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.904380 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.916384 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:27Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.937486 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.937555 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.937486 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:27 crc kubenswrapper[4878]: E1202 18:15:27.937593 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:27 crc kubenswrapper[4878]: E1202 18:15:27.937674 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:27 crc kubenswrapper[4878]: E1202 18:15:27.937732 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.950291 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.950327 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.950338 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.950355 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:27 crc kubenswrapper[4878]: I1202 18:15:27.950367 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:27Z","lastTransitionTime":"2025-12-02T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.052807 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.052858 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.052870 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.052892 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.052906 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:28Z","lastTransitionTime":"2025-12-02T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.156822 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.156878 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.156893 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.156914 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.156927 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:28Z","lastTransitionTime":"2025-12-02T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.242085 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovnkube-controller/1.log" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.242760 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovnkube-controller/0.log" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.251772 4878 generic.go:334] "Generic (PLEG): container finished" podID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerID="1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289" exitCode=1 Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.251849 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerDied","Data":"1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289"} Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.253003 4878 scope.go:117] "RemoveContainer" containerID="51b5bf1b7fad01925b51836ad8a58114876a73fbbdae525317cf4383120f3891" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.254513 4878 scope.go:117] "RemoveContainer" containerID="1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289" Dec 02 18:15:28 crc kubenswrapper[4878]: E1202 18:15:28.254777 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-x5jzn_openshift-ovn-kubernetes(d160cfa4-9e2a-429d-b760-0cac6d467b9a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.255092 4878 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz"] Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.255912 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.260037 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.261694 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.261745 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.261753 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.261771 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.261783 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:28Z","lastTransitionTime":"2025-12-02T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.272547 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.282102 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.297553 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.311185 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33175bd6-9016-4c27-a8e6-d96f75e9187c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gndfz\" (UID: \"33175bd6-9016-4c27-a8e6-d96f75e9187c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.311252 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33175bd6-9016-4c27-a8e6-d96f75e9187c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gndfz\" (UID: \"33175bd6-9016-4c27-a8e6-d96f75e9187c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" Dec 02 18:15:28 crc kubenswrapper[4878]: 
I1202 18:15:28.311370 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33175bd6-9016-4c27-a8e6-d96f75e9187c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gndfz\" (UID: \"33175bd6-9016-4c27-a8e6-d96f75e9187c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.311956 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wm9j\" (UniqueName: \"kubernetes.io/projected/33175bd6-9016-4c27-a8e6-d96f75e9187c-kube-api-access-9wm9j\") pod \"ovnkube-control-plane-749d76644c-gndfz\" (UID: \"33175bd6-9016-4c27-a8e6-d96f75e9187c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.321603 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.337020 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.357455 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51b5bf1b7fad01925b51836ad8a58114876a73fbbdae525317cf4383120f3891\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:25Z\\\",\\\"message\\\":\\\") from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 18:15:25.753679 6189 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:15:25.754076 6189 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:15:25.754381 6189 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 18:15:25.754416 6189 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 18:15:25.754490 6189 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 18:15:25.754521 6189 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 18:15:25.754541 6189 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 18:15:25.754546 6189 factory.go:656] Stopping watch factory\\\\nI1202 18:15:25.754570 6189 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 18:15:25.754726 6189 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 18:15:27.078343 6344 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1202 18:15:27.078373 6344 services_controller.go:452] Built service openshift-kube-controller-manager/kube-controller-manager per-node LB for network=default: []services.LB{}\\\\nI1202 18:15:27.078382 6344 services_controller.go:453] Built service openshift-kube-controller-manager/kube-controller-manager template LB for 
network=def\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b
879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.364866 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.364934 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.364953 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.364977 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.364994 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:28Z","lastTransitionTime":"2025-12-02T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.371514 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b
8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.386431 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.396693 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.412433 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.413453 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33175bd6-9016-4c27-a8e6-d96f75e9187c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gndfz\" (UID: \"33175bd6-9016-4c27-a8e6-d96f75e9187c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.413515 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wm9j\" (UniqueName: \"kubernetes.io/projected/33175bd6-9016-4c27-a8e6-d96f75e9187c-kube-api-access-9wm9j\") pod \"ovnkube-control-plane-749d76644c-gndfz\" (UID: \"33175bd6-9016-4c27-a8e6-d96f75e9187c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.413566 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33175bd6-9016-4c27-a8e6-d96f75e9187c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gndfz\" (UID: \"33175bd6-9016-4c27-a8e6-d96f75e9187c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.413589 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33175bd6-9016-4c27-a8e6-d96f75e9187c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gndfz\" (UID: \"33175bd6-9016-4c27-a8e6-d96f75e9187c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.414112 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33175bd6-9016-4c27-a8e6-d96f75e9187c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gndfz\" (UID: \"33175bd6-9016-4c27-a8e6-d96f75e9187c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.414369 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33175bd6-9016-4c27-a8e6-d96f75e9187c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gndfz\" (UID: \"33175bd6-9016-4c27-a8e6-d96f75e9187c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.420844 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33175bd6-9016-4c27-a8e6-d96f75e9187c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gndfz\" (UID: \"33175bd6-9016-4c27-a8e6-d96f75e9187c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.428365 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.430786 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wm9j\" (UniqueName: \"kubernetes.io/projected/33175bd6-9016-4c27-a8e6-d96f75e9187c-kube-api-access-9wm9j\") pod \"ovnkube-control-plane-749d76644c-gndfz\" (UID: \"33175bd6-9016-4c27-a8e6-d96f75e9187c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.444104 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.458555 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.467536 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.467599 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.467615 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.467642 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.467657 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:28Z","lastTransitionTime":"2025-12-02T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.471767 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c91ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.484001 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.505493 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6af
f51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.524303 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.540079 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.559639 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.570703 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.570740 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.570750 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.570767 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.570776 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:28Z","lastTransitionTime":"2025-12-02T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.574309 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.585403 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.591997 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51b5bf1b7fad01925b51836ad8a58114876a73fbbdae525317cf4383120f3891\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:25Z\\\",\\\"message\\\":\\\") from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 18:15:25.753679 6189 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:15:25.754076 6189 
reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:15:25.754381 6189 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 18:15:25.754416 6189 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 18:15:25.754490 6189 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 18:15:25.754521 6189 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 18:15:25.754541 6189 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 18:15:25.754546 6189 factory.go:656] Stopping watch factory\\\\nI1202 18:15:25.754570 6189 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 18:15:25.754726 6189 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 18:15:27.078343 6344 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", 
UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1202 18:15:27.078373 6344 services_controller.go:452] Built service openshift-kube-controller-manager/kube-controller-manager per-node LB for network=default: []services.LB{}\\\\nI1202 18:15:27.078382 6344 services_controller.go:453] Built service openshift-kube-controller-manager/kube-controller-manager template LB for 
network=def\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b
879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: W1202 18:15:28.601569 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33175bd6_9016_4c27_a8e6_d96f75e9187c.slice/crio-4a84c6ebc961874f60f7bd3c477bb07eba1a0e66bb35bafc271f0239e42fba25 WatchSource:0}: Error finding container 4a84c6ebc961874f60f7bd3c477bb07eba1a0e66bb35bafc271f0239e42fba25: Status 404 returned error can't find the container with id 4a84c6ebc961874f60f7bd3c477bb07eba1a0e66bb35bafc271f0239e42fba25 Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.606556 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.618320 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.634115 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T1
8:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.651462 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.661837 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.676279 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.676323 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.676345 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.676366 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.676379 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:28Z","lastTransitionTime":"2025-12-02T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.678431 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.690576 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.701318 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33175bd6-9016-4c27-a8e6-d96f75e9187c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gndfz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.713766 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.724830 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.740373 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:28Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.778971 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.779016 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.779030 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.779051 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.779062 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:28Z","lastTransitionTime":"2025-12-02T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.881829 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.881881 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.881893 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.881910 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.881976 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:28Z","lastTransitionTime":"2025-12-02T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.985134 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.985189 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.985324 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.985348 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:28 crc kubenswrapper[4878]: I1202 18:15:28.985395 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:28Z","lastTransitionTime":"2025-12-02T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.088378 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.088423 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.088439 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.088457 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.088470 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:29Z","lastTransitionTime":"2025-12-02T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.190631 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.190680 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.190692 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.190709 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.190721 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:29Z","lastTransitionTime":"2025-12-02T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.258340 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" event={"ID":"33175bd6-9016-4c27-a8e6-d96f75e9187c","Type":"ContainerStarted","Data":"50cf37f029d8f77c6ef633b7badd2df76b31de67a881ee85c84bd1d4bc92e240"} Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.258394 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" event={"ID":"33175bd6-9016-4c27-a8e6-d96f75e9187c","Type":"ContainerStarted","Data":"261ba12597a9d2c536a581ebb211573ff8a502bacefbeb7510d52075e6392b56"} Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.258406 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" event={"ID":"33175bd6-9016-4c27-a8e6-d96f75e9187c","Type":"ContainerStarted","Data":"4a84c6ebc961874f60f7bd3c477bb07eba1a0e66bb35bafc271f0239e42fba25"} Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.260881 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovnkube-controller/1.log" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.277933 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.293177 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.293233 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.293272 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.293299 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.293317 4878 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:29Z","lastTransitionTime":"2025-12-02T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.302665 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.329856 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec3
0733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.353462 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.355581 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-dlwt8"] Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.356153 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.356262 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.373094 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c91ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.384385 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.397442 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.397492 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.397504 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.397530 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.397543 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:29Z","lastTransitionTime":"2025-12-02T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.403485 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\
\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.419712 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.425174 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs\") pod \"network-metrics-daemon-dlwt8\" (UID: \"09adc15b-14dd-4a05-b569-4168b9ced169\") " pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.425250 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czrs2\" (UniqueName: \"kubernetes.io/projected/09adc15b-14dd-4a05-b569-4168b9ced169-kube-api-access-czrs2\") pod \"network-metrics-daemon-dlwt8\" (UID: \"09adc15b-14dd-4a05-b569-4168b9ced169\") " pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.433994 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33175bd6-9016-4c27-a8e6-d96f75e9187c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://261ba12597a9d2c536a581ebb211573ff8a502bacefbeb7510d52075e6392b56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50cf37f029d8f77c6ef633b7badd2df76b31d
e67a881ee85c84bd1d4bc92e240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gndfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.449586 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.464859 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.481522 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.500604 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.500654 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.500671 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.500693 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.500711 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:29Z","lastTransitionTime":"2025-12-02T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.500931 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.523585 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51b5bf1b7fad01925b51836ad8a58114876a73fbbdae525317cf4383120f3891\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:25Z\\\",\\\"message\\\":\\\") from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 18:15:25.753679 6189 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:15:25.754076 6189 
reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:15:25.754381 6189 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 18:15:25.754416 6189 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 18:15:25.754490 6189 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 18:15:25.754521 6189 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 18:15:25.754541 6189 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 18:15:25.754546 6189 factory.go:656] Stopping watch factory\\\\nI1202 18:15:25.754570 6189 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 18:15:25.754726 6189 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 18:15:27.078343 6344 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", 
UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1202 18:15:27.078373 6344 services_controller.go:452] Built service openshift-kube-controller-manager/kube-controller-manager per-node LB for network=default: []services.LB{}\\\\nI1202 18:15:27.078382 6344 services_controller.go:453] Built service openshift-kube-controller-manager/kube-controller-manager template LB for 
network=def\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b
879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.525940 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs\") pod \"network-metrics-daemon-dlwt8\" (UID: \"09adc15b-14dd-4a05-b569-4168b9ced169\") " pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.526006 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czrs2\" (UniqueName: \"kubernetes.io/projected/09adc15b-14dd-4a05-b569-4168b9ced169-kube-api-access-czrs2\") pod \"network-metrics-daemon-dlwt8\" (UID: \"09adc15b-14dd-4a05-b569-4168b9ced169\") " pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.526160 4878 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.526296 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs podName:09adc15b-14dd-4a05-b569-4168b9ced169 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:30.026271004 +0000 UTC m=+39.715889885 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs") pod "network-metrics-daemon-dlwt8" (UID: "09adc15b-14dd-4a05-b569-4168b9ced169") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.542897 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czrs2\" (UniqueName: \"kubernetes.io/projected/09adc15b-14dd-4a05-b569-4168b9ced169-kube-api-access-czrs2\") pod \"network-metrics-daemon-dlwt8\" (UID: \"09adc15b-14dd-4a05-b569-4168b9ced169\") " pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.543695 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.559050 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.580980 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.602070 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.607902 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.607972 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.608005 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.608037 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.608057 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:29Z","lastTransitionTime":"2025-12-02T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.626782 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.626976 4878 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.627067 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:45.627045168 +0000 UTC m=+55.316664049 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.627664 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51b5bf1b7fad01925b51836ad8a58114876a73fbbdae525317cf4383120f3891\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:25Z\\\",\\\"message\\\":\\\") from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 18:15:25.753679 6189 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:15:25.754076 6189 
reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:15:25.754381 6189 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 18:15:25.754416 6189 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 18:15:25.754490 6189 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 18:15:25.754521 6189 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 18:15:25.754541 6189 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 18:15:25.754546 6189 factory.go:656] Stopping watch factory\\\\nI1202 18:15:25.754570 6189 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 18:15:25.754726 6189 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 18:15:27.078343 6344 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", 
UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1202 18:15:27.078373 6344 services_controller.go:452] Built service openshift-kube-controller-manager/kube-controller-manager per-node LB for network=default: []services.LB{}\\\\nI1202 18:15:27.078382 6344 services_controller.go:453] Built service openshift-kube-controller-manager/kube-controller-manager template LB for 
network=def\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b
879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.643476 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec3
0733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.659651 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.673416 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.689289 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.703658 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.711439 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.711490 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.711506 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.711534 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.711552 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:29Z","lastTransitionTime":"2025-12-02T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.721008 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.727564 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.727802 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:15:45.727759191 +0000 UTC m=+55.417378082 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.727844 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.727884 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.727920 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.728021 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 18:15:29 crc 
kubenswrapper[4878]: E1202 18:15:29.728030 4878 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.728043 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.728059 4878 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.728099 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:45.728089102 +0000 UTC m=+55.417707993 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.728021 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.728119 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:45.728108813 +0000 UTC m=+55.417727704 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.728127 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.728136 4878 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.728170 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:45.728162274 +0000 UTC m=+55.417781155 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.732686 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.732716 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.732726 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.732745 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.732757 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:29Z","lastTransitionTime":"2025-12-02T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.738145 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c91ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.749421 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.753012 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.756858 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.756912 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.756926 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.756947 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.756961 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:29Z","lastTransitionTime":"2025-12-02T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.767781 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.771267 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.775068 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.775132 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.775145 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.775163 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.775175 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:29Z","lastTransitionTime":"2025-12-02T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.785986 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z 
is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.788940 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.792832 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.792866 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.792878 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.792896 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.792908 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:29Z","lastTransitionTime":"2025-12-02T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.799977 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33175bd6-9016-4c27-a8e6-d96f75e9187c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://261ba12597a9d2c536a581ebb211573ff8a502bacefbeb7510d52075e6392b56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50cf37f029d8f77c6ef633b7badd2df76b31de67a881ee85c84bd1d4bc92e240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gndfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.806690 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.809682 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.809721 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.809735 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.809755 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.809767 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:29Z","lastTransitionTime":"2025-12-02T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.811959 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dlwt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09adc15b-14dd-4a05-b569-4168b9ced169\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dlwt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc 
kubenswrapper[4878]: E1202 18:15:29.825184 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.825383 4878 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.826732 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.826764 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.826776 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.826790 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.826802 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:29Z","lastTransitionTime":"2025-12-02T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.829140 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.842033 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:29Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.929757 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.929824 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.929841 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.929865 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.929883 4878 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:29Z","lastTransitionTime":"2025-12-02T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.937161 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.937161 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:29 crc kubenswrapper[4878]: I1202 18:15:29.937287 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.937441 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.937492 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:29 crc kubenswrapper[4878]: E1202 18:15:29.938024 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.031853 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs\") pod \"network-metrics-daemon-dlwt8\" (UID: \"09adc15b-14dd-4a05-b569-4168b9ced169\") " pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:30 crc kubenswrapper[4878]: E1202 18:15:30.033284 4878 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 18:15:30 crc kubenswrapper[4878]: E1202 18:15:30.033395 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs podName:09adc15b-14dd-4a05-b569-4168b9ced169 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:31.0333705 +0000 UTC m=+40.722989411 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs") pod "network-metrics-daemon-dlwt8" (UID: "09adc15b-14dd-4a05-b569-4168b9ced169") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.033930 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.034061 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.034083 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.034117 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.034136 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:30Z","lastTransitionTime":"2025-12-02T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.137758 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.137814 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.137825 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.137843 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.137857 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:30Z","lastTransitionTime":"2025-12-02T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.240543 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.240608 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.240619 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.240638 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.240649 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:30Z","lastTransitionTime":"2025-12-02T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.342843 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.342898 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.342910 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.342949 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.342964 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:30Z","lastTransitionTime":"2025-12-02T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.446331 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.446426 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.446443 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.446474 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.446498 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:30Z","lastTransitionTime":"2025-12-02T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.550152 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.550215 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.550233 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.550304 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.550326 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:30Z","lastTransitionTime":"2025-12-02T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.654343 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.654456 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.654481 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.654511 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.654535 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:30Z","lastTransitionTime":"2025-12-02T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.757778 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.757888 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.757912 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.757941 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.757962 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:30Z","lastTransitionTime":"2025-12-02T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.860825 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.860877 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.860889 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.860905 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.860916 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:30Z","lastTransitionTime":"2025-12-02T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.936922 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:30 crc kubenswrapper[4878]: E1202 18:15:30.937128 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.954385 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:30Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.963488 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.963540 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.963551 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.963570 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.963581 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:30Z","lastTransitionTime":"2025-12-02T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.972023 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:30Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:30 crc kubenswrapper[4878]: I1202 18:15:30.994684 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:30Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.010686 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:31Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.030486 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51b5bf1b7fad01925b51836ad8a58114876a73fbbdae525317cf4383120f3891\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:25Z\\\",\\\"message\\\":\\\") from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 18:15:25.753679 6189 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:15:25.754076 6189 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:15:25.754381 6189 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 18:15:25.754416 6189 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 18:15:25.754490 6189 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 18:15:25.754521 6189 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 18:15:25.754541 6189 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 18:15:25.754546 6189 factory.go:656] Stopping watch factory\\\\nI1202 18:15:25.754570 6189 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 18:15:25.754726 6189 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 18:15:27.078343 6344 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1202 18:15:27.078373 6344 services_controller.go:452] Built service openshift-kube-controller-manager/kube-controller-manager per-node LB for network=default: []services.LB{}\\\\nI1202 18:15:27.078382 6344 services_controller.go:453] Built service openshift-kube-controller-manager/kube-controller-manager template LB for 
network=def\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b
879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:31Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.045152 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs\") pod \"network-metrics-daemon-dlwt8\" (UID: \"09adc15b-14dd-4a05-b569-4168b9ced169\") " pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:31 crc kubenswrapper[4878]: E1202 18:15:31.045318 4878 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 18:15:31 crc kubenswrapper[4878]: E1202 18:15:31.045390 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs podName:09adc15b-14dd-4a05-b569-4168b9ced169 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:33.045372563 +0000 UTC m=+42.734991454 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs") pod "network-metrics-daemon-dlwt8" (UID: "09adc15b-14dd-4a05-b569-4168b9ced169") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.047722 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay
.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad1
6c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:31Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.063502 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:31Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.065978 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.066017 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.066030 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.066050 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.066060 4878 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:31Z","lastTransitionTime":"2025-12-02T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.075616 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:31Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.092503 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235
a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:31Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.105757 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:31Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.119581 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33175bd6-9016-4c27-a8e6-d96f75e9187c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://261ba12597a9d2c536a581ebb211573ff8a502bacefbeb7510d52075e6392b56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50cf37f029d8f77c6ef633b7badd2df76b31de67a881ee85c84bd1d4bc92e240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gndfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-02T18:15:31Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.135480 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager
-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:31Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.147838 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:31Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.173256 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.173291 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.173303 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.173317 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.173327 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:31Z","lastTransitionTime":"2025-12-02T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.177670 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:31Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.194292 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:31Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.213429 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:31Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.227332 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dlwt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09adc15b-14dd-4a05-b569-4168b9ced169\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dlwt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:31Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:31 crc 
kubenswrapper[4878]: I1202 18:15:31.277516 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.277565 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.277576 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.277594 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.277641 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:31Z","lastTransitionTime":"2025-12-02T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.380722 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.380761 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.380769 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.380784 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.380793 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:31Z","lastTransitionTime":"2025-12-02T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.484038 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.484084 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.484100 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.484119 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.484132 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:31Z","lastTransitionTime":"2025-12-02T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.586965 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.587017 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.587028 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.587042 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.587056 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:31Z","lastTransitionTime":"2025-12-02T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.689502 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.689560 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.689571 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.689586 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.689597 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:31Z","lastTransitionTime":"2025-12-02T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.793062 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.793101 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.793113 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.793132 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.793144 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:31Z","lastTransitionTime":"2025-12-02T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.896368 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.896460 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.896480 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.896506 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.896527 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:31Z","lastTransitionTime":"2025-12-02T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.937501 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.937535 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:31 crc kubenswrapper[4878]: E1202 18:15:31.937661 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:31 crc kubenswrapper[4878]: E1202 18:15:31.937694 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:31 crc kubenswrapper[4878]: I1202 18:15:31.937708 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:31 crc kubenswrapper[4878]: E1202 18:15:31.937888 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.000284 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.000334 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.000346 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.000368 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.000382 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:32Z","lastTransitionTime":"2025-12-02T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.102891 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.102949 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.102968 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.102991 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.103008 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:32Z","lastTransitionTime":"2025-12-02T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.206946 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.206998 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.207010 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.207031 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.207043 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:32Z","lastTransitionTime":"2025-12-02T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.310297 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.310338 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.310351 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.310370 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.310383 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:32Z","lastTransitionTime":"2025-12-02T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.413455 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.413550 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.413573 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.413615 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.413639 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:32Z","lastTransitionTime":"2025-12-02T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.517112 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.517196 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.517215 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.517280 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.517303 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:32Z","lastTransitionTime":"2025-12-02T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.620986 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.621057 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.621080 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.621109 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.621131 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:32Z","lastTransitionTime":"2025-12-02T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.725538 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.725602 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.725615 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.725634 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.725648 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:32Z","lastTransitionTime":"2025-12-02T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.828286 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.828331 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.828340 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.828355 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.828365 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:32Z","lastTransitionTime":"2025-12-02T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.930885 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.930961 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.930978 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.931003 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.931021 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:32Z","lastTransitionTime":"2025-12-02T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:32 crc kubenswrapper[4878]: I1202 18:15:32.937430 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:32 crc kubenswrapper[4878]: E1202 18:15:32.937614 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.033743 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.033786 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.033800 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.033816 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.033828 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:33Z","lastTransitionTime":"2025-12-02T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.067923 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs\") pod \"network-metrics-daemon-dlwt8\" (UID: \"09adc15b-14dd-4a05-b569-4168b9ced169\") " pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:33 crc kubenswrapper[4878]: E1202 18:15:33.068116 4878 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 18:15:33 crc kubenswrapper[4878]: E1202 18:15:33.068177 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs podName:09adc15b-14dd-4a05-b569-4168b9ced169 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:37.068162262 +0000 UTC m=+46.757781143 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs") pod "network-metrics-daemon-dlwt8" (UID: "09adc15b-14dd-4a05-b569-4168b9ced169") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.136498 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.136582 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.136598 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.136619 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.136638 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:33Z","lastTransitionTime":"2025-12-02T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.240426 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.240485 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.240500 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.240523 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.240538 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:33Z","lastTransitionTime":"2025-12-02T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.344461 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.344544 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.344566 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.344594 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.344617 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:33Z","lastTransitionTime":"2025-12-02T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.448470 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.448516 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.448527 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.448548 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.448562 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:33Z","lastTransitionTime":"2025-12-02T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.551115 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.551215 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.551228 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.551467 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.551478 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:33Z","lastTransitionTime":"2025-12-02T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.655188 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.655280 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.655300 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.655331 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.655351 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:33Z","lastTransitionTime":"2025-12-02T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.757279 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.757320 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.757332 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.757348 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.757360 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:33Z","lastTransitionTime":"2025-12-02T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.859718 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.859774 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.859790 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.859814 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.859832 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:33Z","lastTransitionTime":"2025-12-02T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.936811 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:33 crc kubenswrapper[4878]: E1202 18:15:33.936960 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.936836 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.937019 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:33 crc kubenswrapper[4878]: E1202 18:15:33.937044 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:33 crc kubenswrapper[4878]: E1202 18:15:33.937215 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.962121 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.962171 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.962182 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.962205 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:33 crc kubenswrapper[4878]: I1202 18:15:33.962219 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:33Z","lastTransitionTime":"2025-12-02T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.065733 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.065784 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.065797 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.065814 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.065824 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:34Z","lastTransitionTime":"2025-12-02T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.169207 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.169309 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.169326 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.169352 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.169367 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:34Z","lastTransitionTime":"2025-12-02T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.272225 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.272299 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.272314 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.272336 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.272347 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:34Z","lastTransitionTime":"2025-12-02T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.374679 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.374737 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.374757 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.374778 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.374791 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:34Z","lastTransitionTime":"2025-12-02T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.477301 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.477353 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.477368 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.477391 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.477407 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:34Z","lastTransitionTime":"2025-12-02T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.580416 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.580479 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.580496 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.580520 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.580537 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:34Z","lastTransitionTime":"2025-12-02T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.683632 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.683688 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.683703 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.683722 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.683733 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:34Z","lastTransitionTime":"2025-12-02T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.787119 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.787205 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.787272 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.787304 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.787326 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:34Z","lastTransitionTime":"2025-12-02T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.889896 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.889927 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.889951 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.889965 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.889973 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:34Z","lastTransitionTime":"2025-12-02T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.937832 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:34 crc kubenswrapper[4878]: E1202 18:15:34.937986 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.992372 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.992413 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.992436 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.992452 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:34 crc kubenswrapper[4878]: I1202 18:15:34.992463 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:34Z","lastTransitionTime":"2025-12-02T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.095945 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.095999 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.096012 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.096031 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.096046 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:35Z","lastTransitionTime":"2025-12-02T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.199080 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.199140 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.199203 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.199269 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.199296 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:35Z","lastTransitionTime":"2025-12-02T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.301892 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.301948 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.301967 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.301990 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.302001 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:35Z","lastTransitionTime":"2025-12-02T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.405137 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.405211 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.405227 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.405276 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.405293 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:35Z","lastTransitionTime":"2025-12-02T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.508490 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.508550 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.508562 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.508583 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.508597 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:35Z","lastTransitionTime":"2025-12-02T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.611781 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.611845 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.611862 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.611885 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.611903 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:35Z","lastTransitionTime":"2025-12-02T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.714757 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.714807 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.714820 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.714844 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.714858 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:35Z","lastTransitionTime":"2025-12-02T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.817161 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.817276 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.817290 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.817337 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.817349 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:35Z","lastTransitionTime":"2025-12-02T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.921029 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.921101 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.921115 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.921137 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.921153 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:35Z","lastTransitionTime":"2025-12-02T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.937457 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.937530 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:35 crc kubenswrapper[4878]: I1202 18:15:35.937492 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:35 crc kubenswrapper[4878]: E1202 18:15:35.937656 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:35 crc kubenswrapper[4878]: E1202 18:15:35.937792 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:35 crc kubenswrapper[4878]: E1202 18:15:35.937954 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.028195 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.028278 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.028295 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.028311 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.028324 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:36Z","lastTransitionTime":"2025-12-02T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.134222 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.134343 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.134365 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.134393 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.134412 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:36Z","lastTransitionTime":"2025-12-02T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.237114 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.237178 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.237190 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.237214 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.237230 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:36Z","lastTransitionTime":"2025-12-02T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.340198 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.340300 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.340321 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.340538 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.340578 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:36Z","lastTransitionTime":"2025-12-02T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.444104 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.444173 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.444196 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.444224 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.444276 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:36Z","lastTransitionTime":"2025-12-02T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.546792 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.546828 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.546840 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.546857 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.546868 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:36Z","lastTransitionTime":"2025-12-02T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.576060 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.577942 4878 scope.go:117] "RemoveContainer" containerID="1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289" Dec 02 18:15:36 crc kubenswrapper[4878]: E1202 18:15:36.578116 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-x5jzn_openshift-ovn-kubernetes(d160cfa4-9e2a-429d-b760-0cac6d467b9a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.609495 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:36Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.624913 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:36Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.650053 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.650136 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.650157 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.650189 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.650209 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:36Z","lastTransitionTime":"2025-12-02T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.656617 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 18:15:27.078343 6344 services_controller.go:451] Built service 
openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1202 18:15:27.078373 6344 services_controller.go:452] Built service openshift-kube-controller-manager/kube-controller-manager per-node LB for network=default: []services.LB{}\\\\nI1202 18:15:27.078382 6344 services_controller.go:453] Built service openshift-kube-controller-manager/kube-controller-manager template LB for network=def\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5jzn_openshift-ovn-kubernetes(d160cfa4-9e2a-429d-b760-0cac6d467b9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f01
84749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:36Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.673180 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:36Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.694563 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T1
8:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:36Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.713615 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:36Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.727179 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:36Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.747856 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6af
f51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:36Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.752410 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.752437 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.752450 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.752466 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.752478 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:36Z","lastTransitionTime":"2025-12-02T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.767500 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:36Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.783587 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33175bd6-9016-4c27-a8e6-d96f75e9187c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://261ba12597a9d2c536a581ebb211573ff8a502bacefbeb7510d52075e6392b56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50cf37f029d8f77c6ef633b7badd2df76b31de67a881ee85c84bd1d4bc92e240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gndfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:36Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:36 crc 
kubenswrapper[4878]: I1202 18:15:36.804605 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:36Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.818906 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:36Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.834280 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:36Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.848158 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:36Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.855511 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.855566 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.855585 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:36 crc 
kubenswrapper[4878]: I1202 18:15:36.855614 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.855635 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:36Z","lastTransitionTime":"2025-12-02T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.862010 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dlwt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09adc15b-14dd-4a05-b569-4168b9ced169\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dlwt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:36Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:36 crc 
kubenswrapper[4878]: I1202 18:15:36.876934 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:36Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.891509 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:36Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.937538 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:36 crc kubenswrapper[4878]: E1202 18:15:36.937734 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.957940 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.957979 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.958047 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.958065 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:36 crc kubenswrapper[4878]: I1202 18:15:36.958075 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:36Z","lastTransitionTime":"2025-12-02T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.061362 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.061440 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.061451 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.061471 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.061481 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:37Z","lastTransitionTime":"2025-12-02T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.112672 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs\") pod \"network-metrics-daemon-dlwt8\" (UID: \"09adc15b-14dd-4a05-b569-4168b9ced169\") " pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:37 crc kubenswrapper[4878]: E1202 18:15:37.112826 4878 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 18:15:37 crc kubenswrapper[4878]: E1202 18:15:37.112904 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs podName:09adc15b-14dd-4a05-b569-4168b9ced169 nodeName:}" failed. No retries permitted until 2025-12-02 18:15:45.112883411 +0000 UTC m=+54.802502312 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs") pod "network-metrics-daemon-dlwt8" (UID: "09adc15b-14dd-4a05-b569-4168b9ced169") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.165465 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.165547 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.165603 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.165636 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.165666 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:37Z","lastTransitionTime":"2025-12-02T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.268725 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.268803 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.268826 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.268855 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.268878 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:37Z","lastTransitionTime":"2025-12-02T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.371738 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.371811 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.371838 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.371868 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.371892 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:37Z","lastTransitionTime":"2025-12-02T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.475053 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.475101 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.475112 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.475128 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.475139 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:37Z","lastTransitionTime":"2025-12-02T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.578498 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.578583 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.578604 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.578630 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.578683 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:37Z","lastTransitionTime":"2025-12-02T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.681742 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.681875 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.681949 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.681980 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.682001 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:37Z","lastTransitionTime":"2025-12-02T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.784979 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.785041 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.785055 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.785075 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.785091 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:37Z","lastTransitionTime":"2025-12-02T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.887912 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.887979 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.887996 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.888022 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.888044 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:37Z","lastTransitionTime":"2025-12-02T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.937601 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.937673 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.937629 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:37 crc kubenswrapper[4878]: E1202 18:15:37.937809 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:37 crc kubenswrapper[4878]: E1202 18:15:37.938004 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:37 crc kubenswrapper[4878]: E1202 18:15:37.938207 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.991824 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.991884 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.991902 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.991935 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:37 crc kubenswrapper[4878]: I1202 18:15:37.991959 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:37Z","lastTransitionTime":"2025-12-02T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.095496 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.095569 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.095613 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.095639 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.095653 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:38Z","lastTransitionTime":"2025-12-02T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.198264 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.198321 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.198340 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.198369 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.198383 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:38Z","lastTransitionTime":"2025-12-02T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.300743 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.300796 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.300806 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.300828 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.300840 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:38Z","lastTransitionTime":"2025-12-02T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.403156 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.403210 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.403221 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.403267 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.403280 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:38Z","lastTransitionTime":"2025-12-02T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.506125 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.506182 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.506192 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.506212 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.506224 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:38Z","lastTransitionTime":"2025-12-02T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.609128 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.609191 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.609202 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.609215 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.609224 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:38Z","lastTransitionTime":"2025-12-02T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.713109 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.713177 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.713195 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.713221 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.713340 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:38Z","lastTransitionTime":"2025-12-02T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.816356 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.816423 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.816442 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.816473 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.816491 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:38Z","lastTransitionTime":"2025-12-02T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.920638 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.920716 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.920736 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.920768 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.920790 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:38Z","lastTransitionTime":"2025-12-02T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:38 crc kubenswrapper[4878]: I1202 18:15:38.938046 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:38 crc kubenswrapper[4878]: E1202 18:15:38.938290 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.024084 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.024149 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.024169 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.024197 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.024320 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:39Z","lastTransitionTime":"2025-12-02T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.127172 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.127245 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.127255 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.127277 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.127290 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:39Z","lastTransitionTime":"2025-12-02T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.230344 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.230477 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.230497 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.230522 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.230541 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:39Z","lastTransitionTime":"2025-12-02T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.333384 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.333491 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.333503 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.333526 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.333542 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:39Z","lastTransitionTime":"2025-12-02T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.437348 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.437394 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.437412 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.437430 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.437442 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:39Z","lastTransitionTime":"2025-12-02T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.540798 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.540837 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.540847 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.540865 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.540878 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:39Z","lastTransitionTime":"2025-12-02T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.644521 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.644590 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.644614 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.644638 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.644653 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:39Z","lastTransitionTime":"2025-12-02T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.746777 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.746828 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.746839 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.746855 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.746868 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:39Z","lastTransitionTime":"2025-12-02T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.850021 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.850093 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.850118 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.850150 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.850173 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:39Z","lastTransitionTime":"2025-12-02T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.937508 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.937588 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.937537 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:39 crc kubenswrapper[4878]: E1202 18:15:39.937682 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:39 crc kubenswrapper[4878]: E1202 18:15:39.937845 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:39 crc kubenswrapper[4878]: E1202 18:15:39.937928 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.954659 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.954710 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.954723 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.954743 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:39 crc kubenswrapper[4878]: I1202 18:15:39.954755 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:39Z","lastTransitionTime":"2025-12-02T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.057197 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.057279 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.057293 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.057335 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.057347 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:40Z","lastTransitionTime":"2025-12-02T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.071513 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.071567 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.071581 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.071599 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.071611 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:40Z","lastTransitionTime":"2025-12-02T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:40 crc kubenswrapper[4878]: E1202 18:15:40.094732 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:40Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.100508 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.100563 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.100576 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.100596 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.100610 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:40Z","lastTransitionTime":"2025-12-02T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:40 crc kubenswrapper[4878]: E1202 18:15:40.117815 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:40Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.123016 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.123096 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.123121 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.123158 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.123182 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:40Z","lastTransitionTime":"2025-12-02T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:40 crc kubenswrapper[4878]: E1202 18:15:40.141546 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:40Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.145998 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.146042 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.146057 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.146078 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.146092 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:40Z","lastTransitionTime":"2025-12-02T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:40 crc kubenswrapper[4878]: E1202 18:15:40.163493 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:40Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.169937 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.170017 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.170035 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.170054 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.170070 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:40Z","lastTransitionTime":"2025-12-02T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:40 crc kubenswrapper[4878]: E1202 18:15:40.186768 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:40Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:40 crc kubenswrapper[4878]: E1202 18:15:40.186905 4878 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.188779 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.188810 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.188821 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.188841 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.188854 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:40Z","lastTransitionTime":"2025-12-02T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.292457 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.292524 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.292543 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.292570 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.292591 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:40Z","lastTransitionTime":"2025-12-02T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.396085 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.396166 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.396189 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.396231 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.396288 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:40Z","lastTransitionTime":"2025-12-02T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.499286 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.499345 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.499358 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.499376 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.499388 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:40Z","lastTransitionTime":"2025-12-02T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.602576 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.602639 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.602654 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.602676 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.602690 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:40Z","lastTransitionTime":"2025-12-02T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.706405 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.706464 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.706481 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.706508 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.706527 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:40Z","lastTransitionTime":"2025-12-02T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.809615 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.809676 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.809693 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.809716 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.809733 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:40Z","lastTransitionTime":"2025-12-02T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.913129 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.913226 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.913278 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.913303 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.913319 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:40Z","lastTransitionTime":"2025-12-02T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.937581 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:40 crc kubenswrapper[4878]: E1202 18:15:40.937776 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.972378 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:40Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:40 crc kubenswrapper[4878]: I1202 18:15:40.994856 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:40Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.017149 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.017201 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.017214 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.017233 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.017265 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:41Z","lastTransitionTime":"2025-12-02T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.028297 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 18:15:27.078343 6344 services_controller.go:451] Built service 
openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1202 18:15:27.078373 6344 services_controller.go:452] Built service openshift-kube-controller-manager/kube-controller-manager per-node LB for network=default: []services.LB{}\\\\nI1202 18:15:27.078382 6344 services_controller.go:453] Built service openshift-kube-controller-manager/kube-controller-manager template LB for network=def\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5jzn_openshift-ovn-kubernetes(d160cfa4-9e2a-429d-b760-0cac6d467b9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f01
84749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:41Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.048331 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:41Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.072140 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T1
8:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:41Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.094203 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:41Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.112935 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:41Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.119918 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.120005 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.120020 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.120038 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.120051 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:41Z","lastTransitionTime":"2025-12-02T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.132112 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:41Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.150778 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:41Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.165187 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33175bd6-9016-4c27-a8e6-d96f75e9187c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://261ba12597a9d2c536a581ebb211573ff8a502bacefbeb7510d52075e6392b56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50cf37f029d8f77c6ef633b7badd2df76b31de67a881ee85c84bd1d4bc92e240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gndfz\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:41Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.181013 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:41Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.195937 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:41Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.211886 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:41Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.224275 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.224333 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.224352 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.224382 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.224401 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:41Z","lastTransitionTime":"2025-12-02T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.229396 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c91ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:41Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.243773 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dlwt8" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09adc15b-14dd-4a05-b569-4168b9ced169\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dlwt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:41Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:41 crc 
kubenswrapper[4878]: I1202 18:15:41.261544 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:41Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.278279 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:41Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.327050 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.327115 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.327133 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.327159 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.327177 4878 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:41Z","lastTransitionTime":"2025-12-02T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.431856 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.431903 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.431917 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.431938 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.431963 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:41Z","lastTransitionTime":"2025-12-02T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.534988 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.535033 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.535044 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.535060 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.535073 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:41Z","lastTransitionTime":"2025-12-02T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.637733 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.637784 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.637797 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.637814 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.637828 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:41Z","lastTransitionTime":"2025-12-02T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.741140 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.741201 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.741215 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.741302 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.741328 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:41Z","lastTransitionTime":"2025-12-02T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.844419 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.844486 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.844507 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.844538 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.844556 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:41Z","lastTransitionTime":"2025-12-02T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.937718 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:41 crc kubenswrapper[4878]: E1202 18:15:41.937877 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.937746 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.937738 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:41 crc kubenswrapper[4878]: E1202 18:15:41.938123 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:41 crc kubenswrapper[4878]: E1202 18:15:41.937968 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.948106 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.948175 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.948197 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.948228 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:41 crc kubenswrapper[4878]: I1202 18:15:41.948301 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:41Z","lastTransitionTime":"2025-12-02T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.051117 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.051430 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.051494 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.051627 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.051690 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:42Z","lastTransitionTime":"2025-12-02T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.155935 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.155993 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.156015 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.156034 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.156049 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:42Z","lastTransitionTime":"2025-12-02T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.259761 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.260111 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.260177 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.260272 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.260339 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:42Z","lastTransitionTime":"2025-12-02T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.362929 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.362996 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.363010 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.363036 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.363054 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:42Z","lastTransitionTime":"2025-12-02T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.466755 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.466804 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.466818 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.466838 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.466850 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:42Z","lastTransitionTime":"2025-12-02T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.570138 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.570199 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.570222 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.570317 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.570342 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:42Z","lastTransitionTime":"2025-12-02T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.673420 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.673516 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.673584 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.673609 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.673649 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:42Z","lastTransitionTime":"2025-12-02T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.782082 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.783073 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.783221 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.783404 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.783564 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:42Z","lastTransitionTime":"2025-12-02T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.886894 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.886973 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.886996 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.887027 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.887052 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:42Z","lastTransitionTime":"2025-12-02T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.936931 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:42 crc kubenswrapper[4878]: E1202 18:15:42.937132 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.989904 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.989967 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.989988 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.990014 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:42 crc kubenswrapper[4878]: I1202 18:15:42.990031 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:42Z","lastTransitionTime":"2025-12-02T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.093009 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.093064 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.093083 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.093106 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.093123 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:43Z","lastTransitionTime":"2025-12-02T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.196021 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.196081 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.196127 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.196157 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.196177 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:43Z","lastTransitionTime":"2025-12-02T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.299073 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.299134 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.299154 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.299183 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.299207 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:43Z","lastTransitionTime":"2025-12-02T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.410513 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.410587 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.410605 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.410634 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.410652 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:43Z","lastTransitionTime":"2025-12-02T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.513352 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.513399 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.513410 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.513430 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.513448 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:43Z","lastTransitionTime":"2025-12-02T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.616907 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.616963 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.616979 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.617002 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.617020 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:43Z","lastTransitionTime":"2025-12-02T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.635586 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.648657 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.664046 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:43Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.681571 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:43Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.695364 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33175bd6-9016-4c27-a8e6-d96f75e9187c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://261ba12597a9d2c536a581ebb211573ff8a502bacefbeb7510d52075e6392b56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50cf37f02
9d8f77c6ef633b7badd2df76b31de67a881ee85c84bd1d4bc92e240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gndfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:43Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.715835 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:43Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.720398 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.720453 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.720466 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.720487 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.720501 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:43Z","lastTransitionTime":"2025-12-02T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.736366 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:43Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.755289 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:43Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.770433 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:43Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.788335 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:43Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.805337 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dlwt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09adc15b-14dd-4a05-b569-4168b9ced169\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dlwt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:43Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:43 crc 
kubenswrapper[4878]: I1202 18:15:43.823498 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.823583 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.823607 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.823638 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.823660 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:43Z","lastTransitionTime":"2025-12-02T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.824520 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:43Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.849170 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:43Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.872978 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:43Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.889433 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:43Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.911500 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 18:15:27.078343 6344 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1202 18:15:27.078373 6344 services_controller.go:452] Built service openshift-kube-controller-manager/kube-controller-manager per-node LB for network=default: []services.LB{}\\\\nI1202 18:15:27.078382 6344 services_controller.go:453] Built service openshift-kube-controller-manager/kube-controller-manager template LB for network=def\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5jzn_openshift-ovn-kubernetes(d160cfa4-9e2a-429d-b760-0cac6d467b9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f01
84749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:43Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.926709 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.926808 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.926835 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.926876 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.926834 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec3
0733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:43Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.926908 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:43Z","lastTransitionTime":"2025-12-02T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.936982 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.937067 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:43 crc kubenswrapper[4878]: E1202 18:15:43.937175 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.937417 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:43 crc kubenswrapper[4878]: E1202 18:15:43.937475 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:43 crc kubenswrapper[4878]: E1202 18:15:43.937529 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.942892 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:43Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:43 crc kubenswrapper[4878]: I1202 18:15:43.958096 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:43Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.030890 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.030952 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.030971 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.031000 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.031019 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:44Z","lastTransitionTime":"2025-12-02T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.135608 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.135692 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.135716 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.135752 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.135781 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:44Z","lastTransitionTime":"2025-12-02T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.239299 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.239372 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.239394 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.239423 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.239441 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:44Z","lastTransitionTime":"2025-12-02T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.342609 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.342676 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.342733 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.342760 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.342781 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:44Z","lastTransitionTime":"2025-12-02T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.446321 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.446385 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.446402 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.446426 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.446444 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:44Z","lastTransitionTime":"2025-12-02T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.550994 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.551062 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.551074 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.551098 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.551113 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:44Z","lastTransitionTime":"2025-12-02T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.654967 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.655079 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.655106 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.655149 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.655173 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:44Z","lastTransitionTime":"2025-12-02T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.758459 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.758505 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.758524 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.758542 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.758554 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:44Z","lastTransitionTime":"2025-12-02T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.861965 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.862013 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.862022 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.862036 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.862047 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:44Z","lastTransitionTime":"2025-12-02T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.937145 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:44 crc kubenswrapper[4878]: E1202 18:15:44.937444 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.965020 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.965088 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.965098 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.965115 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:44 crc kubenswrapper[4878]: I1202 18:15:44.965138 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:44Z","lastTransitionTime":"2025-12-02T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.068292 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.068334 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.068344 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.068358 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.068369 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:45Z","lastTransitionTime":"2025-12-02T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.171780 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.171858 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.171880 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.171904 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.171923 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:45Z","lastTransitionTime":"2025-12-02T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.208658 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs\") pod \"network-metrics-daemon-dlwt8\" (UID: \"09adc15b-14dd-4a05-b569-4168b9ced169\") " pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:45 crc kubenswrapper[4878]: E1202 18:15:45.208897 4878 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 18:15:45 crc kubenswrapper[4878]: E1202 18:15:45.208989 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs podName:09adc15b-14dd-4a05-b569-4168b9ced169 nodeName:}" failed. No retries permitted until 2025-12-02 18:16:01.208964177 +0000 UTC m=+70.898583268 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs") pod "network-metrics-daemon-dlwt8" (UID: "09adc15b-14dd-4a05-b569-4168b9ced169") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.275492 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.275590 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.275613 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.275647 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.275670 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:45Z","lastTransitionTime":"2025-12-02T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.378409 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.378474 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.378485 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.378506 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.378522 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:45Z","lastTransitionTime":"2025-12-02T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.481844 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.481898 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.481914 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.481937 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.481954 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:45Z","lastTransitionTime":"2025-12-02T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.584765 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.584826 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.584837 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.584856 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.584867 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:45Z","lastTransitionTime":"2025-12-02T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.687711 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.687773 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.687784 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.687805 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.687818 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:45Z","lastTransitionTime":"2025-12-02T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.713445 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:45 crc kubenswrapper[4878]: E1202 18:15:45.713625 4878 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 18:15:45 crc kubenswrapper[4878]: E1202 18:15:45.713691 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 18:16:17.713673938 +0000 UTC m=+87.403292819 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.790060 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.790116 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.790128 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.790149 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.790162 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:45Z","lastTransitionTime":"2025-12-02T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.814661 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.814776 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.814814 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.814856 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:45 crc kubenswrapper[4878]: E1202 18:15:45.814942 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-12-02 18:16:17.814923678 +0000 UTC m=+87.504542559 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:15:45 crc kubenswrapper[4878]: E1202 18:15:45.815003 4878 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 18:15:45 crc kubenswrapper[4878]: E1202 18:15:45.815053 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 18:15:45 crc kubenswrapper[4878]: E1202 18:15:45.815092 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 18:15:45 crc kubenswrapper[4878]: E1202 18:15:45.815053 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 18:15:45 crc kubenswrapper[4878]: E1202 18:15:45.815115 4878 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:45 crc kubenswrapper[4878]: E1202 18:15:45.815142 4878 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 18:15:45 crc kubenswrapper[4878]: E1202 18:15:45.815170 4878 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:45 crc kubenswrapper[4878]: E1202 18:15:45.815070 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 18:16:17.815057962 +0000 UTC m=+87.504676843 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 18:15:45 crc kubenswrapper[4878]: E1202 18:15:45.815223 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 18:16:17.815198157 +0000 UTC m=+87.504817078 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:45 crc kubenswrapper[4878]: E1202 18:15:45.815313 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 18:16:17.81529511 +0000 UTC m=+87.504914031 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.893270 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.893339 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.893357 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.893384 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.893403 4878 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:45Z","lastTransitionTime":"2025-12-02T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.937479 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.937532 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.937489 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:45 crc kubenswrapper[4878]: E1202 18:15:45.937670 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:45 crc kubenswrapper[4878]: E1202 18:15:45.937885 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:45 crc kubenswrapper[4878]: E1202 18:15:45.938009 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.997010 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.997056 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.997068 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.997086 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:45 crc kubenswrapper[4878]: I1202 18:15:45.997097 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:45Z","lastTransitionTime":"2025-12-02T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.100502 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.100573 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.100591 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.100625 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.100641 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:46Z","lastTransitionTime":"2025-12-02T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.203958 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.204019 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.204040 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.204066 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.204087 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:46Z","lastTransitionTime":"2025-12-02T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.308037 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.308090 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.308106 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.308132 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.308148 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:46Z","lastTransitionTime":"2025-12-02T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.411442 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.411494 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.411504 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.411526 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.411538 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:46Z","lastTransitionTime":"2025-12-02T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.514619 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.514695 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.514720 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.514748 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.514770 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:46Z","lastTransitionTime":"2025-12-02T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.618313 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.618387 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.618404 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.618434 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.618451 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:46Z","lastTransitionTime":"2025-12-02T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.721718 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.722227 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.722292 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.722324 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.722341 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:46Z","lastTransitionTime":"2025-12-02T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.826357 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.826440 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.826455 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.826487 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.826503 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:46Z","lastTransitionTime":"2025-12-02T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.928921 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.928971 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.928981 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.928996 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.929007 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:46Z","lastTransitionTime":"2025-12-02T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:46 crc kubenswrapper[4878]: I1202 18:15:46.937403 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:46 crc kubenswrapper[4878]: E1202 18:15:46.937635 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.031769 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.031820 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.031832 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.031852 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.031865 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:47Z","lastTransitionTime":"2025-12-02T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.136087 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.136142 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.136154 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.136174 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.136189 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:47Z","lastTransitionTime":"2025-12-02T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.239149 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.239196 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.239209 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.239229 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.239263 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:47Z","lastTransitionTime":"2025-12-02T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.341714 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.341780 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.341788 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.341803 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.341813 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:47Z","lastTransitionTime":"2025-12-02T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.444268 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.444620 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.444697 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.444768 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.444831 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:47Z","lastTransitionTime":"2025-12-02T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.547922 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.547963 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.547975 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.547993 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.548005 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:47Z","lastTransitionTime":"2025-12-02T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.652071 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.652122 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.652137 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.652156 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.652168 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:47Z","lastTransitionTime":"2025-12-02T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.755323 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.755402 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.755428 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.755460 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.755482 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:47Z","lastTransitionTime":"2025-12-02T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.858993 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.859092 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.859120 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.859155 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.859185 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:47Z","lastTransitionTime":"2025-12-02T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.937440 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.937540 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.937610 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:47 crc kubenswrapper[4878]: E1202 18:15:47.937633 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:47 crc kubenswrapper[4878]: E1202 18:15:47.937899 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:47 crc kubenswrapper[4878]: E1202 18:15:47.938006 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.962345 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.962406 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.962421 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.962445 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:47 crc kubenswrapper[4878]: I1202 18:15:47.962464 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:47Z","lastTransitionTime":"2025-12-02T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.065902 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.066016 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.066042 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.066072 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.066095 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:48Z","lastTransitionTime":"2025-12-02T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.169440 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.169506 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.169523 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.169552 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.169578 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:48Z","lastTransitionTime":"2025-12-02T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.272521 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.272576 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.272594 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.272618 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.272635 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:48Z","lastTransitionTime":"2025-12-02T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.374925 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.374964 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.374977 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.374993 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.375004 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:48Z","lastTransitionTime":"2025-12-02T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.477499 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.477545 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.477557 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.477574 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.477585 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:48Z","lastTransitionTime":"2025-12-02T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.580818 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.580851 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.580859 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.580874 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.580882 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:48Z","lastTransitionTime":"2025-12-02T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.685364 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.685454 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.685476 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.685507 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.685529 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:48Z","lastTransitionTime":"2025-12-02T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.787717 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.787756 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.787766 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.787780 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.787790 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:48Z","lastTransitionTime":"2025-12-02T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.891087 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.891130 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.891142 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.891162 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.891176 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:48Z","lastTransitionTime":"2025-12-02T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.937345 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:48 crc kubenswrapper[4878]: E1202 18:15:48.937552 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.994474 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.994515 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.994523 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.994540 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:48 crc kubenswrapper[4878]: I1202 18:15:48.994554 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:48Z","lastTransitionTime":"2025-12-02T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.097438 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.097535 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.097548 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.097570 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.097583 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:49Z","lastTransitionTime":"2025-12-02T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.200045 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.200101 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.200113 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.200140 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.200155 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:49Z","lastTransitionTime":"2025-12-02T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.302723 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.302770 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.302781 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.302802 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.302814 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:49Z","lastTransitionTime":"2025-12-02T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.405885 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.405944 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.405954 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.405975 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.405986 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:49Z","lastTransitionTime":"2025-12-02T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.509075 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.509168 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.509183 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.509210 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.509224 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:49Z","lastTransitionTime":"2025-12-02T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.611602 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.611658 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.611672 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.611693 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.611791 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:49Z","lastTransitionTime":"2025-12-02T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.714873 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.714924 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.714939 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.714955 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.715396 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:49Z","lastTransitionTime":"2025-12-02T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.818013 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.818079 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.818095 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.818120 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.818132 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:49Z","lastTransitionTime":"2025-12-02T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.922294 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.922346 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.922357 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.922381 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.922393 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:49Z","lastTransitionTime":"2025-12-02T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.937882 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.937906 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:49 crc kubenswrapper[4878]: I1202 18:15:49.937948 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:49 crc kubenswrapper[4878]: E1202 18:15:49.938073 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:49 crc kubenswrapper[4878]: E1202 18:15:49.938388 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:49 crc kubenswrapper[4878]: E1202 18:15:49.938554 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.026872 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.026914 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.026937 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.026962 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.026978 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:50Z","lastTransitionTime":"2025-12-02T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.130072 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.130174 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.130189 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.130207 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.130219 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:50Z","lastTransitionTime":"2025-12-02T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.233519 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.233593 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.233612 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.233637 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.233658 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:50Z","lastTransitionTime":"2025-12-02T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.336385 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.336461 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.336485 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.336515 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.336539 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:50Z","lastTransitionTime":"2025-12-02T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.448609 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.448684 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.448715 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.448745 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.448768 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:50Z","lastTransitionTime":"2025-12-02T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:50 crc kubenswrapper[4878]: E1202 18:15:50.472142 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:50Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.480764 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.480796 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.480805 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.480820 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.480831 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:50Z","lastTransitionTime":"2025-12-02T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:50 crc kubenswrapper[4878]: E1202 18:15:50.498378 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:50Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.502506 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.502544 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.502555 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.502570 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.502581 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:50Z","lastTransitionTime":"2025-12-02T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:50 crc kubenswrapper[4878]: E1202 18:15:50.515000 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:50Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.518769 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.518834 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.518851 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.518877 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.518898 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:50Z","lastTransitionTime":"2025-12-02T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:50 crc kubenswrapper[4878]: E1202 18:15:50.540393 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:50Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.545909 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.545949 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.545965 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.545984 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.545999 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:50Z","lastTransitionTime":"2025-12-02T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:50 crc kubenswrapper[4878]: E1202 18:15:50.560261 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:50Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:50 crc kubenswrapper[4878]: E1202 18:15:50.560382 4878 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.562105 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.562155 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.562170 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.562189 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.562204 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:50Z","lastTransitionTime":"2025-12-02T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.665851 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.665933 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.665958 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.665984 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.666003 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:50Z","lastTransitionTime":"2025-12-02T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.769503 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.769571 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.769592 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.769621 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.769643 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:50Z","lastTransitionTime":"2025-12-02T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.872613 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.872716 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.872742 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.872774 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.872798 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:50Z","lastTransitionTime":"2025-12-02T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.937307 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:50 crc kubenswrapper[4878]: E1202 18:15:50.937514 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.937639 4878 scope.go:117] "RemoveContainer" containerID="1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.951160 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
25-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:50Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.972694 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235
a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:50Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.974984 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.975038 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.975050 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.975071 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.975084 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:50Z","lastTransitionTime":"2025-12-02T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:50 crc kubenswrapper[4878]: I1202 18:15:50.989058 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:50Z 
is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.002686 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33175bd6-9016-4c27-a8e6-d96f75e9187c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://261ba12597a9d2c536a581ebb211573ff8a502bacefbeb7510d52075e6392b56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50cf37f029d8f77c6ef633b7badd2df76b31de67a881ee85c84bd1d4bc92e240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gndfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.016496 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.029970 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.042714 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.060680 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.078229 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.078280 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.078292 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:51 crc 
kubenswrapper[4878]: I1202 18:15:51.078309 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.078322 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:51Z","lastTransitionTime":"2025-12-02T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.078364 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dlwt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09adc15b-14dd-4a05-b569-4168b9ced169\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dlwt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc 
kubenswrapper[4878]: I1202 18:15:51.099158 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.119044 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.136846 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.153741 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.177656 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 18:15:27.078343 6344 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1202 18:15:27.078373 6344 services_controller.go:452] Built service openshift-kube-controller-manager/kube-controller-manager per-node LB for network=default: []services.LB{}\\\\nI1202 18:15:27.078382 6344 services_controller.go:453] Built service openshift-kube-controller-manager/kube-controller-manager template LB for network=def\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5jzn_openshift-ovn-kubernetes(d160cfa4-9e2a-429d-b760-0cac6d467b9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f01
84749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.181840 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.181876 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.181888 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.181908 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.181922 4878 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:51Z","lastTransitionTime":"2025-12-02T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.194444 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.213588 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec3
0733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.229900 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d236cea-cb62-4f00-aeec-1667a3de5118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c85ca57d995e9ffb93b2985064dfaacb2e40fbfd91a43facef1bf864b1d5d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180a0fe0f8e58dd223d1fb79b13e121175003f1658276dfb8182293dc67cfd93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14dd0228ee9bc7129dabf56c5d37c7ac0580486a2708da30d6e0bc61c255abfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.244612 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.285393 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.285441 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.285450 4878 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.285466 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.285476 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:51Z","lastTransitionTime":"2025-12-02T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.348675 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovnkube-controller/1.log" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.351831 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerStarted","Data":"1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5"} Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.352477 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.364741 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.381261 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6af
f51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.387351 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.387391 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.387405 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.387430 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.387442 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:51Z","lastTransitionTime":"2025-12-02T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.399516 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.415291 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33175bd6-9016-4c27-a8e6-d96f75e9187c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://261ba12597a9d2c536a581ebb211573ff8a502bacefbeb7510d52075e6392b56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50cf37f029d8f77c6ef633b7badd2df76b31de67a881ee85c84bd1d4bc92e240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gndfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc 
kubenswrapper[4878]: I1202 18:15:51.429073 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.443270 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.455052 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.464834 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.474801 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dlwt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09adc15b-14dd-4a05-b569-4168b9ced169\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dlwt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc 
kubenswrapper[4878]: I1202 18:15:51.488833 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.489932 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.489967 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.489977 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.489992 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.490002 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:51Z","lastTransitionTime":"2025-12-02T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.503107 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.527627 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.541637 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.561087 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 18:15:27.078343 6344 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1202 18:15:27.078373 6344 services_controller.go:452] Built service openshift-kube-controller-manager/kube-controller-manager per-node LB for network=default: []services.LB{}\\\\nI1202 18:15:27.078382 6344 services_controller.go:453] Built service openshift-kube-controller-manager/kube-controller-manager template LB for 
network=def\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.572070 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.586677 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T1
8:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.593218 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.593277 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.593288 4878 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.593308 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.593321 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:51Z","lastTransitionTime":"2025-12-02T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.603828 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d236cea-cb62-4f00-aeec-1667a3de5118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c85ca57d995e9ffb93b2985064dfaacb2e40fbfd91a43facef1bf864b1d5d08\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180a0fe0f8e58dd223d1fb79b13e121175003f1658276dfb8182293dc67cfd93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14dd0228ee9bc7129dabf56c5d37c7ac0580486a2708da30d6e0bc61c255abfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.623881 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:51Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.730130 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.730182 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.730196 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.730215 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.730249 4878 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:51Z","lastTransitionTime":"2025-12-02T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.834307 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.834364 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.834383 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.834408 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.834422 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:51Z","lastTransitionTime":"2025-12-02T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.937093 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.937182 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.937217 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.937293 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.937306 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.937321 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.937338 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:51Z","lastTransitionTime":"2025-12-02T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:51 crc kubenswrapper[4878]: E1202 18:15:51.937482 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:51 crc kubenswrapper[4878]: E1202 18:15:51.937300 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:51 crc kubenswrapper[4878]: I1202 18:15:51.937525 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:51 crc kubenswrapper[4878]: E1202 18:15:51.937607 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.040119 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.040187 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.040205 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.040259 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.040279 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:52Z","lastTransitionTime":"2025-12-02T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.142634 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.142689 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.142713 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.142734 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.142757 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:52Z","lastTransitionTime":"2025-12-02T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.246138 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.246205 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.246220 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.246262 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.246276 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:52Z","lastTransitionTime":"2025-12-02T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.350152 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.350226 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.350302 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.350331 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.350350 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:52Z","lastTransitionTime":"2025-12-02T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.453391 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.453472 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.453490 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.453515 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.453532 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:52Z","lastTransitionTime":"2025-12-02T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.557166 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.557222 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.557252 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.557273 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.557285 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:52Z","lastTransitionTime":"2025-12-02T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.660259 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.660321 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.660333 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.660355 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.660368 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:52Z","lastTransitionTime":"2025-12-02T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.763613 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.763665 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.763682 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.763704 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.763722 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:52Z","lastTransitionTime":"2025-12-02T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.867438 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.867511 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.867531 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.867563 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.867583 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:52Z","lastTransitionTime":"2025-12-02T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.941117 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:52 crc kubenswrapper[4878]: E1202 18:15:52.941316 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.970802 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.970847 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.970859 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.970878 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:52 crc kubenswrapper[4878]: I1202 18:15:52.970892 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:52Z","lastTransitionTime":"2025-12-02T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.073832 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.073872 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.073881 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.073896 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.073905 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:53Z","lastTransitionTime":"2025-12-02T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.175851 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.175879 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.175888 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.175901 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.175909 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:53Z","lastTransitionTime":"2025-12-02T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.278442 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.278502 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.278514 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.278531 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.278546 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:53Z","lastTransitionTime":"2025-12-02T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.360314 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovnkube-controller/2.log" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.361167 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovnkube-controller/1.log" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.364607 4878 generic.go:334] "Generic (PLEG): container finished" podID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerID="1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5" exitCode=1 Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.364697 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerDied","Data":"1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5"} Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.364766 4878 scope.go:117] "RemoveContainer" containerID="1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.365758 4878 scope.go:117] "RemoveContainer" containerID="1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5" Dec 02 18:15:53 crc kubenswrapper[4878]: E1202 18:15:53.365983 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-x5jzn_openshift-ovn-kubernetes(d160cfa4-9e2a-429d-b760-0cac6d467b9a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.381327 4878 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:53Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.381836 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.381895 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.381910 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.381932 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.381947 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:53Z","lastTransitionTime":"2025-12-02T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.397441 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:53Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.411312 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d236cea-cb62-4f00-aeec-1667a3de5118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c85ca57d995e9ffb93b2985064dfaacb2e40fbfd91a43facef1bf864b1d5d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180a0fe0f8e58dd223d1fb79b13e121175003f1658276dfb8182293dc67cfd93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14dd0228ee9bc7129dabf56c5d37c7ac0580486a2708da30d6e0bc61c255abfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:53Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.429053 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:53Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.443953 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:53Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.463921 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6af
f51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:53Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.479594 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:53Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.484711 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:53 crc 
kubenswrapper[4878]: I1202 18:15:53.484746 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.484755 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.484770 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.484780 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:53Z","lastTransitionTime":"2025-12-02T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.494655 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33175bd6-9016-4c27-a8e6-d96f75e9187c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://261ba12597a9d2c536a581ebb211573ff8a502bacefbeb7510d52075e6392b56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50cf37f029d8f77c6ef633b7badd2df76b31d
e67a881ee85c84bd1d4bc92e240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gndfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:53Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.511415 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:53Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.529396 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:15:53Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.545212 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:53Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.560120 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:53Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.572475 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dlwt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09adc15b-14dd-4a05-b569-4168b9ced169\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dlwt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:53Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:53 crc 
kubenswrapper[4878]: I1202 18:15:53.587499 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:53Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.591884 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.591922 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.591933 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.591949 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.591963 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:53Z","lastTransitionTime":"2025-12-02T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.604101 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:53Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.626278 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:53Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.639226 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:53Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.660090 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 18:15:27.078343 6344 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1202 18:15:27.078373 6344 services_controller.go:452] Built service openshift-kube-controller-manager/kube-controller-manager per-node LB for network=default: []services.LB{}\\\\nI1202 18:15:27.078382 6344 services_controller.go:453] Built service openshift-kube-controller-manager/kube-controller-manager template LB for network=def\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:52Z\\\",\\\"message\\\":\\\" 18:15:52.125384 6607 ovn.go:134] Ensuring zone local for Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1202 18:15:52.124969 6607 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}\\\\nI1202 18:15:52.125392 6607 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1202 18:15:52.125397 6607 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1202 18:15:52.125157 6607 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-fnpmk\\\\nI1202 18:15:52.125404 6607 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 1.341744ms\\\\nI1202 18:15:52.125443 6607 services_controller.go:356] Processing sync for service default/openshift for network=default\\\\nI1202 18:15:52.125456 6607 services_controller.go:360] Finished syncing service openshift on namespace default for network=default : 12.93µs\\\\nI1202 18:15:52.125465 6607 services_controller.go:356] Processing sync for service openshift-network-console/networking-console-plugin for 
n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:15:53Z is after 2025-08-24T17:21:41Z" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.694998 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.695130 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.695148 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.695575 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.695603 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:53Z","lastTransitionTime":"2025-12-02T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.798533 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.798572 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.798585 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.798602 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.798614 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:53Z","lastTransitionTime":"2025-12-02T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.902744 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.902869 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.902887 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.902914 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.902933 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:53Z","lastTransitionTime":"2025-12-02T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.937704 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.937708 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:53 crc kubenswrapper[4878]: E1202 18:15:53.937926 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:53 crc kubenswrapper[4878]: I1202 18:15:53.937960 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:53 crc kubenswrapper[4878]: E1202 18:15:53.938287 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:53 crc kubenswrapper[4878]: E1202 18:15:53.938402 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.005824 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.005875 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.005894 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.005913 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.005925 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:54Z","lastTransitionTime":"2025-12-02T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.108400 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.108482 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.108505 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.108535 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.108556 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:54Z","lastTransitionTime":"2025-12-02T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.211140 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.211199 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.211218 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.211290 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.211339 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:54Z","lastTransitionTime":"2025-12-02T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.313301 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.313346 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.313356 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.313369 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.313378 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:54Z","lastTransitionTime":"2025-12-02T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.370399 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovnkube-controller/2.log" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.416922 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.416967 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.416980 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.417001 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.417013 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:54Z","lastTransitionTime":"2025-12-02T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.520290 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.520352 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.520374 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.520402 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.520424 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:54Z","lastTransitionTime":"2025-12-02T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.623685 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.623720 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.623734 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.623750 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.623762 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:54Z","lastTransitionTime":"2025-12-02T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.726748 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.726786 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.726796 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.726811 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.726821 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:54Z","lastTransitionTime":"2025-12-02T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.830748 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.830780 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.830792 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.830810 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.830822 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:54Z","lastTransitionTime":"2025-12-02T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.933455 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.933498 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.933507 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.933519 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.933528 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:54Z","lastTransitionTime":"2025-12-02T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:54 crc kubenswrapper[4878]: I1202 18:15:54.936773 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:54 crc kubenswrapper[4878]: E1202 18:15:54.936921 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.036442 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.036506 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.036524 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.036549 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.036567 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:55Z","lastTransitionTime":"2025-12-02T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.139384 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.139430 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.139446 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.139470 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.139488 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:55Z","lastTransitionTime":"2025-12-02T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.242552 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.242683 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.242696 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.242717 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.242731 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:55Z","lastTransitionTime":"2025-12-02T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.344808 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.344859 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.344870 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.344888 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.344900 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:55Z","lastTransitionTime":"2025-12-02T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.446923 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.446965 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.446978 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.446996 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.447007 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:55Z","lastTransitionTime":"2025-12-02T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.550347 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.550415 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.550440 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.550471 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.550493 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:55Z","lastTransitionTime":"2025-12-02T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.653744 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.654094 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.654310 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.654449 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.654572 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:55Z","lastTransitionTime":"2025-12-02T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.758117 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.758177 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.758195 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.758296 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.758336 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:55Z","lastTransitionTime":"2025-12-02T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.861491 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.861527 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.861536 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.861553 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.861563 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:55Z","lastTransitionTime":"2025-12-02T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.937457 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:55 crc kubenswrapper[4878]: E1202 18:15:55.937610 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.937755 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:55 crc kubenswrapper[4878]: E1202 18:15:55.937797 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.937896 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:55 crc kubenswrapper[4878]: E1202 18:15:55.937941 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.963455 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.963520 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.963543 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.963567 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:55 crc kubenswrapper[4878]: I1202 18:15:55.963590 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:55Z","lastTransitionTime":"2025-12-02T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.067529 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.067606 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.067619 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.067641 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.067654 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:56Z","lastTransitionTime":"2025-12-02T18:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.169971 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.169999 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.170013 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.170030 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.170044 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:56Z","lastTransitionTime":"2025-12-02T18:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.272617 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.272714 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.272752 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.272783 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.272806 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:56Z","lastTransitionTime":"2025-12-02T18:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.375574 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.375621 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.375630 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.375650 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.375661 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:56Z","lastTransitionTime":"2025-12-02T18:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.478166 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.478205 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.478214 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.478229 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.478258 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:56Z","lastTransitionTime":"2025-12-02T18:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.580844 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.580929 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.580943 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.580963 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.580975 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:56Z","lastTransitionTime":"2025-12-02T18:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.683999 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.684050 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.684070 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.684096 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.684117 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:56Z","lastTransitionTime":"2025-12-02T18:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.787052 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.787102 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.787114 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.787134 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.787146 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:56Z","lastTransitionTime":"2025-12-02T18:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.889794 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.889843 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.889852 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.889871 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.889883 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:56Z","lastTransitionTime":"2025-12-02T18:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.937586 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:56 crc kubenswrapper[4878]: E1202 18:15:56.937796 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.992147 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.992185 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.992193 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.992206 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:56 crc kubenswrapper[4878]: I1202 18:15:56.992225 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:56Z","lastTransitionTime":"2025-12-02T18:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.094871 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.094921 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.094932 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.094949 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.094963 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:57Z","lastTransitionTime":"2025-12-02T18:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.197400 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.197491 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.197501 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.197515 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.197525 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:57Z","lastTransitionTime":"2025-12-02T18:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.300567 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.300627 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.300637 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.300658 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.300675 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:57Z","lastTransitionTime":"2025-12-02T18:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.402850 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.402899 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.402912 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.402930 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.402943 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:57Z","lastTransitionTime":"2025-12-02T18:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.505953 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.506001 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.506012 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.506031 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.506041 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:57Z","lastTransitionTime":"2025-12-02T18:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.608817 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.608863 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.608876 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.608894 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.608905 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:57Z","lastTransitionTime":"2025-12-02T18:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.711489 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.711531 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.711540 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.711560 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.711570 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:57Z","lastTransitionTime":"2025-12-02T18:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.814020 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.814055 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.814067 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.814083 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.814094 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:57Z","lastTransitionTime":"2025-12-02T18:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.916781 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.916825 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.916835 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.916849 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.916858 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:57Z","lastTransitionTime":"2025-12-02T18:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.937291 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.937318 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:57 crc kubenswrapper[4878]: E1202 18:15:57.937679 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:57 crc kubenswrapper[4878]: E1202 18:15:57.937531 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:57 crc kubenswrapper[4878]: I1202 18:15:57.937380 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:57 crc kubenswrapper[4878]: E1202 18:15:57.937907 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.019282 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.019371 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.019389 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.019410 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.019425 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:58Z","lastTransitionTime":"2025-12-02T18:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.122436 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.122731 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.122798 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.122963 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.123029 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:58Z","lastTransitionTime":"2025-12-02T18:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.226494 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.226558 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.226571 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.226591 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.226609 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:58Z","lastTransitionTime":"2025-12-02T18:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.329514 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.329562 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.329573 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.329589 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.329599 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:58Z","lastTransitionTime":"2025-12-02T18:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.432205 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.432289 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.432301 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.432318 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.432328 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:58Z","lastTransitionTime":"2025-12-02T18:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.535084 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.535368 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.535469 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.535541 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.535598 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:58Z","lastTransitionTime":"2025-12-02T18:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.638282 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.638338 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.638351 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.638383 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.638400 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:58Z","lastTransitionTime":"2025-12-02T18:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.740702 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.740744 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.740757 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.740778 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.740797 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:58Z","lastTransitionTime":"2025-12-02T18:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.843223 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.843279 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.843288 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.843305 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.843314 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:58Z","lastTransitionTime":"2025-12-02T18:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.937771 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:15:58 crc kubenswrapper[4878]: E1202 18:15:58.938275 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.945586 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.945643 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.945653 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.945672 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:58 crc kubenswrapper[4878]: I1202 18:15:58.945682 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:58Z","lastTransitionTime":"2025-12-02T18:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.048602 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.048876 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.048981 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.049144 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.049206 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:59Z","lastTransitionTime":"2025-12-02T18:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.152260 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.152328 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.152342 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.152364 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.152381 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:59Z","lastTransitionTime":"2025-12-02T18:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.256458 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.257451 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.257674 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.257862 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.258010 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:59Z","lastTransitionTime":"2025-12-02T18:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.361108 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.361173 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.361184 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.361204 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.361216 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:59Z","lastTransitionTime":"2025-12-02T18:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.465602 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.465682 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.465696 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.465721 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.465738 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:59Z","lastTransitionTime":"2025-12-02T18:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.569032 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.569095 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.569111 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.569136 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.569154 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:59Z","lastTransitionTime":"2025-12-02T18:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.671589 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.671643 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.671654 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.671671 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.671685 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:59Z","lastTransitionTime":"2025-12-02T18:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.774823 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.775129 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.775224 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.775342 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.775483 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:59Z","lastTransitionTime":"2025-12-02T18:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.878498 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.878560 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.878587 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.878614 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.878630 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:59Z","lastTransitionTime":"2025-12-02T18:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.937483 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.937554 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.937513 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:15:59 crc kubenswrapper[4878]: E1202 18:15:59.937640 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:15:59 crc kubenswrapper[4878]: E1202 18:15:59.937730 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:15:59 crc kubenswrapper[4878]: E1202 18:15:59.937799 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.981654 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.981693 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.981705 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.981722 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:15:59 crc kubenswrapper[4878]: I1202 18:15:59.981733 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:15:59Z","lastTransitionTime":"2025-12-02T18:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.083692 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.083740 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.083754 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.083774 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.083787 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:00Z","lastTransitionTime":"2025-12-02T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.187134 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.187171 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.187181 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.187200 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.187213 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:00Z","lastTransitionTime":"2025-12-02T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.290424 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.291164 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.291334 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.291489 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.291609 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:00Z","lastTransitionTime":"2025-12-02T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.394166 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.394221 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.394232 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.394277 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.394289 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:00Z","lastTransitionTime":"2025-12-02T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.496502 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.496553 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.496564 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.496583 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.496595 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:00Z","lastTransitionTime":"2025-12-02T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.599450 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.599493 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.599505 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.599522 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.599535 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:00Z","lastTransitionTime":"2025-12-02T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.701962 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.702019 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.702036 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.702063 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.702079 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:00Z","lastTransitionTime":"2025-12-02T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.731681 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.731724 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.731735 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.731752 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.731766 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:00Z","lastTransitionTime":"2025-12-02T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:00 crc kubenswrapper[4878]: E1202 18:16:00.748377 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:00Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.752120 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.752281 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.752361 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.752443 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.752519 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:00Z","lastTransitionTime":"2025-12-02T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:00 crc kubenswrapper[4878]: E1202 18:16:00.771496 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:00Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.778010 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.778071 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.778092 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.778120 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.778139 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:00Z","lastTransitionTime":"2025-12-02T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:00 crc kubenswrapper[4878]: E1202 18:16:00.799157 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:00Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.804220 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.804327 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.804341 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.804367 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.804382 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:00Z","lastTransitionTime":"2025-12-02T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:00 crc kubenswrapper[4878]: E1202 18:16:00.818741 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:00Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.824647 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.824681 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.824695 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.824716 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.824731 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:00Z","lastTransitionTime":"2025-12-02T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:00 crc kubenswrapper[4878]: E1202 18:16:00.836623 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:00Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:00 crc kubenswrapper[4878]: E1202 18:16:00.836816 4878 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.839170 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.839204 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.839215 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.839235 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.839270 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:00Z","lastTransitionTime":"2025-12-02T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.937702 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:00 crc kubenswrapper[4878]: E1202 18:16:00.937871 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.941795 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.941835 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.941849 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.941866 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.941879 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:00Z","lastTransitionTime":"2025-12-02T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.954881 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33175bd6-9016-4c27-a8e6-d96f75e9187c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://261ba12597a9d2c536a581ebb211573ff8a502bacefbeb7510d52075e6392b56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50cf37f029d8f77c6ef633b7badd2df76b31de67a881ee85c84bd1d4bc92e240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gndfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:00Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.970810 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:00Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.983647 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:16:00Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:00 crc kubenswrapper[4878]: I1202 18:16:00.997073 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:00Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.008293 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:01Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.018418 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:01Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.035299 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6af
f51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:01Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.045079 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.045114 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.045122 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.045140 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.045151 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:01Z","lastTransitionTime":"2025-12-02T18:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.048726 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:01Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.060667 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dlwt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09adc15b-14dd-4a05-b569-4168b9ced169\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dlwt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:01Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:01 crc 
kubenswrapper[4878]: I1202 18:16:01.076911 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:01Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.088798 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:01Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.106530 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:01Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.117605 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:01Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.136639 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 18:15:27.078343 6344 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1202 18:15:27.078373 6344 services_controller.go:452] Built service openshift-kube-controller-manager/kube-controller-manager per-node LB for network=default: []services.LB{}\\\\nI1202 18:15:27.078382 6344 services_controller.go:453] Built service openshift-kube-controller-manager/kube-controller-manager template LB for network=def\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:52Z\\\",\\\"message\\\":\\\" 18:15:52.125384 6607 ovn.go:134] Ensuring zone local for Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1202 18:15:52.124969 6607 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}\\\\nI1202 18:15:52.125392 6607 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1202 18:15:52.125397 6607 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1202 18:15:52.125157 6607 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-fnpmk\\\\nI1202 18:15:52.125404 6607 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 1.341744ms\\\\nI1202 18:15:52.125443 6607 services_controller.go:356] Processing sync for service default/openshift for network=default\\\\nI1202 18:15:52.125456 6607 services_controller.go:360] Finished syncing service openshift on namespace default for network=default : 12.93µs\\\\nI1202 18:15:52.125465 6607 services_controller.go:356] Processing sync for service openshift-network-console/networking-console-plugin for 
n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:01Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.147730 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.147769 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.147781 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.147797 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.147811 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:01Z","lastTransitionTime":"2025-12-02T18:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.154083 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b
8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:01Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.166855 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d236cea-cb62-4f00-aeec-1667a3de5118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c85ca57d995e9ffb93b2985064dfaacb2e40fbfd91a43facef1bf864b1d5d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180a0fe0f8e58dd223d1fb79b13e121175003f1658276dfb8182293dc67cfd93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14dd0228ee9bc7129dabf56c5d37c7ac0580486a2708da30d6e0bc61c255abfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:01Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.180622 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:01Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.193114 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:01Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.250591 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.250633 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.250642 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.250657 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.250667 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:01Z","lastTransitionTime":"2025-12-02T18:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.304225 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs\") pod \"network-metrics-daemon-dlwt8\" (UID: \"09adc15b-14dd-4a05-b569-4168b9ced169\") " pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:01 crc kubenswrapper[4878]: E1202 18:16:01.304399 4878 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 18:16:01 crc kubenswrapper[4878]: E1202 18:16:01.304474 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs podName:09adc15b-14dd-4a05-b569-4168b9ced169 nodeName:}" failed. No retries permitted until 2025-12-02 18:16:33.304455607 +0000 UTC m=+102.994074488 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs") pod "network-metrics-daemon-dlwt8" (UID: "09adc15b-14dd-4a05-b569-4168b9ced169") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.353614 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.353661 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.353674 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.353692 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.353707 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:01Z","lastTransitionTime":"2025-12-02T18:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.456221 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.456289 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.456302 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.456318 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.456332 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:01Z","lastTransitionTime":"2025-12-02T18:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.559858 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.559887 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.559897 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.559911 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.559921 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:01Z","lastTransitionTime":"2025-12-02T18:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.662936 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.663019 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.663049 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.663085 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.663111 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:01Z","lastTransitionTime":"2025-12-02T18:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.765838 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.765880 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.765889 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.765909 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.765919 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:01Z","lastTransitionTime":"2025-12-02T18:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.868294 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.868405 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.868430 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.868454 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.868472 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:01Z","lastTransitionTime":"2025-12-02T18:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.937334 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.937334 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:01 crc kubenswrapper[4878]: E1202 18:16:01.937527 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:01 crc kubenswrapper[4878]: E1202 18:16:01.937608 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.937356 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:01 crc kubenswrapper[4878]: E1202 18:16:01.937811 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.971083 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.971112 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.971125 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.971141 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:01 crc kubenswrapper[4878]: I1202 18:16:01.971156 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:01Z","lastTransitionTime":"2025-12-02T18:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.074755 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.074804 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.074821 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.074844 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.074861 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:02Z","lastTransitionTime":"2025-12-02T18:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.177874 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.177912 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.177925 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.177941 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.177952 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:02Z","lastTransitionTime":"2025-12-02T18:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.280161 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.280220 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.280273 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.280304 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.280325 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:02Z","lastTransitionTime":"2025-12-02T18:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.384377 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.384469 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.384487 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.384516 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.384539 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:02Z","lastTransitionTime":"2025-12-02T18:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.486890 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.486943 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.486952 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.486971 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.486985 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:02Z","lastTransitionTime":"2025-12-02T18:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.590016 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.590090 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.590111 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.590139 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.590162 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:02Z","lastTransitionTime":"2025-12-02T18:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.693560 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.693646 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.693668 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.693705 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.693727 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:02Z","lastTransitionTime":"2025-12-02T18:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.796730 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.796791 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.796808 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.796833 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.796850 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:02Z","lastTransitionTime":"2025-12-02T18:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.900008 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.900067 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.900086 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.900110 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.900126 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:02Z","lastTransitionTime":"2025-12-02T18:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:02 crc kubenswrapper[4878]: I1202 18:16:02.937433 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:02 crc kubenswrapper[4878]: E1202 18:16:02.937682 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.002041 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.002092 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.002103 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.002121 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.002134 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:03Z","lastTransitionTime":"2025-12-02T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.105054 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.105386 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.105411 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.105437 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.105458 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:03Z","lastTransitionTime":"2025-12-02T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.243651 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.243719 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.244084 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.244125 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.244145 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:03Z","lastTransitionTime":"2025-12-02T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.346484 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.346520 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.346528 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.346544 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.346553 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:03Z","lastTransitionTime":"2025-12-02T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.449195 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.449354 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.449375 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.449406 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.449428 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:03Z","lastTransitionTime":"2025-12-02T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.552444 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.552488 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.552498 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.552515 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.552528 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:03Z","lastTransitionTime":"2025-12-02T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.655990 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.656048 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.656060 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.656087 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.656101 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:03Z","lastTransitionTime":"2025-12-02T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.759015 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.759079 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.759094 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.759120 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.759139 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:03Z","lastTransitionTime":"2025-12-02T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.862733 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.862784 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.862797 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.862823 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.862835 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:03Z","lastTransitionTime":"2025-12-02T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.937115 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.937212 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:03 crc kubenswrapper[4878]: E1202 18:16:03.937364 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.937420 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:03 crc kubenswrapper[4878]: E1202 18:16:03.937615 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:03 crc kubenswrapper[4878]: E1202 18:16:03.937855 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.965902 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.965972 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.965987 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.966011 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:03 crc kubenswrapper[4878]: I1202 18:16:03.966027 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:03Z","lastTransitionTime":"2025-12-02T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.068947 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.068993 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.069004 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.069036 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.069045 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:04Z","lastTransitionTime":"2025-12-02T18:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.172179 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.172252 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.172282 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.172299 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.172313 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:04Z","lastTransitionTime":"2025-12-02T18:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.275615 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.275667 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.275678 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.275696 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.275707 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:04Z","lastTransitionTime":"2025-12-02T18:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.379739 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.379779 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.379790 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.379807 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.379816 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:04Z","lastTransitionTime":"2025-12-02T18:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.406842 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6cm9t_e79a8cec-20ba-4862-ba25-7de014466668/kube-multus/0.log" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.406937 4878 generic.go:334] "Generic (PLEG): container finished" podID="e79a8cec-20ba-4862-ba25-7de014466668" containerID="e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7" exitCode=1 Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.406990 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6cm9t" event={"ID":"e79a8cec-20ba-4862-ba25-7de014466668","Type":"ContainerDied","Data":"e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7"} Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.407751 4878 scope.go:117] "RemoveContainer" containerID="e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.435812 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec3
0733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:04Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.452406 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d236cea-cb62-4f00-aeec-1667a3de5118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c85ca57d995e9ffb93b2985064dfaacb2e40fbfd91a43facef1bf864b1d5d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180a0fe0f8e58dd223d1fb79b13e121175003f1658276dfb8182293dc67cfd93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14dd0228ee9bc7129dabf56c5d37c7ac0580486a2708da30d6e0bc61c255abfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:04Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.465526 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:04Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.478511 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:04Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.482360 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.482397 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.482411 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.482432 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.482445 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:04Z","lastTransitionTime":"2025-12-02T18:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.494520 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:16:03Z\\\",\\\"message\\\":\\\"2025-12-02T18:15:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61c0c50e-aff8-4323-bd04-2d4a00a9593b\\\\n2025-12-02T18:15:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61c0c50e-aff8-4323-bd04-2d4a00a9593b to /host/opt/cni/bin/\\\\n2025-12-02T18:15:18Z [verbose] multus-daemon started\\\\n2025-12-02T18:15:18Z [verbose] Readiness Indicator file check\\\\n2025-12-02T18:16:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:04Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.508358 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33175bd6-9016-4c27-a8e6-d96f75e9187c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://261ba12597a9d2c536a581ebb211573ff8a502bacefbeb7510d52075e6392b56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50cf37f029d8f77c6ef633b7badd2df76b31de67a881ee85c84bd1d4bc92e240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gndfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:04Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.525680 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:04Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.539839 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:16:04Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.557343 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:04Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.569267 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:04Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.580344 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:04Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.584759 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.584815 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.584828 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.584850 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.584861 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:04Z","lastTransitionTime":"2025-12-02T18:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.594521 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:04Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.606217 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dlwt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09adc15b-14dd-4a05-b569-4168b9ced169\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dlwt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:04Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:04 crc 
kubenswrapper[4878]: I1202 18:16:04.618125 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:04Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.629064 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:04Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.650678 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:04Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.664975 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:04Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.682549 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 18:15:27.078343 6344 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1202 18:15:27.078373 6344 services_controller.go:452] Built service openshift-kube-controller-manager/kube-controller-manager per-node LB for network=default: []services.LB{}\\\\nI1202 18:15:27.078382 6344 services_controller.go:453] Built service openshift-kube-controller-manager/kube-controller-manager template LB for network=def\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:52Z\\\",\\\"message\\\":\\\" 18:15:52.125384 6607 ovn.go:134] Ensuring zone local for Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1202 18:15:52.124969 6607 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}\\\\nI1202 18:15:52.125392 6607 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1202 18:15:52.125397 6607 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1202 18:15:52.125157 6607 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-fnpmk\\\\nI1202 18:15:52.125404 6607 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 1.341744ms\\\\nI1202 18:15:52.125443 6607 services_controller.go:356] Processing sync for service default/openshift for network=default\\\\nI1202 18:15:52.125456 6607 services_controller.go:360] Finished syncing service openshift on namespace default for network=default : 12.93µs\\\\nI1202 18:15:52.125465 6607 services_controller.go:356] Processing sync for service openshift-network-console/networking-console-plugin for 
n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:04Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.687104 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.687125 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.687136 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.687171 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.687183 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:04Z","lastTransitionTime":"2025-12-02T18:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.791460 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.791550 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.791570 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.791617 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.791638 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:04Z","lastTransitionTime":"2025-12-02T18:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.895583 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.895641 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.895651 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.895683 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.895697 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:04Z","lastTransitionTime":"2025-12-02T18:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.938032 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:04 crc kubenswrapper[4878]: E1202 18:16:04.938278 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.999471 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.999526 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.999539 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:04 crc kubenswrapper[4878]: I1202 18:16:04.999564 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:04.999578 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:04Z","lastTransitionTime":"2025-12-02T18:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.103217 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.103310 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.103337 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.103363 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.103379 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:05Z","lastTransitionTime":"2025-12-02T18:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.205753 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.205842 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.205854 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.205870 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.205879 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:05Z","lastTransitionTime":"2025-12-02T18:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.308907 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.308976 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.308989 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.309013 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.309027 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:05Z","lastTransitionTime":"2025-12-02T18:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.410961 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.411004 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.411014 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.411032 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.411044 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:05Z","lastTransitionTime":"2025-12-02T18:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.413477 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6cm9t_e79a8cec-20ba-4862-ba25-7de014466668/kube-multus/0.log" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.413546 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6cm9t" event={"ID":"e79a8cec-20ba-4862-ba25-7de014466668","Type":"ContainerStarted","Data":"e529e02dd0b8f5c8da36f38bbb8a040fb48a28444da477652706b9d07793878c"} Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.430797 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:05Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.446014 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:05Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.467013 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:05Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.482932 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:05Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.506536 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e3634e1c0b01425ea940aa0a929055f87791cb9db1139effbd451f4f13f1289\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 18:15:27.078343 6344 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1202 18:15:27.078373 6344 services_controller.go:452] Built service openshift-kube-controller-manager/kube-controller-manager per-node LB for network=default: []services.LB{}\\\\nI1202 18:15:27.078382 6344 services_controller.go:453] Built service openshift-kube-controller-manager/kube-controller-manager template LB for network=def\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:52Z\\\",\\\"message\\\":\\\" 18:15:52.125384 6607 ovn.go:134] Ensuring zone local for Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1202 18:15:52.124969 6607 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}\\\\nI1202 18:15:52.125392 6607 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1202 18:15:52.125397 6607 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1202 18:15:52.125157 6607 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-fnpmk\\\\nI1202 18:15:52.125404 6607 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 1.341744ms\\\\nI1202 18:15:52.125443 6607 services_controller.go:356] Processing sync for service default/openshift for network=default\\\\nI1202 18:15:52.125456 6607 services_controller.go:360] Finished syncing service openshift on namespace default for network=default : 12.93µs\\\\nI1202 18:15:52.125465 6607 services_controller.go:356] Processing sync for service openshift-network-console/networking-console-plugin for 
n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:05Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.514196 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.514292 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.514311 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.514656 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.514679 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:05Z","lastTransitionTime":"2025-12-02T18:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.523903 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b
8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:05Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.539587 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d236cea-cb62-4f00-aeec-1667a3de5118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c85ca57d995e9ffb93b2985064dfaacb2e40fbfd91a43facef1bf864b1d5d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180a0fe0f8e58dd223d1fb79b13e121175003f1658276dfb8182293dc67cfd93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14dd0228ee9bc7129dabf56c5d37c7ac0580486a2708da30d6e0bc61c255abfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:05Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.551295 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:05Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.560098 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:05Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.617568 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.617599 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.617609 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.617627 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.617638 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:05Z","lastTransitionTime":"2025-12-02T18:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.619344 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd5151
43612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:05Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.639031 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:16:05Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.652339 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:05Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.665216 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:05Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.677963 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:05Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.692608 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6af
f51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:05Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.708020 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e529e02dd0b8f5c8da36f38bbb8a040fb48a28444da477652706b9d07793878c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:16:03Z\\\",\\\"message\\\":\\\"2025-12-02T18:15:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61c0c50e-aff8-4323-bd04-2d4a00a9593b\\\\n2025-12-02T18:15:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61c0c50e-aff8-4323-bd04-2d4a00a9593b to /host/opt/cni/bin/\\\\n2025-12-02T18:15:18Z [verbose] multus-daemon started\\\\n2025-12-02T18:15:18Z [verbose] 
Readiness Indicator file check\\\\n2025-12-02T18:16:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:16:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:05Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.719650 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33175bd6-9016-4c27-a8e6-d96f75e9187c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://261ba12597a9d2c536a581ebb211573ff8a502bacefbeb7510d52075e6392b56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50cf37f029d8f77c6ef633b7badd2df76b31d
e67a881ee85c84bd1d4bc92e240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gndfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:05Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.720085 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.720113 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.720121 4878 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.720136 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.720146 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:05Z","lastTransitionTime":"2025-12-02T18:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.732412 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dlwt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09adc15b-14dd-4a05-b569-4168b9ced169\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dlwt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:05Z is after 2025-08-24T17:21:41Z" Dec 
02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.822858 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.822905 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.822915 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.822933 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.822945 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:05Z","lastTransitionTime":"2025-12-02T18:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.925872 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.925906 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.925918 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.925938 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.925950 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:05Z","lastTransitionTime":"2025-12-02T18:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.936835 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:05 crc kubenswrapper[4878]: E1202 18:16:05.936990 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.937231 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:05 crc kubenswrapper[4878]: E1202 18:16:05.937368 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:05 crc kubenswrapper[4878]: I1202 18:16:05.937530 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:05 crc kubenswrapper[4878]: E1202 18:16:05.937599 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.028914 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.028957 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.028971 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.028989 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.029001 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:06Z","lastTransitionTime":"2025-12-02T18:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.131208 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.131267 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.131279 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.131300 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.131313 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:06Z","lastTransitionTime":"2025-12-02T18:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.235916 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.235950 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.235959 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.235971 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.235980 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:06Z","lastTransitionTime":"2025-12-02T18:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.338830 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.338867 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.338878 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.338897 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.338909 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:06Z","lastTransitionTime":"2025-12-02T18:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.442228 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.442362 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.442387 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.442413 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.442431 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:06Z","lastTransitionTime":"2025-12-02T18:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.545835 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.545897 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.545915 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.545946 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.545968 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:06Z","lastTransitionTime":"2025-12-02T18:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.649949 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.650012 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.650032 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.650056 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.650074 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:06Z","lastTransitionTime":"2025-12-02T18:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.753076 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.753134 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.753150 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.753172 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.753189 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:06Z","lastTransitionTime":"2025-12-02T18:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.856430 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.856493 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.856509 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.856531 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.856546 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:06Z","lastTransitionTime":"2025-12-02T18:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.936931 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:06 crc kubenswrapper[4878]: E1202 18:16:06.937135 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.959717 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.959756 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.959768 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.959782 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:06 crc kubenswrapper[4878]: I1202 18:16:06.959795 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:06Z","lastTransitionTime":"2025-12-02T18:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.063443 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.063503 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.063519 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.063542 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.063558 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:07Z","lastTransitionTime":"2025-12-02T18:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.167526 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.167598 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.167619 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.167641 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.167658 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:07Z","lastTransitionTime":"2025-12-02T18:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.271086 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.271176 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.271197 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.271223 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.271293 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:07Z","lastTransitionTime":"2025-12-02T18:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.374619 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.374677 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.374695 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.374720 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.374738 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:07Z","lastTransitionTime":"2025-12-02T18:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.477766 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.477830 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.477850 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.477880 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.477901 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:07Z","lastTransitionTime":"2025-12-02T18:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.581418 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.581486 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.581508 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.581536 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.581558 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:07Z","lastTransitionTime":"2025-12-02T18:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.684110 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.684163 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.684180 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.684207 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.684224 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:07Z","lastTransitionTime":"2025-12-02T18:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.787208 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.787307 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.787325 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.787353 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.787370 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:07Z","lastTransitionTime":"2025-12-02T18:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.890052 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.890103 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.890120 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.890142 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.890158 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:07Z","lastTransitionTime":"2025-12-02T18:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.937644 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.937753 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:07 crc kubenswrapper[4878]: E1202 18:16:07.937842 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.937867 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:07 crc kubenswrapper[4878]: E1202 18:16:07.938053 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:07 crc kubenswrapper[4878]: E1202 18:16:07.938717 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.941802 4878 scope.go:117] "RemoveContainer" containerID="1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5" Dec 02 18:16:07 crc kubenswrapper[4878]: E1202 18:16:07.943656 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-x5jzn_openshift-ovn-kubernetes(d160cfa4-9e2a-429d-b760-0cac6d467b9a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.960097 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:07Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:07 crc kubenswrapper[4878]: I1202 18:16:07.985780 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:07Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.006377 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.006425 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.006437 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.006454 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.006464 4878 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:08Z","lastTransitionTime":"2025-12-02T18:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.007630 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:08Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.025095 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:08Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.050069 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:52Z\\\",\\\"message\\\":\\\" 18:15:52.125384 6607 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1202 18:15:52.124969 6607 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}\\\\nI1202 18:15:52.125392 6607 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1202 18:15:52.125397 6607 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1202 18:15:52.125157 6607 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-fnpmk\\\\nI1202 18:15:52.125404 6607 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 1.341744ms\\\\nI1202 18:15:52.125443 6607 services_controller.go:356] Processing sync for service default/openshift for network=default\\\\nI1202 18:15:52.125456 6607 services_controller.go:360] Finished syncing service openshift on namespace default for network=default : 12.93µs\\\\nI1202 18:15:52.125465 6607 services_controller.go:356] Processing sync for service openshift-network-console/networking-console-plugin for n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5jzn_openshift-ovn-kubernetes(d160cfa4-9e2a-429d-b760-0cac6d467b9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f01
84749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:08Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.066449 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:08Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.082473 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:08Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.104888 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T1
8:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:08Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.109389 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.109420 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.109433 4878 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.109450 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.109462 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:08Z","lastTransitionTime":"2025-12-02T18:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.117973 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d236cea-cb62-4f00-aeec-1667a3de5118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c85ca57d995e9ffb93b2985064dfaacb2e40fbfd91a43facef1bf864b1d5d08\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180a0fe0f8e58dd223d1fb79b13e121175003f1658276dfb8182293dc67cfd93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14dd0228ee9bc7129dabf56c5d37c7ac0580486a2708da30d6e0bc61c255abfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:08Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.131713 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc4
1e477c91ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:08Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.146692 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:08Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.166166 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6af
f51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:08Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.183551 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e529e02dd0b8f5c8da36f38bbb8a040fb48a28444da477652706b9d07793878c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:16:03Z\\\",\\\"message\\\":\\\"2025-12-02T18:15:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61c0c50e-aff8-4323-bd04-2d4a00a9593b\\\\n2025-12-02T18:15:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61c0c50e-aff8-4323-bd04-2d4a00a9593b to /host/opt/cni/bin/\\\\n2025-12-02T18:15:18Z [verbose] multus-daemon started\\\\n2025-12-02T18:15:18Z [verbose] 
Readiness Indicator file check\\\\n2025-12-02T18:16:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:16:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:08Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.197701 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33175bd6-9016-4c27-a8e6-d96f75e9187c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://261ba12597a9d2c536a581ebb211573ff8a502bacefbeb7510d52075e6392b56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50cf37f029d8f77c6ef633b7badd2df76b31d
e67a881ee85c84bd1d4bc92e240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gndfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:08Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.212441 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.212496 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.212514 4878 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.212543 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.212563 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:08Z","lastTransitionTime":"2025-12-02T18:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.216176 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:08Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.231877 4878 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:08Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.247656 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:08Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.262086 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dlwt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09adc15b-14dd-4a05-b569-4168b9ced169\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dlwt8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:08Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.315323 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.315405 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.315423 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.315822 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.315873 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:08Z","lastTransitionTime":"2025-12-02T18:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.419006 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.419073 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.419090 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.419120 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.419139 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:08Z","lastTransitionTime":"2025-12-02T18:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.522367 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.522426 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.522444 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.522467 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.522484 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:08Z","lastTransitionTime":"2025-12-02T18:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.625490 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.625532 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.625541 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.625554 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.625562 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:08Z","lastTransitionTime":"2025-12-02T18:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.729346 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.729415 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.729436 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.729464 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.729482 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:08Z","lastTransitionTime":"2025-12-02T18:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.833205 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.833285 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.833303 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.833327 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.833346 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:08Z","lastTransitionTime":"2025-12-02T18:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.936961 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:08 crc kubenswrapper[4878]: E1202 18:16:08.937219 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.937262 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.937307 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.937325 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.937346 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:08 crc kubenswrapper[4878]: I1202 18:16:08.937363 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:08Z","lastTransitionTime":"2025-12-02T18:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.040380 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.040455 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.040475 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.040504 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.040524 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:09Z","lastTransitionTime":"2025-12-02T18:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.145168 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.145282 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.145307 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.145328 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.145341 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:09Z","lastTransitionTime":"2025-12-02T18:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.248539 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.248637 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.248652 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.248671 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.248685 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:09Z","lastTransitionTime":"2025-12-02T18:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.351065 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.351120 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.351132 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.351152 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.351164 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:09Z","lastTransitionTime":"2025-12-02T18:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.454392 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.454472 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.454496 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.454525 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.454548 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:09Z","lastTransitionTime":"2025-12-02T18:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.557886 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.557948 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.557964 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.557991 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.558010 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:09Z","lastTransitionTime":"2025-12-02T18:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.661781 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.661871 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.661894 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.661929 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.661951 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:09Z","lastTransitionTime":"2025-12-02T18:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.764871 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.764935 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.764951 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.764976 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.764993 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:09Z","lastTransitionTime":"2025-12-02T18:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.867614 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.867690 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.867709 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.867736 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.867758 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:09Z","lastTransitionTime":"2025-12-02T18:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.936771 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.936823 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:09 crc kubenswrapper[4878]: E1202 18:16:09.936986 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.937031 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:09 crc kubenswrapper[4878]: E1202 18:16:09.937282 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:09 crc kubenswrapper[4878]: E1202 18:16:09.937420 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.970759 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.970837 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.970856 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.970888 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:09 crc kubenswrapper[4878]: I1202 18:16:09.970916 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:09Z","lastTransitionTime":"2025-12-02T18:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.074479 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.074599 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.074697 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.074791 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.074812 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:10Z","lastTransitionTime":"2025-12-02T18:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.178164 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.178230 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.178317 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.178345 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.178363 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:10Z","lastTransitionTime":"2025-12-02T18:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.281073 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.281141 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.281164 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.281195 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.281219 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:10Z","lastTransitionTime":"2025-12-02T18:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.384661 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.384714 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.384724 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.384744 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.384758 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:10Z","lastTransitionTime":"2025-12-02T18:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.488981 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.489040 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.489057 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.489080 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.489093 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:10Z","lastTransitionTime":"2025-12-02T18:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.591903 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.591965 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.591984 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.592036 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.592054 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:10Z","lastTransitionTime":"2025-12-02T18:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.694716 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.694761 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.694771 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.694792 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.694802 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:10Z","lastTransitionTime":"2025-12-02T18:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.797199 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.797337 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.797358 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.797390 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.797408 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:10Z","lastTransitionTime":"2025-12-02T18:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.901104 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.901143 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.901154 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.901170 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.901183 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:10Z","lastTransitionTime":"2025-12-02T18:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.937504 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:10 crc kubenswrapper[4878]: E1202 18:16:10.938204 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.956958 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\
\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wherea
bouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:10Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.976230 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e529e02dd0b8f5c8da36f38bbb8a040fb48a28444da477652706b9d07793878c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:16:03Z\\\",\\\"message\\\":\\\"2025-12-02T18:15:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61c0c50e-aff8-4323-bd04-2d4a00a9593b\\\\n2025-12-02T18:15:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61c0c50e-aff8-4323-bd04-2d4a00a9593b to /host/opt/cni/bin/\\\\n2025-12-02T18:15:18Z [verbose] multus-daemon started\\\\n2025-12-02T18:15:18Z [verbose] Readiness Indicator file check\\\\n2025-12-02T18:16:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:16:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:10Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:10 crc kubenswrapper[4878]: I1202 18:16:10.994657 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33175bd6-9016-4c27-a8e6-d96f75e9187c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://261ba12597a9d2c536a581ebb211573ff8a502bacefbeb7510d52075e6392b56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50cf37f029d8f77c6ef633b7badd2df76b31de67a881ee85c84bd1d4bc92e240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gndfz\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:10Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.004558 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.004641 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.004659 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.005326 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.005361 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:11Z","lastTransitionTime":"2025-12-02T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.013134 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd5151
43612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:11Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.027409 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:16:11Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.041739 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:11Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.054651 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:11Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.068628 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:11Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.081941 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dlwt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09adc15b-14dd-4a05-b569-4168b9ced169\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dlwt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:11Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc 
kubenswrapper[4878]: I1202 18:16:11.085959 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.086059 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.086071 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.086091 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.086103 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:11Z","lastTransitionTime":"2025-12-02T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:11 crc kubenswrapper[4878]: E1202 18:16:11.101408 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:11Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.108914 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.109064 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.109084 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.109108 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.109129 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:11Z","lastTransitionTime":"2025-12-02T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.115601 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:11Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc kubenswrapper[4878]: E1202 18:16:11.124333 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:11Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.128617 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.128674 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.128687 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.128712 
4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.128724 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:11Z","lastTransitionTime":"2025-12-02T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.132406 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:11Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc kubenswrapper[4878]: E1202 18:16:11.141590 4878 kubelet_node_status.go:585] "Error 
updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:11Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.146515 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.146570 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.146581 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.146604 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.146617 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:11Z","lastTransitionTime":"2025-12-02T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.153090 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:11Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc kubenswrapper[4878]: E1202 18:16:11.158606 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:11Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.162160 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.162192 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.162201 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.162218 4878 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.162229 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:11Z","lastTransitionTime":"2025-12-02T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.170398 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:11Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc kubenswrapper[4878]: E1202 18:16:11.174415 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"message\\\":\\\"kubelet has no 
disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fb
a3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98
100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"
quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e90
19a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd8
8fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:11Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc kubenswrapper[4878]: E1202 18:16:11.174589 4878 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.176495 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.176537 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.176551 4878 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.176572 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.176584 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:11Z","lastTransitionTime":"2025-12-02T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.190162 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:52Z\\\",\\\"message\\\":\\\" 18:15:52.125384 6607 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1202 18:15:52.124969 6607 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}\\\\nI1202 18:15:52.125392 6607 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1202 18:15:52.125397 6607 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1202 18:15:52.125157 6607 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-fnpmk\\\\nI1202 18:15:52.125404 6607 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 1.341744ms\\\\nI1202 18:15:52.125443 6607 services_controller.go:356] Processing sync for service default/openshift for network=default\\\\nI1202 18:15:52.125456 6607 services_controller.go:360] Finished syncing service openshift on namespace default for network=default : 12.93µs\\\\nI1202 18:15:52.125465 6607 services_controller.go:356] Processing sync for service openshift-network-console/networking-console-plugin for n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5jzn_openshift-ovn-kubernetes(d160cfa4-9e2a-429d-b760-0cac6d467b9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f01
84749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:11Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.205365 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec3
0733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:11Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.219270 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d236cea-cb62-4f00-aeec-1667a3de5118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c85ca57d995e9ffb93b2985064dfaacb2e40fbfd91a43facef1bf864b1d5d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180a0fe0f8e58dd223d1fb79b13e121175003f1658276dfb8182293dc67cfd93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14dd0228ee9bc7129dabf56c5d37c7ac0580486a2708da30d6e0bc61c255abfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:11Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.233586 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:11Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.245429 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:11Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.280276 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.280317 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.280329 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.280351 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.280368 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:11Z","lastTransitionTime":"2025-12-02T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.383120 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.383163 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.383175 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.383197 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.383213 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:11Z","lastTransitionTime":"2025-12-02T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.485401 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.485447 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.485462 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.485484 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.485500 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:11Z","lastTransitionTime":"2025-12-02T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.588017 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.588054 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.588066 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.588085 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.588098 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:11Z","lastTransitionTime":"2025-12-02T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.690539 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.690590 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.690608 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.690631 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.690651 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:11Z","lastTransitionTime":"2025-12-02T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.792803 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.792846 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.792856 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.792869 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.792878 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:11Z","lastTransitionTime":"2025-12-02T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.896572 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.896648 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.896668 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.896693 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.896713 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:11Z","lastTransitionTime":"2025-12-02T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.937474 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.937549 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:11 crc kubenswrapper[4878]: E1202 18:16:11.937608 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:11 crc kubenswrapper[4878]: E1202 18:16:11.937747 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.937861 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:11 crc kubenswrapper[4878]: E1202 18:16:11.937970 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.998153 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.998181 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.998190 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.998204 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:11 crc kubenswrapper[4878]: I1202 18:16:11.998214 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:11Z","lastTransitionTime":"2025-12-02T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.101811 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.101869 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.101887 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.101915 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.101933 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:12Z","lastTransitionTime":"2025-12-02T18:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.206985 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.207062 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.207080 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.207107 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.207124 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:12Z","lastTransitionTime":"2025-12-02T18:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.310762 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.310840 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.310859 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.310884 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.310903 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:12Z","lastTransitionTime":"2025-12-02T18:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.413826 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.413907 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.413925 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.413952 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.413970 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:12Z","lastTransitionTime":"2025-12-02T18:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.517981 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.518084 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.518099 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.518122 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.518135 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:12Z","lastTransitionTime":"2025-12-02T18:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.620497 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.620565 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.620579 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.620601 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.620617 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:12Z","lastTransitionTime":"2025-12-02T18:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.723701 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.723756 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.723770 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.723787 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.723801 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:12Z","lastTransitionTime":"2025-12-02T18:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.827394 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.827476 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.827499 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.827531 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.827550 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:12Z","lastTransitionTime":"2025-12-02T18:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.930826 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.930877 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.930888 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.930906 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.930918 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:12Z","lastTransitionTime":"2025-12-02T18:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:12 crc kubenswrapper[4878]: I1202 18:16:12.936832 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:12 crc kubenswrapper[4878]: E1202 18:16:12.936988 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.033474 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.033540 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.033577 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.033601 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.033618 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:13Z","lastTransitionTime":"2025-12-02T18:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.136413 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.136466 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.136491 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.136515 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.136532 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:13Z","lastTransitionTime":"2025-12-02T18:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.239586 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.239693 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.239732 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.239763 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.239784 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:13Z","lastTransitionTime":"2025-12-02T18:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.342642 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.342709 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.342726 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.342752 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.342771 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:13Z","lastTransitionTime":"2025-12-02T18:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.445131 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.445184 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.445195 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.445215 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.445227 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:13Z","lastTransitionTime":"2025-12-02T18:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.548105 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.548173 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.548192 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.548219 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.548267 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:13Z","lastTransitionTime":"2025-12-02T18:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.651809 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.651875 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.651895 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.651920 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.651938 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:13Z","lastTransitionTime":"2025-12-02T18:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.754382 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.754478 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.754503 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.754536 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.754559 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:13Z","lastTransitionTime":"2025-12-02T18:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.856940 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.856979 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.856988 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.857002 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.857015 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:13Z","lastTransitionTime":"2025-12-02T18:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.937705 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.937836 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.937881 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:13 crc kubenswrapper[4878]: E1202 18:16:13.938070 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:13 crc kubenswrapper[4878]: E1202 18:16:13.938216 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:13 crc kubenswrapper[4878]: E1202 18:16:13.938419 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.960601 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.960651 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.960663 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.960682 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:13 crc kubenswrapper[4878]: I1202 18:16:13.960696 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:13Z","lastTransitionTime":"2025-12-02T18:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.062913 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.063006 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.063024 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.063049 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.063065 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:14Z","lastTransitionTime":"2025-12-02T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.166008 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.166084 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.166104 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.166132 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.166151 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:14Z","lastTransitionTime":"2025-12-02T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.270174 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.270252 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.270266 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.270287 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.270302 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:14Z","lastTransitionTime":"2025-12-02T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.373227 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.373296 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.373308 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.373327 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.373341 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:14Z","lastTransitionTime":"2025-12-02T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.476727 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.476806 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.476821 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.476840 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.476852 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:14Z","lastTransitionTime":"2025-12-02T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.580110 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.580204 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.580219 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.580260 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.580276 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:14Z","lastTransitionTime":"2025-12-02T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.683586 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.683645 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.683656 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.683674 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.683686 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:14Z","lastTransitionTime":"2025-12-02T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.786478 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.786530 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.786540 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.786555 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.786564 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:14Z","lastTransitionTime":"2025-12-02T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.889711 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.889798 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.889811 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.889833 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.889846 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:14Z","lastTransitionTime":"2025-12-02T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.937032 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:14 crc kubenswrapper[4878]: E1202 18:16:14.937264 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.992198 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.992257 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.992266 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.992283 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:14 crc kubenswrapper[4878]: I1202 18:16:14.992293 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:14Z","lastTransitionTime":"2025-12-02T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.095347 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.095381 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.095391 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.095429 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.095442 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:15Z","lastTransitionTime":"2025-12-02T18:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.198840 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.198896 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.198915 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.198938 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.198955 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:15Z","lastTransitionTime":"2025-12-02T18:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.302775 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.302860 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.302874 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.302894 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.302906 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:15Z","lastTransitionTime":"2025-12-02T18:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.406747 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.406813 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.406839 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.406874 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.406900 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:15Z","lastTransitionTime":"2025-12-02T18:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.510863 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.510945 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.510967 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.510997 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.511021 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:15Z","lastTransitionTime":"2025-12-02T18:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.614636 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.614694 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.614706 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.614722 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.614735 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:15Z","lastTransitionTime":"2025-12-02T18:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.717736 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.717804 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.717829 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.717862 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.717887 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:15Z","lastTransitionTime":"2025-12-02T18:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.820573 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.820638 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.820651 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.820670 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.820681 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:15Z","lastTransitionTime":"2025-12-02T18:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.923714 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.923764 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.923777 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.923797 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.923813 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:15Z","lastTransitionTime":"2025-12-02T18:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.937015 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.937118 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:15 crc kubenswrapper[4878]: I1202 18:16:15.937165 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:15 crc kubenswrapper[4878]: E1202 18:16:15.937359 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:15 crc kubenswrapper[4878]: E1202 18:16:15.937461 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:15 crc kubenswrapper[4878]: E1202 18:16:15.937554 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.026446 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.026500 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.026511 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.026531 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.026543 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:16Z","lastTransitionTime":"2025-12-02T18:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.128683 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.128725 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.128734 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.128749 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.128758 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:16Z","lastTransitionTime":"2025-12-02T18:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.233130 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.233179 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.233192 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.233214 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.233228 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:16Z","lastTransitionTime":"2025-12-02T18:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.336593 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.336646 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.336657 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.336680 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.336693 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:16Z","lastTransitionTime":"2025-12-02T18:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.439778 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.439845 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.439874 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.439899 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.439920 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:16Z","lastTransitionTime":"2025-12-02T18:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.542688 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.542781 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.542798 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.542821 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.542838 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:16Z","lastTransitionTime":"2025-12-02T18:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.650742 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.650810 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.650828 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.650879 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.650898 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:16Z","lastTransitionTime":"2025-12-02T18:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.754608 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.754670 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.754682 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.754706 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.754719 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:16Z","lastTransitionTime":"2025-12-02T18:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.857833 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.857888 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.857905 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.857932 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.857952 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:16Z","lastTransitionTime":"2025-12-02T18:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.937556 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:16 crc kubenswrapper[4878]: E1202 18:16:16.937723 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.960435 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.960502 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.960528 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.960558 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:16 crc kubenswrapper[4878]: I1202 18:16:16.960584 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:16Z","lastTransitionTime":"2025-12-02T18:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.063974 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.064032 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.064047 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.064066 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.064080 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:17Z","lastTransitionTime":"2025-12-02T18:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.166933 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.166994 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.167010 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.167029 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.167039 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:17Z","lastTransitionTime":"2025-12-02T18:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.270707 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.270762 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.270773 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.270795 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.270813 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:17Z","lastTransitionTime":"2025-12-02T18:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.374063 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.374113 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.374127 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.374150 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.374163 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:17Z","lastTransitionTime":"2025-12-02T18:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.477544 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.477602 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.477620 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.477651 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.477671 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:17Z","lastTransitionTime":"2025-12-02T18:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.580790 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.580848 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.580861 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.580889 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.580906 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:17Z","lastTransitionTime":"2025-12-02T18:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.683738 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.683796 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.683810 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.683831 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.683849 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:17Z","lastTransitionTime":"2025-12-02T18:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.786662 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.786710 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.786723 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.786744 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.786759 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:17Z","lastTransitionTime":"2025-12-02T18:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.792186 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:17 crc kubenswrapper[4878]: E1202 18:16:17.792376 4878 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 18:16:17 crc kubenswrapper[4878]: E1202 18:16:17.792440 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 18:17:21.792421748 +0000 UTC m=+151.482040629 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.889714 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.889786 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.889809 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.889836 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.889853 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:17Z","lastTransitionTime":"2025-12-02T18:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.893113 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.893255 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.893288 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.893324 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:17 crc kubenswrapper[4878]: E1202 18:16:17.893426 4878 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 18:16:17 crc kubenswrapper[4878]: 
E1202 18:16:17.893461 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:21.893423476 +0000 UTC m=+151.583042387 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:16:17 crc kubenswrapper[4878]: E1202 18:16:17.893506 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 18:17:21.893483298 +0000 UTC m=+151.583102339 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 18:16:17 crc kubenswrapper[4878]: E1202 18:16:17.893563 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 18:16:17 crc kubenswrapper[4878]: E1202 18:16:17.893575 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 18:16:17 crc kubenswrapper[4878]: E1202 18:16:17.893634 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 18:16:17 crc kubenswrapper[4878]: E1202 18:16:17.893660 4878 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:16:17 crc kubenswrapper[4878]: E1202 18:16:17.893584 4878 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 18:16:17 crc kubenswrapper[4878]: E1202 18:16:17.893710 4878 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:16:17 crc kubenswrapper[4878]: E1202 18:16:17.893778 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 18:17:21.893754668 +0000 UTC m=+151.583373769 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:16:17 crc kubenswrapper[4878]: E1202 18:16:17.893817 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 18:17:21.893801459 +0000 UTC m=+151.583420560 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.937636 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.937684 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.937636 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:17 crc kubenswrapper[4878]: E1202 18:16:17.937853 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:17 crc kubenswrapper[4878]: E1202 18:16:17.938103 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:17 crc kubenswrapper[4878]: E1202 18:16:17.938183 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.993028 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.993085 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.993100 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.993125 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:17 crc kubenswrapper[4878]: I1202 18:16:17.993145 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:17Z","lastTransitionTime":"2025-12-02T18:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.096289 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.096371 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.096384 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.096407 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.096422 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:18Z","lastTransitionTime":"2025-12-02T18:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.199615 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.199684 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.199703 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.199728 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.199767 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:18Z","lastTransitionTime":"2025-12-02T18:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.302705 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.302767 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.302778 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.302864 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.302882 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:18Z","lastTransitionTime":"2025-12-02T18:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.406493 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.406541 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.406553 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.406572 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.406586 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:18Z","lastTransitionTime":"2025-12-02T18:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.509351 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.509436 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.509461 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.509486 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.509505 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:18Z","lastTransitionTime":"2025-12-02T18:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.612086 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.612157 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.612182 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.612213 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.612271 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:18Z","lastTransitionTime":"2025-12-02T18:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.714683 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.714721 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.714732 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.714752 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.714767 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:18Z","lastTransitionTime":"2025-12-02T18:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.823201 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.823313 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.823333 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.823359 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.823376 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:18Z","lastTransitionTime":"2025-12-02T18:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.926358 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.926424 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.926442 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.926467 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.926484 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:18Z","lastTransitionTime":"2025-12-02T18:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:18 crc kubenswrapper[4878]: I1202 18:16:18.937797 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:18 crc kubenswrapper[4878]: E1202 18:16:18.937963 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.028792 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.028847 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.028871 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.028894 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.028910 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:19Z","lastTransitionTime":"2025-12-02T18:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.131807 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.131840 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.131849 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.131862 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.131871 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:19Z","lastTransitionTime":"2025-12-02T18:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.236012 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.236093 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.236110 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.236136 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.236154 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:19Z","lastTransitionTime":"2025-12-02T18:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.340152 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.340208 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.340227 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.340288 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.340323 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:19Z","lastTransitionTime":"2025-12-02T18:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.443986 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.444311 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.444320 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.444337 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.444346 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:19Z","lastTransitionTime":"2025-12-02T18:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.547949 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.548022 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.548047 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.548083 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.548108 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:19Z","lastTransitionTime":"2025-12-02T18:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.651107 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.651175 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.651194 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.651224 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.651271 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:19Z","lastTransitionTime":"2025-12-02T18:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.754310 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.754383 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.754394 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.754410 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.754420 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:19Z","lastTransitionTime":"2025-12-02T18:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.856880 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.856916 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.856927 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.856943 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.856955 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:19Z","lastTransitionTime":"2025-12-02T18:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.938502 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.938595 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.938641 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.938671 4878 scope.go:117] "RemoveContainer" containerID="1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5" Dec 02 18:16:19 crc kubenswrapper[4878]: E1202 18:16:19.938824 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:19 crc kubenswrapper[4878]: E1202 18:16:19.939004 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:19 crc kubenswrapper[4878]: E1202 18:16:19.939202 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.960228 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.960297 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.960315 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.960339 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:19 crc kubenswrapper[4878]: I1202 18:16:19.960356 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:19Z","lastTransitionTime":"2025-12-02T18:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.063012 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.063054 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.063068 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.063085 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.063099 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:20Z","lastTransitionTime":"2025-12-02T18:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.166963 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.167045 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.167068 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.167099 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.167124 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:20Z","lastTransitionTime":"2025-12-02T18:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.270060 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.270106 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.270116 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.270133 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.270144 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:20Z","lastTransitionTime":"2025-12-02T18:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.373653 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.373701 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.373712 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.373732 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.373745 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:20Z","lastTransitionTime":"2025-12-02T18:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.467576 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovnkube-controller/2.log" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.470597 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerStarted","Data":"91c9c1ddbfb74122a2685e40a0839a5f223260393bdb0c49fed0d027d54c7d01"} Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.471095 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.475962 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.476018 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.476037 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.476060 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.476088 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:20Z","lastTransitionTime":"2025-12-02T18:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.488619 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.504198 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.530176 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.553391 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.578714 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.578774 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.578786 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.578807 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.578819 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:20Z","lastTransitionTime":"2025-12-02T18:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.579021 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91c9c1ddbfb74122a2685e40a0839a5f223260393bdb0c49fed0d027d54c7d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:52Z\\\",\\\"message\\\":\\\" 18:15:52.125384 6607 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1202 18:15:52.124969 6607 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}\\\\nI1202 18:15:52.125392 6607 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1202 18:15:52.125397 6607 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1202 18:15:52.125157 6607 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-fnpmk\\\\nI1202 18:15:52.125404 6607 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 1.341744ms\\\\nI1202 18:15:52.125443 6607 services_controller.go:356] Processing sync for service default/openshift for network=default\\\\nI1202 18:15:52.125456 6607 services_controller.go:360] Finished syncing service openshift on namespace default for network=default : 12.93µs\\\\nI1202 18:15:52.125465 6607 services_controller.go:356] Processing sync for service openshift-network-console/networking-console-plugin for 
n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:16:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.594775 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec3
0733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.608270 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d236cea-cb62-4f00-aeec-1667a3de5118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c85ca57d995e9ffb93b2985064dfaacb2e40fbfd91a43facef1bf864b1d5d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180a0fe0f8e58dd223d1fb79b13e121175003f1658276dfb8182293dc67cfd93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14dd0228ee9bc7129dabf56c5d37c7ac0580486a2708da30d6e0bc61c255abfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.620203 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.630708 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.647462 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33175bd6-9016-4c27-a8e6-d96f75e9187c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://261ba12597a9d2c536a581ebb211573ff8a502bacefbeb7510d52075e6392b56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50cf37f029d8f77c6ef633b7badd2df76b31de67a881ee85c84bd1d4bc92e240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gndfz\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.664041 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:1
4:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fc
a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.679772 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.681747 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.681798 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.681812 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.681829 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.681841 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:20Z","lastTransitionTime":"2025-12-02T18:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.697617 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.710793 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.724605 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.740753 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6af
f51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.758160 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e529e02dd0b8f5c8da36f38bbb8a040fb48a28444da477652706b9d07793878c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:16:03Z\\\",\\\"message\\\":\\\"2025-12-02T18:15:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61c0c50e-aff8-4323-bd04-2d4a00a9593b\\\\n2025-12-02T18:15:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61c0c50e-aff8-4323-bd04-2d4a00a9593b to /host/opt/cni/bin/\\\\n2025-12-02T18:15:18Z [verbose] multus-daemon started\\\\n2025-12-02T18:15:18Z [verbose] 
Readiness Indicator file check\\\\n2025-12-02T18:16:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:16:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.775992 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dlwt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09adc15b-14dd-4a05-b569-4168b9ced169\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dlwt8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.785424 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.785468 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.785481 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.785498 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.785510 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:20Z","lastTransitionTime":"2025-12-02T18:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.887531 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.887590 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.887603 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.887623 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.887640 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:20Z","lastTransitionTime":"2025-12-02T18:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.937533 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:20 crc kubenswrapper[4878]: E1202 18:16:20.937692 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.949040 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.962776 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc
33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f24
9e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.980517 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e529e02dd0b8f5c8da36f38bbb8a040fb48a28444da477652706b9d07793878c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:16:03Z\\\",\\\"message\\\":\\\"2025-12-02T18:15:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61c0c50e-aff8-4323-bd04-2d4a00a9593b\\\\n2025-12-02T18:15:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61c0c50e-aff8-4323-bd04-2d4a00a9593b to /host/opt/cni/bin/\\\\n2025-12-02T18:15:18Z [verbose] multus-daemon started\\\\n2025-12-02T18:15:18Z [verbose] 
Readiness Indicator file check\\\\n2025-12-02T18:16:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:16:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.989914 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.989946 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.989957 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.989973 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.989985 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:20Z","lastTransitionTime":"2025-12-02T18:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:20 crc kubenswrapper[4878]: I1202 18:16:20.995173 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33175bd6-9016-4c27-a8e6-d96f75e9187c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://261ba12597a9d2c536a581ebb211573ff8a502bacefbeb7510d52075e6392b56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50cf37f029d8f77c6ef633b7badd2df76b31de67a881ee85c84bd1d4bc92e240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gndfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:20Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.006912 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.018720 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.032669 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.043945 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.059552 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dlwt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09adc15b-14dd-4a05-b569-4168b9ced169\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dlwt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc 
kubenswrapper[4878]: I1202 18:16:21.076155 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.092415 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.092461 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.092473 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.092491 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.092504 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:21Z","lastTransitionTime":"2025-12-02T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.094706 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.121284 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.141160 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.172958 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91c9c1ddbfb74122a2685e40a0839a5f223260393bdb0c49fed0d027d54c7d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:52Z\\\",\\\"message\\\":\\\" 18:15:52.125384 6607 ovn.go:134] Ensuring zone local for Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1202 18:15:52.124969 6607 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}\\\\nI1202 18:15:52.125392 6607 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1202 18:15:52.125397 6607 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1202 18:15:52.125157 6607 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-fnpmk\\\\nI1202 18:15:52.125404 6607 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 1.341744ms\\\\nI1202 18:15:52.125443 6607 services_controller.go:356] Processing sync for service default/openshift for network=default\\\\nI1202 18:15:52.125456 6607 services_controller.go:360] Finished syncing service openshift on namespace default for network=default : 12.93µs\\\\nI1202 18:15:52.125465 6607 services_controller.go:356] Processing sync for service openshift-network-console/networking-console-plugin for 
n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:16:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.190455 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.195899 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.195950 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.195961 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.195979 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.195991 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:21Z","lastTransitionTime":"2025-12-02T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.213057 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.233364 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d236cea-cb62-4f00-aeec-1667a3de5118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c85ca57d995e9ffb93b2985064dfaacb2e40fbfd91a43facef1bf864b1d5d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180a0fe0f8e58dd223d1fb79b13e121175003f1658276dfb8182293dc67cfd93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14dd0228ee9bc7129dabf56c5d37c7ac0580486a2708da30d6e0bc61c255abfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.253131 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.298903 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.298951 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.298963 4878 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.298982 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.298997 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:21Z","lastTransitionTime":"2025-12-02T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.401483 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.401554 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.401567 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.401611 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.401624 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:21Z","lastTransitionTime":"2025-12-02T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.460263 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.460344 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.460356 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.460373 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.460383 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:21Z","lastTransitionTime":"2025-12-02T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.475703 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovnkube-controller/3.log" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.476372 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovnkube-controller/2.log" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.480028 4878 generic.go:334] "Generic (PLEG): container finished" podID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerID="91c9c1ddbfb74122a2685e40a0839a5f223260393bdb0c49fed0d027d54c7d01" exitCode=1 Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.480075 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerDied","Data":"91c9c1ddbfb74122a2685e40a0839a5f223260393bdb0c49fed0d027d54c7d01"} Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.480147 4878 scope.go:117] "RemoveContainer" containerID="1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5" Dec 02 18:16:21 crc kubenswrapper[4878]: E1202 18:16:21.480378 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.482860 4878 scope.go:117] "RemoveContainer" containerID="91c9c1ddbfb74122a2685e40a0839a5f223260393bdb0c49fed0d027d54c7d01" Dec 02 18:16:21 crc kubenswrapper[4878]: E1202 18:16:21.483124 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-x5jzn_openshift-ovn-kubernetes(d160cfa4-9e2a-429d-b760-0cac6d467b9a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.489834 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.489871 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.489884 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.489900 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.489910 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:21Z","lastTransitionTime":"2025-12-02T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:21 crc kubenswrapper[4878]: E1202 18:16:21.503789 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.508651 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.509745 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.509782 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.509794 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.509812 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.509827 4878 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:21Z","lastTransitionTime":"2025-12-02T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:21 crc kubenswrapper[4878]: E1202 18:16:21.522725 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.526143 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909
e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.528436 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.528492 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.528510 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.528534 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.528556 4878 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:21Z","lastTransitionTime":"2025-12-02T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:21 crc kubenswrapper[4878]: E1202 18:16:21.546738 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.546768 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metric
s-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-contro
ller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91c9c1ddbfb74122a2685e40a0839a5f223260393bdb0c49fed0d027d54c7d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d628ea5ed2cf182cb69068c36259e502ca31a1f38ec98e28ab08f2e25f8a5f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:15:52Z\\\",\\\"message\\\":\\\" 18:15:52.125384 6607 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1202 18:15:52.124969 6607 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}\\\\nI1202 18:15:52.125392 6607 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1202 18:15:52.125397 6607 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1202 18:15:52.125157 6607 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-fnpmk\\\\nI1202 18:15:52.125404 6607 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 1.341744ms\\\\nI1202 18:15:52.125443 6607 services_controller.go:356] Processing sync for service default/openshift for network=default\\\\nI1202 18:15:52.125456 6607 services_controller.go:360] Finished syncing service openshift on namespace default for network=default : 12.93µs\\\\nI1202 18:15:52.125465 6607 services_controller.go:356] Processing sync for service openshift-network-console/networking-console-plugin for n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91c9c1ddbfb74122a2685e40a0839a5f223260393bdb0c49fed0d027d54c7d01\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"bernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:16:20.806560 7018 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 18:16:20.806782 7018 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 18:16:20.806943 7018 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1202 18:16:20.807171 7018 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 18:16:20.807318 7018 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:16:20.807653 7018 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 18:16:20.807857 7018 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:16:20.807983 7018 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 18:16:20.808420 7018 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:16:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"moun
tPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.550704 4878 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.550753 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.550765 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.550783 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.550795 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:21Z","lastTransitionTime":"2025-12-02T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.563633 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: E1202 18:16:21.566220 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"eec7cc2e-918f-4f16-92ea-f02d5b5d5466\\\",\\\"systemUUID\\\":\\\"9efa06d4-630a-45f6-aefd-96e578b112dc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: E1202 18:16:21.566385 4878 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.568567 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.568624 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.568634 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.568651 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.568662 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:21Z","lastTransitionTime":"2025-12-02T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.582757 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d236cea-cb62-4f00-aeec-1667a3de5118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c85ca57d995e9ffb93b2985064dfaacb2e40fbfd91a43facef1bf864b1d5d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180a0fe0f8e58dd223d1fb79b13e12
1175003f1658276dfb8182293dc67cfd93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14dd0228ee9bc7129dabf56c5d37c7ac0580486a2708da30d6e0bc61c255abfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.602877 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.619835 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.639284 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e529e02dd0b8f5c8da36f38bbb8a040fb48a28444da477652706b9d07793878c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:16:03Z\\\",\\\"message\\\":\\\"2025-12-02T18:15:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61c0c50e-aff8-4323-bd04-2d4a00a9593b\\\\n2025-12-02T18:15:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61c0c50e-aff8-4323-bd04-2d4a00a9593b to /host/opt/cni/bin/\\\\n2025-12-02T18:15:18Z [verbose] multus-daemon started\\\\n2025-12-02T18:15:18Z [verbose] Readiness Indicator file check\\\\n2025-12-02T18:16:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:16:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.653224 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33175bd6-9016-4c27-a8e6-d96f75e9187c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://261ba12597a9d2c536a581ebb211573ff8a502bacefbeb7510d52075e6392b56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50cf37f029d8f77c6ef633b7badd2df76b31d
e67a881ee85c84bd1d4bc92e240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gndfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.670635 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.671898 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.671961 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.671985 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.672015 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.672037 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:21Z","lastTransitionTime":"2025-12-02T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.690337 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.707841 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.726120 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.741268 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.765174 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6af
f51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.775905 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.775971 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.775996 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.776030 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.776053 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:21Z","lastTransitionTime":"2025-12-02T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.781051 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dlwt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09adc15b-14dd-4a05-b569-4168b9ced169\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dlwt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc 
kubenswrapper[4878]: I1202 18:16:21.801522 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.824844 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:21Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.878882 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.878953 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.878972 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.878998 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.879016 4878 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:21Z","lastTransitionTime":"2025-12-02T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.937204 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.937333 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.937355 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:21 crc kubenswrapper[4878]: E1202 18:16:21.937626 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:21 crc kubenswrapper[4878]: E1202 18:16:21.937699 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:21 crc kubenswrapper[4878]: E1202 18:16:21.937799 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.960329 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.981152 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.981198 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.981208 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.981228 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:21 crc kubenswrapper[4878]: I1202 18:16:21.981258 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:21Z","lastTransitionTime":"2025-12-02T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.084554 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.084624 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.084642 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.084672 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.084689 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:22Z","lastTransitionTime":"2025-12-02T18:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.188306 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.188368 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.188384 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.188410 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.188424 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:22Z","lastTransitionTime":"2025-12-02T18:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.291800 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.291861 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.291873 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.291901 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.291918 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:22Z","lastTransitionTime":"2025-12-02T18:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.395291 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.395364 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.395382 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.395411 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.395429 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:22Z","lastTransitionTime":"2025-12-02T18:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.486773 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovnkube-controller/3.log" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.492307 4878 scope.go:117] "RemoveContainer" containerID="91c9c1ddbfb74122a2685e40a0839a5f223260393bdb0c49fed0d027d54c7d01" Dec 02 18:16:22 crc kubenswrapper[4878]: E1202 18:16:22.492534 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-x5jzn_openshift-ovn-kubernetes(d160cfa4-9e2a-429d-b760-0cac6d467b9a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.498462 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.498736 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.498964 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.499142 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.499373 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:22Z","lastTransitionTime":"2025-12-02T18:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.512172 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.530558 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6ktxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5653c799-2a0f-4f9e-b719-ffb2642d1207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbba864a38152e12c879c2eb771025bcdad6030cfa5572e396aadee7c508488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hvsq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6ktxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.544648 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52cbe290-1952-424f-abe1-000eb8c3efd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://846bca8513947813249de7f7ae6bab2da477bde17446a4b3714334f57895f121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://974294eb7de37152669af70564facc3363b144bbafbae6cc870ade3c14d7bb77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://974294eb7de37152669af70564facc3363b144bbafbae6cc870ade3c14d7bb77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:22 crc 
kubenswrapper[4878]: I1202 18:16:22.559858 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a3f89192a088
a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 18:15:06.566207 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 18:15:06.567495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2989273037/tls.crt::/tmp/serving-cert-2989273037/tls.key\\\\\\\"\\\\nI1202 18:15:13.019653 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 18:15:13.022765 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 18:15:13.022786 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 18:15:13.022812 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 18:15:13.022829 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 18:15:13.030458 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 18:15:13.030491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030496 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 18:15:13.030503 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 18:15:13.030528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 18:15:13.030533 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 
18:15:13.030536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 18:15:13.030536 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 18:15:13.032642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.573732 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d236cea-cb62-4f00-aeec-1667a3de5118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c85ca57d995e9ffb93b2985064dfaacb2e40fbfd91a43facef1bf864b1d5d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180a0fe0f8e58dd223d1fb79b13e121175003f1658276dfb8182293dc67cfd93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14dd0228ee9bc7129dabf56c5d37c7ac0580486a2708da30d6e0bc61c255abfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://99179b62cfc76c394572b4008ffd5bf32ce3f6f38e6e655cbbbfc20dab8e83bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.588556 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"723bfeea-9234-4d2a-8492-747dc974d044\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396c71822f8200a51ece8c86d497d3aecaa56d3d026cbb561866ef6d1d563945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7709f4d969504362d28f3f061837bbc41e477c9
1ac9e2a7abacbce612a1aa73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d95kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-npvcg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.602396 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9jvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11a20f23-e2bd-4df6-a47f-73b37f11cd8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://887cabde45575dd6eb33a3ef6db5df2053e2b0b7f2f803568f16e11137a73a34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28kvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9jvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.602708 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.602747 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.602757 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.602774 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.602786 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:22Z","lastTransitionTime":"2025-12-02T18:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.625430 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c256c29c-e637-409f-a7b8-42db159198d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4efd838f57de305d25ec6ff5e3135d2b7dc33808035ab2447a8702ae10c7953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac4dcd2e70260385c237b23b5ac50f2d8ee6b32588dd18e50b5b8bec1ec0460b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://955da46b789bba9484b6304c2e416b7f6093f7d9e2b344f5807a1b1dc1be8451\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b053de7267bfcc936b8e6aff6c79792fc721f748a29f41957f7709d244a10c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36235a7cc0beea186ebb981119128efe3b1595c644974ac6c655bdd0378aa5ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19109fa2abf61527200c2aea7ee0d6aff51fa2bf9153d59522e505d2f5f81849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9aeb5ec4165b215f3ab92ff0e7267bcf9a90d7ec2ac3ae30e8a9c43a354bc4a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmkz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fnpmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.644422 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6cm9t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e79a8cec-20ba-4862-ba25-7de014466668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:16:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e529e02dd0b8f5c8da36f38bbb8a040fb48a28444da477652706b9d07793878c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:16:03Z\\\",\\\"message\\\":\\\"2025-12-02T18:15:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61c0c50e-aff8-4323-bd04-2d4a00a9593b\\\\n2025-12-02T18:15:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61c0c50e-aff8-4323-bd04-2d4a00a9593b to /host/opt/cni/bin/\\\\n2025-12-02T18:15:18Z [verbose] multus-daemon started\\\\n2025-12-02T18:15:18Z [verbose] Readiness Indicator file check\\\\n2025-12-02T18:16:03Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:16:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-
lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wsv9h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6cm9t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.659605 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33175bd6-9016-4c27-a8e6-d96f75e9187c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://261ba12597a9d2c536a581ebb211573ff8a502bacefbeb7510d52075e6392b56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50cf37f029d8f77c6ef633b7badd2df76b31d
e67a881ee85c84bd1d4bc92e240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wm9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gndfz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.676852 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14ab3bc1-6725-4861-968a-4af6260e0665\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afb6bbb0d97a33f43d6fe580332c5746d12064e03b4b21c1473091c15d38ef05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49cb99e326e18368d13e888aaa2fa25465856ff96b26c045fbf5158b3f6a4d31\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62146c3ef9680229b98f44aefe350d6a43acaa30a4aa4aa9390a9681fc73fca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T18:14:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.692767 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83f7c205ad0f279c3999408441e5e294ce891a5e5b875c4625243fede4e532e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T18:16:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.707664 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.707713 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.707726 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.707745 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.707754 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:22Z","lastTransitionTime":"2025-12-02T18:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.711073 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.726053 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-dlwt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09adc15b-14dd-4a05-b569-4168b9ced169\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czrs2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:29Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-dlwt8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.742455 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.759967 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2e4a7d01bf8d7314848b18ad9902aab715beda15d0bc253f5d4ca38c9219083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de71fd3814ecda71660587ec5188ee5689158414d85bb5ca09b81033e574a8b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.792878 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab2c5547-2ad1-47f7-90c3-9dee4b8f349f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca339bc6e5c6f8f1a93b2a6e1528d52a5e8cf1dcd3f1526c55e26007610acfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458b2d5a3f10be4b4e014d17875f752f3c24a4111a417a7b5a4b8ae33ae1e62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a8f69bfc3a11e286080590e9d3e42eb05fcb8d9a7a00915d84dfbbcd7b44be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f058ce5e75b8192da69c0393c2db1f07832afbe497109c3b3b2364fd6d19e15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06138c14ba7bee7f90727fd4ba85194c928c45cd687bc58e35993b34af540d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9786d008af293ccee515a85beb87e21c96fb2d879ff0b08fd619dd44aed2f9b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T18:14:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37498620e5a2138615f49a082913cd9a5f6ae31eb162bc088cf42a9b2e99bd64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a18b8e34ffa9d34b21c64d4a1f944a5bc59526dc82376fe9da46bd4677cceac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:14:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:14:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.809981 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.810024 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.810036 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.810053 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.810067 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:22Z","lastTransitionTime":"2025-12-02T18:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.810640 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6be9097e0d9623962ef035a4f111da455338395eb8558bea8bedd1d1d72ec4ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.835493 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T18:15:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91c9c1ddbfb74122a2685e40a0839a5f223260393bdb0c49fed0d027d54c7d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91c9c1ddbfb74122a2685e40a0839a5f223260393bdb0c49fed0d027d54c7d01\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T18:16:21Z\\\",\\\"message\\\":\\\"bernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:16:20.806560 7018 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 18:16:20.806782 7018 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1202 18:16:20.806943 7018 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 18:16:20.807171 7018 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 18:16:20.807318 7018 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:16:20.807653 7018 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 18:16:20.807857 7018 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 18:16:20.807983 7018 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 18:16:20.808420 7018 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T18:16:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-x5jzn_openshift-ovn-kubernetes(d160cfa4-9e2a-429d-b760-0cac6d467b9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:15:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25643a731df4bc3f01
84749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T18:15:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T18:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fzdfp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T18:15:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-x5jzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T18:16:22Z is after 2025-08-24T17:21:41Z" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.912101 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.912172 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.912197 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.912226 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.912275 4878 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:22Z","lastTransitionTime":"2025-12-02T18:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:22 crc kubenswrapper[4878]: I1202 18:16:22.936938 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:22 crc kubenswrapper[4878]: E1202 18:16:22.937155 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.014802 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.014852 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.014865 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.014882 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.014894 4878 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:23Z","lastTransitionTime":"2025-12-02T18:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.117898 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.117962 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.117988 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.118011 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.118026 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:23Z","lastTransitionTime":"2025-12-02T18:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.220555 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.220617 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.220633 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.220652 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.220663 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:23Z","lastTransitionTime":"2025-12-02T18:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.323489 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.323557 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.323585 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.323614 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.323636 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:23Z","lastTransitionTime":"2025-12-02T18:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.427563 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.427620 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.427634 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.427655 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.427669 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:23Z","lastTransitionTime":"2025-12-02T18:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.530119 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.530155 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.530164 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.530178 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.530186 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:23Z","lastTransitionTime":"2025-12-02T18:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.633770 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.633828 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.633841 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.633860 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.633874 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:23Z","lastTransitionTime":"2025-12-02T18:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.737294 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.737361 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.737384 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.737418 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.737443 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:23Z","lastTransitionTime":"2025-12-02T18:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.841182 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.841320 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.841345 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.841375 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.841397 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:23Z","lastTransitionTime":"2025-12-02T18:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.936980 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.937047 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.937064 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:23 crc kubenswrapper[4878]: E1202 18:16:23.937201 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:23 crc kubenswrapper[4878]: E1202 18:16:23.937342 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:23 crc kubenswrapper[4878]: E1202 18:16:23.937487 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.944646 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.944758 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.944777 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.944803 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:23 crc kubenswrapper[4878]: I1202 18:16:23.944819 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:23Z","lastTransitionTime":"2025-12-02T18:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.048212 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.048317 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.048336 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.048370 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.048391 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:24Z","lastTransitionTime":"2025-12-02T18:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.152381 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.152455 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.152474 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.152500 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.152520 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:24Z","lastTransitionTime":"2025-12-02T18:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.255758 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.255805 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.255819 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.255840 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.255852 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:24Z","lastTransitionTime":"2025-12-02T18:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.359213 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.359360 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.359385 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.359415 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.359437 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:24Z","lastTransitionTime":"2025-12-02T18:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.462705 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.462778 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.462801 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.462834 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.462856 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:24Z","lastTransitionTime":"2025-12-02T18:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.565608 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.565656 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.565666 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.565680 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.565692 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:24Z","lastTransitionTime":"2025-12-02T18:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.669443 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.669516 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.669542 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.669571 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.669594 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:24Z","lastTransitionTime":"2025-12-02T18:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.772305 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.772367 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.772390 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.772419 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.772443 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:24Z","lastTransitionTime":"2025-12-02T18:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.875011 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.875074 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.875093 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.875116 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.875135 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:24Z","lastTransitionTime":"2025-12-02T18:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.938680 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:24 crc kubenswrapper[4878]: E1202 18:16:24.939633 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.977743 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.977804 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.977822 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.977847 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:24 crc kubenswrapper[4878]: I1202 18:16:24.977866 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:24Z","lastTransitionTime":"2025-12-02T18:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.080709 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.080753 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.080765 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.080782 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.080794 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:25Z","lastTransitionTime":"2025-12-02T18:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.184698 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.184756 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.184777 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.184801 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.184819 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:25Z","lastTransitionTime":"2025-12-02T18:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.287580 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.287624 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.287635 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.287680 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.287690 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:25Z","lastTransitionTime":"2025-12-02T18:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.391044 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.391133 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.391160 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.391184 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.391202 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:25Z","lastTransitionTime":"2025-12-02T18:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.494538 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.494593 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.494610 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.494635 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.494654 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:25Z","lastTransitionTime":"2025-12-02T18:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.597955 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.598016 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.598039 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.598075 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.598100 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:25Z","lastTransitionTime":"2025-12-02T18:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.702042 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.702103 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.702119 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.702139 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.702149 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:25Z","lastTransitionTime":"2025-12-02T18:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.804068 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.804110 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.804120 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.804133 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.804144 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:25Z","lastTransitionTime":"2025-12-02T18:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.906598 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.906644 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.906653 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.906666 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.906676 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:25Z","lastTransitionTime":"2025-12-02T18:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.937782 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.937838 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:25 crc kubenswrapper[4878]: E1202 18:16:25.938008 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:25 crc kubenswrapper[4878]: E1202 18:16:25.938147 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:25 crc kubenswrapper[4878]: I1202 18:16:25.938218 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:25 crc kubenswrapper[4878]: E1202 18:16:25.938370 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.009565 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.009699 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.009739 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.009763 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.009779 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:26Z","lastTransitionTime":"2025-12-02T18:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.113010 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.113065 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.113077 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.113097 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.113115 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:26Z","lastTransitionTime":"2025-12-02T18:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.215688 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.215742 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.215760 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.215783 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.215804 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:26Z","lastTransitionTime":"2025-12-02T18:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.318903 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.318973 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.318989 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.319016 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.319034 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:26Z","lastTransitionTime":"2025-12-02T18:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.422279 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.422328 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.422346 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.422369 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.422385 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:26Z","lastTransitionTime":"2025-12-02T18:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.526074 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.526282 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.526315 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.526348 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.526368 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:26Z","lastTransitionTime":"2025-12-02T18:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.629815 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.629878 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.629893 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.629915 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.629931 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:26Z","lastTransitionTime":"2025-12-02T18:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.732995 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.733454 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.733593 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.733728 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.733856 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:26Z","lastTransitionTime":"2025-12-02T18:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.836586 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.836683 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.836708 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.836738 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.836757 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:26Z","lastTransitionTime":"2025-12-02T18:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.937561 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:26 crc kubenswrapper[4878]: E1202 18:16:26.937805 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.939628 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.939676 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.939692 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.939712 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:26 crc kubenswrapper[4878]: I1202 18:16:26.939726 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:26Z","lastTransitionTime":"2025-12-02T18:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.042317 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.042694 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.042779 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.042861 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.042945 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:27Z","lastTransitionTime":"2025-12-02T18:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.146090 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.146156 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.146173 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.146198 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.146215 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:27Z","lastTransitionTime":"2025-12-02T18:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.249437 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.249562 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.249581 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.249606 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.249623 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:27Z","lastTransitionTime":"2025-12-02T18:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.352599 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.352663 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.352684 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.352715 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.352740 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:27Z","lastTransitionTime":"2025-12-02T18:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.456421 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.456774 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.456851 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.456988 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.457056 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:27Z","lastTransitionTime":"2025-12-02T18:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.559502 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.559889 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.559976 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.560054 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.560149 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:27Z","lastTransitionTime":"2025-12-02T18:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.663502 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.663577 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.663591 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.663615 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.663624 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:27Z","lastTransitionTime":"2025-12-02T18:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.766134 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.766202 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.766218 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.766266 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.766285 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:27Z","lastTransitionTime":"2025-12-02T18:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.869848 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.869914 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.869929 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.869953 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.869969 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:27Z","lastTransitionTime":"2025-12-02T18:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.937106 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.937178 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.937482 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:27 crc kubenswrapper[4878]: E1202 18:16:27.938032 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:27 crc kubenswrapper[4878]: E1202 18:16:27.938072 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:27 crc kubenswrapper[4878]: E1202 18:16:27.938226 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.972799 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.973158 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.973409 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.973628 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:27 crc kubenswrapper[4878]: I1202 18:16:27.973823 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:27Z","lastTransitionTime":"2025-12-02T18:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.077041 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.077094 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.077104 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.077120 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.077132 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:28Z","lastTransitionTime":"2025-12-02T18:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.180337 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.180402 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.180414 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.180433 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.180446 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:28Z","lastTransitionTime":"2025-12-02T18:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.283512 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.283782 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.283820 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.283851 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.283872 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:28Z","lastTransitionTime":"2025-12-02T18:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.421002 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.421058 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.421073 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.421097 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.421113 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:28Z","lastTransitionTime":"2025-12-02T18:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.522936 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.522975 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.522985 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.523001 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.523012 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:28Z","lastTransitionTime":"2025-12-02T18:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.624872 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.624924 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.624935 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.624953 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.624965 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:28Z","lastTransitionTime":"2025-12-02T18:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.727644 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.727705 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.727723 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.727744 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.727766 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:28Z","lastTransitionTime":"2025-12-02T18:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.830829 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.830887 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.830905 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.830930 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.830948 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:28Z","lastTransitionTime":"2025-12-02T18:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.934442 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.934499 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.934515 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.934541 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.934561 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:28Z","lastTransitionTime":"2025-12-02T18:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:28 crc kubenswrapper[4878]: I1202 18:16:28.937841 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:28 crc kubenswrapper[4878]: E1202 18:16:28.938028 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.038729 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.038790 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.038808 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.038832 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.038851 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:29Z","lastTransitionTime":"2025-12-02T18:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.145578 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.145672 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.145692 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.145721 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.145746 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:29Z","lastTransitionTime":"2025-12-02T18:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.249387 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.249453 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.249472 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.249498 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.249519 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:29Z","lastTransitionTime":"2025-12-02T18:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.352326 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.352391 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.352426 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.352451 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.352469 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:29Z","lastTransitionTime":"2025-12-02T18:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.455542 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.455599 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.455619 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.455642 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.455659 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:29Z","lastTransitionTime":"2025-12-02T18:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.558773 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.558833 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.558850 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.558874 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.558891 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:29Z","lastTransitionTime":"2025-12-02T18:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.660895 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.660933 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.660948 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.660968 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.660983 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:29Z","lastTransitionTime":"2025-12-02T18:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.764450 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.764502 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.764514 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.764533 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.764546 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:29Z","lastTransitionTime":"2025-12-02T18:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.866973 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.867048 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.867068 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.867099 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.867119 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:29Z","lastTransitionTime":"2025-12-02T18:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.937047 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.937165 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.937063 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:29 crc kubenswrapper[4878]: E1202 18:16:29.937313 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:29 crc kubenswrapper[4878]: E1202 18:16:29.937395 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:29 crc kubenswrapper[4878]: E1202 18:16:29.937582 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.971092 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.971134 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.971150 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.971171 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:29 crc kubenswrapper[4878]: I1202 18:16:29.971190 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:29Z","lastTransitionTime":"2025-12-02T18:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.074789 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.074850 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.074870 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.074893 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.074910 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:30Z","lastTransitionTime":"2025-12-02T18:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.178136 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.178212 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.178270 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.178305 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.178328 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:30Z","lastTransitionTime":"2025-12-02T18:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.282005 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.282071 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.282090 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.282119 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.282137 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:30Z","lastTransitionTime":"2025-12-02T18:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.385684 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.385729 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.385747 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.385772 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.385790 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:30Z","lastTransitionTime":"2025-12-02T18:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.489499 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.489630 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.489694 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.489722 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.489840 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:30Z","lastTransitionTime":"2025-12-02T18:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.593132 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.593213 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.593275 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.593313 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.593337 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:30Z","lastTransitionTime":"2025-12-02T18:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.696737 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.696775 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.696787 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.696804 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.696815 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:30Z","lastTransitionTime":"2025-12-02T18:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.800215 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.800329 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.800355 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.800390 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.800414 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:30Z","lastTransitionTime":"2025-12-02T18:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.903186 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.903274 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.903298 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.903325 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.903347 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:30Z","lastTransitionTime":"2025-12-02T18:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.937158 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:30 crc kubenswrapper[4878]: E1202 18:16:30.937346 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:30 crc kubenswrapper[4878]: I1202 18:16:30.965930 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=47.965907654 podStartE2EDuration="47.965907654s" podCreationTimestamp="2025-12-02 18:15:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:16:30.965665286 +0000 UTC m=+100.655284177" watchObservedRunningTime="2025-12-02 18:16:30.965907654 +0000 UTC m=+100.655526575" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.006571 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.006599 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.006607 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.006619 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.006627 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:31Z","lastTransitionTime":"2025-12-02T18:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.007831 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-6ktxv" podStartSLOduration=77.007821191 podStartE2EDuration="1m17.007821191s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:16:30.994683094 +0000 UTC m=+100.684301985" watchObservedRunningTime="2025-12-02 18:16:31.007821191 +0000 UTC m=+100.697440072" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.008063 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=10.008059139 podStartE2EDuration="10.008059139s" podCreationTimestamp="2025-12-02 18:16:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:16:31.007371806 +0000 UTC m=+100.696990687" watchObservedRunningTime="2025-12-02 18:16:31.008059139 +0000 UTC m=+100.697678020" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.027927 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=77.027908815 podStartE2EDuration="1m17.027908815s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:16:31.02775083 +0000 UTC m=+100.717369721" watchObservedRunningTime="2025-12-02 18:16:31.027908815 +0000 UTC m=+100.717527696" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.057952 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" 
podStartSLOduration=77.057933057 podStartE2EDuration="1m17.057933057s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:16:31.057767292 +0000 UTC m=+100.747386183" watchObservedRunningTime="2025-12-02 18:16:31.057933057 +0000 UTC m=+100.747551928" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.092994 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fnpmk" podStartSLOduration=77.09296842 podStartE2EDuration="1m17.09296842s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:16:31.092953899 +0000 UTC m=+100.782572790" watchObservedRunningTime="2025-12-02 18:16:31.09296842 +0000 UTC m=+100.782587311" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.093476 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-p9jvp" podStartSLOduration=77.093469007 podStartE2EDuration="1m17.093469007s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:16:31.073071863 +0000 UTC m=+100.762690744" watchObservedRunningTime="2025-12-02 18:16:31.093469007 +0000 UTC m=+100.783087898" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.109250 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.109288 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.109298 4878 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.109314 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.109326 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:31Z","lastTransitionTime":"2025-12-02T18:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.123298 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6cm9t" podStartSLOduration=77.123282672 podStartE2EDuration="1m17.123282672s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:16:31.110227307 +0000 UTC m=+100.799846198" watchObservedRunningTime="2025-12-02 18:16:31.123282672 +0000 UTC m=+100.812901553" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.141571 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gndfz" podStartSLOduration=77.141547934 podStartE2EDuration="1m17.141547934s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:16:31.123966535 +0000 UTC m=+100.813585446" watchObservedRunningTime="2025-12-02 18:16:31.141547934 +0000 UTC m=+100.831166845" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.158604 4878 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=78.158580664 podStartE2EDuration="1m18.158580664s" podCreationTimestamp="2025-12-02 18:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:16:31.141282575 +0000 UTC m=+100.830901476" watchObservedRunningTime="2025-12-02 18:16:31.158580664 +0000 UTC m=+100.848199545" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.211839 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.211876 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.211885 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.211900 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.211912 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:31Z","lastTransitionTime":"2025-12-02T18:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.287798 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=78.287780062 podStartE2EDuration="1m18.287780062s" podCreationTimestamp="2025-12-02 18:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:16:31.287335597 +0000 UTC m=+100.976954498" watchObservedRunningTime="2025-12-02 18:16:31.287780062 +0000 UTC m=+100.977398943" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.315158 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.315201 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.315212 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.315230 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.315264 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:31Z","lastTransitionTime":"2025-12-02T18:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.418056 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.418111 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.418120 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.418136 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.418147 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:31Z","lastTransitionTime":"2025-12-02T18:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.520786 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.520868 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.520878 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.520900 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.520912 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:31Z","lastTransitionTime":"2025-12-02T18:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.623388 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.623466 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.623491 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.623519 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.623536 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:31Z","lastTransitionTime":"2025-12-02T18:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.726773 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.726844 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.726867 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.726893 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.726914 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:31Z","lastTransitionTime":"2025-12-02T18:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.747029 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.747087 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.747125 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.747154 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.747176 4878 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T18:16:31Z","lastTransitionTime":"2025-12-02T18:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.815666 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt"] Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.816261 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.819351 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.819764 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.819858 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.821499 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.859805 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/246ff645-dcf6-4f39-83f3-8b0a214070c1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-q8mlt\" (UID: \"246ff645-dcf6-4f39-83f3-8b0a214070c1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.859898 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/246ff645-dcf6-4f39-83f3-8b0a214070c1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-q8mlt\" (UID: \"246ff645-dcf6-4f39-83f3-8b0a214070c1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.859935 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/246ff645-dcf6-4f39-83f3-8b0a214070c1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-q8mlt\" (UID: \"246ff645-dcf6-4f39-83f3-8b0a214070c1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.860088 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/246ff645-dcf6-4f39-83f3-8b0a214070c1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-q8mlt\" (UID: \"246ff645-dcf6-4f39-83f3-8b0a214070c1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.860176 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/246ff645-dcf6-4f39-83f3-8b0a214070c1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-q8mlt\" (UID: \"246ff645-dcf6-4f39-83f3-8b0a214070c1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.936849 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.936881 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:31 crc kubenswrapper[4878]: E1202 18:16:31.936976 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.936849 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:31 crc kubenswrapper[4878]: E1202 18:16:31.937062 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:31 crc kubenswrapper[4878]: E1202 18:16:31.937202 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.962197 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/246ff645-dcf6-4f39-83f3-8b0a214070c1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-q8mlt\" (UID: \"246ff645-dcf6-4f39-83f3-8b0a214070c1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.962320 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/246ff645-dcf6-4f39-83f3-8b0a214070c1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-q8mlt\" (UID: \"246ff645-dcf6-4f39-83f3-8b0a214070c1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.962375 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/246ff645-dcf6-4f39-83f3-8b0a214070c1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-q8mlt\" (UID: \"246ff645-dcf6-4f39-83f3-8b0a214070c1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.962393 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/246ff645-dcf6-4f39-83f3-8b0a214070c1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-q8mlt\" (UID: \"246ff645-dcf6-4f39-83f3-8b0a214070c1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.962432 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/246ff645-dcf6-4f39-83f3-8b0a214070c1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-q8mlt\" (UID: \"246ff645-dcf6-4f39-83f3-8b0a214070c1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.962476 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/246ff645-dcf6-4f39-83f3-8b0a214070c1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-q8mlt\" (UID: \"246ff645-dcf6-4f39-83f3-8b0a214070c1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.962620 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/246ff645-dcf6-4f39-83f3-8b0a214070c1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-q8mlt\" (UID: \"246ff645-dcf6-4f39-83f3-8b0a214070c1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.963741 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/246ff645-dcf6-4f39-83f3-8b0a214070c1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-q8mlt\" (UID: \"246ff645-dcf6-4f39-83f3-8b0a214070c1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt" Dec 02 18:16:31 crc kubenswrapper[4878]: I1202 18:16:31.973489 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/246ff645-dcf6-4f39-83f3-8b0a214070c1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-q8mlt\" (UID: \"246ff645-dcf6-4f39-83f3-8b0a214070c1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt" Dec 02 18:16:31 crc 
kubenswrapper[4878]: I1202 18:16:31.989408 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/246ff645-dcf6-4f39-83f3-8b0a214070c1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-q8mlt\" (UID: \"246ff645-dcf6-4f39-83f3-8b0a214070c1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt" Dec 02 18:16:32 crc kubenswrapper[4878]: I1202 18:16:32.138836 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt" Dec 02 18:16:32 crc kubenswrapper[4878]: W1202 18:16:32.161519 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod246ff645_dcf6_4f39_83f3_8b0a214070c1.slice/crio-cd0d59bd401c5817b5b30368dae53f52db836d130f052724b876bf59b7b4f3ca WatchSource:0}: Error finding container cd0d59bd401c5817b5b30368dae53f52db836d130f052724b876bf59b7b4f3ca: Status 404 returned error can't find the container with id cd0d59bd401c5817b5b30368dae53f52db836d130f052724b876bf59b7b4f3ca Dec 02 18:16:32 crc kubenswrapper[4878]: I1202 18:16:32.529062 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt" event={"ID":"246ff645-dcf6-4f39-83f3-8b0a214070c1","Type":"ContainerStarted","Data":"b9e5078692e96dd5d1ff1dbcea70ba8b3730f9ae69e5eb402f3b20085a619207"} Dec 02 18:16:32 crc kubenswrapper[4878]: I1202 18:16:32.529164 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt" event={"ID":"246ff645-dcf6-4f39-83f3-8b0a214070c1","Type":"ContainerStarted","Data":"cd0d59bd401c5817b5b30368dae53f52db836d130f052724b876bf59b7b4f3ca"} Dec 02 18:16:32 crc kubenswrapper[4878]: I1202 18:16:32.938289 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:32 crc kubenswrapper[4878]: E1202 18:16:32.938496 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:33 crc kubenswrapper[4878]: I1202 18:16:33.379591 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs\") pod \"network-metrics-daemon-dlwt8\" (UID: \"09adc15b-14dd-4a05-b569-4168b9ced169\") " pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:33 crc kubenswrapper[4878]: E1202 18:16:33.379904 4878 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 18:16:33 crc kubenswrapper[4878]: E1202 18:16:33.380198 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs podName:09adc15b-14dd-4a05-b569-4168b9ced169 nodeName:}" failed. No retries permitted until 2025-12-02 18:17:37.380174337 +0000 UTC m=+167.069793218 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs") pod "network-metrics-daemon-dlwt8" (UID: "09adc15b-14dd-4a05-b569-4168b9ced169") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 18:16:33 crc kubenswrapper[4878]: I1202 18:16:33.937339 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:33 crc kubenswrapper[4878]: I1202 18:16:33.937416 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:33 crc kubenswrapper[4878]: I1202 18:16:33.937465 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:33 crc kubenswrapper[4878]: E1202 18:16:33.937668 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:33 crc kubenswrapper[4878]: E1202 18:16:33.937821 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:33 crc kubenswrapper[4878]: E1202 18:16:33.937972 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:34 crc kubenswrapper[4878]: I1202 18:16:34.937487 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:34 crc kubenswrapper[4878]: E1202 18:16:34.937757 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:35 crc kubenswrapper[4878]: I1202 18:16:35.937205 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:35 crc kubenswrapper[4878]: I1202 18:16:35.937391 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:35 crc kubenswrapper[4878]: E1202 18:16:35.937540 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:35 crc kubenswrapper[4878]: E1202 18:16:35.937685 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:35 crc kubenswrapper[4878]: I1202 18:16:35.937229 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:35 crc kubenswrapper[4878]: E1202 18:16:35.938595 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:36 crc kubenswrapper[4878]: I1202 18:16:36.937261 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:36 crc kubenswrapper[4878]: E1202 18:16:36.937443 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:36 crc kubenswrapper[4878]: I1202 18:16:36.938708 4878 scope.go:117] "RemoveContainer" containerID="91c9c1ddbfb74122a2685e40a0839a5f223260393bdb0c49fed0d027d54c7d01" Dec 02 18:16:36 crc kubenswrapper[4878]: E1202 18:16:36.938999 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-x5jzn_openshift-ovn-kubernetes(d160cfa4-9e2a-429d-b760-0cac6d467b9a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" Dec 02 18:16:38 crc kubenswrapper[4878]: I1202 18:16:38.880012 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:38 crc kubenswrapper[4878]: E1202 18:16:38.880118 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:38 crc kubenswrapper[4878]: I1202 18:16:38.880353 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:38 crc kubenswrapper[4878]: E1202 18:16:38.880402 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:38 crc kubenswrapper[4878]: I1202 18:16:38.880542 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:38 crc kubenswrapper[4878]: E1202 18:16:38.880604 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:38 crc kubenswrapper[4878]: I1202 18:16:38.881010 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:38 crc kubenswrapper[4878]: E1202 18:16:38.881076 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:40 crc kubenswrapper[4878]: I1202 18:16:40.936877 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:40 crc kubenswrapper[4878]: I1202 18:16:40.936886 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:40 crc kubenswrapper[4878]: I1202 18:16:40.937007 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:40 crc kubenswrapper[4878]: E1202 18:16:40.939222 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:40 crc kubenswrapper[4878]: I1202 18:16:40.939334 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:40 crc kubenswrapper[4878]: E1202 18:16:40.939474 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:40 crc kubenswrapper[4878]: E1202 18:16:40.939725 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:40 crc kubenswrapper[4878]: E1202 18:16:40.939854 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:42 crc kubenswrapper[4878]: I1202 18:16:42.937194 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:42 crc kubenswrapper[4878]: I1202 18:16:42.937354 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:42 crc kubenswrapper[4878]: E1202 18:16:42.937423 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:42 crc kubenswrapper[4878]: I1202 18:16:42.937457 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:42 crc kubenswrapper[4878]: E1202 18:16:42.937660 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:42 crc kubenswrapper[4878]: I1202 18:16:42.937744 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:42 crc kubenswrapper[4878]: E1202 18:16:42.937862 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:42 crc kubenswrapper[4878]: E1202 18:16:42.938004 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:44 crc kubenswrapper[4878]: I1202 18:16:44.937320 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:44 crc kubenswrapper[4878]: I1202 18:16:44.937585 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:44 crc kubenswrapper[4878]: I1202 18:16:44.937594 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:44 crc kubenswrapper[4878]: I1202 18:16:44.937774 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:44 crc kubenswrapper[4878]: E1202 18:16:44.938130 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:44 crc kubenswrapper[4878]: E1202 18:16:44.938354 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:44 crc kubenswrapper[4878]: E1202 18:16:44.938593 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:44 crc kubenswrapper[4878]: E1202 18:16:44.938745 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:46 crc kubenswrapper[4878]: I1202 18:16:46.937216 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:46 crc kubenswrapper[4878]: E1202 18:16:46.937398 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:46 crc kubenswrapper[4878]: I1202 18:16:46.937420 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:46 crc kubenswrapper[4878]: I1202 18:16:46.937594 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:46 crc kubenswrapper[4878]: E1202 18:16:46.937733 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:46 crc kubenswrapper[4878]: I1202 18:16:46.937756 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:46 crc kubenswrapper[4878]: E1202 18:16:46.937841 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:46 crc kubenswrapper[4878]: E1202 18:16:46.938147 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:48 crc kubenswrapper[4878]: I1202 18:16:48.937327 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:48 crc kubenswrapper[4878]: E1202 18:16:48.937519 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:48 crc kubenswrapper[4878]: I1202 18:16:48.937627 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:48 crc kubenswrapper[4878]: I1202 18:16:48.937708 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:48 crc kubenswrapper[4878]: I1202 18:16:48.937751 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:48 crc kubenswrapper[4878]: E1202 18:16:48.937829 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:48 crc kubenswrapper[4878]: E1202 18:16:48.938271 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:48 crc kubenswrapper[4878]: E1202 18:16:48.938360 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:48 crc kubenswrapper[4878]: I1202 18:16:48.938726 4878 scope.go:117] "RemoveContainer" containerID="91c9c1ddbfb74122a2685e40a0839a5f223260393bdb0c49fed0d027d54c7d01" Dec 02 18:16:48 crc kubenswrapper[4878]: E1202 18:16:48.938935 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-x5jzn_openshift-ovn-kubernetes(d160cfa4-9e2a-429d-b760-0cac6d467b9a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" Dec 02 18:16:50 crc kubenswrapper[4878]: E1202 18:16:50.908593 4878 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 02 18:16:50 crc kubenswrapper[4878]: I1202 18:16:50.929631 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6cm9t_e79a8cec-20ba-4862-ba25-7de014466668/kube-multus/1.log" Dec 02 18:16:50 crc kubenswrapper[4878]: I1202 18:16:50.930193 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6cm9t_e79a8cec-20ba-4862-ba25-7de014466668/kube-multus/0.log" Dec 02 18:16:50 crc kubenswrapper[4878]: I1202 18:16:50.930254 4878 generic.go:334] "Generic (PLEG): container finished" podID="e79a8cec-20ba-4862-ba25-7de014466668" 
containerID="e529e02dd0b8f5c8da36f38bbb8a040fb48a28444da477652706b9d07793878c" exitCode=1 Dec 02 18:16:50 crc kubenswrapper[4878]: I1202 18:16:50.930291 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6cm9t" event={"ID":"e79a8cec-20ba-4862-ba25-7de014466668","Type":"ContainerDied","Data":"e529e02dd0b8f5c8da36f38bbb8a040fb48a28444da477652706b9d07793878c"} Dec 02 18:16:50 crc kubenswrapper[4878]: I1202 18:16:50.930333 4878 scope.go:117] "RemoveContainer" containerID="e536e31ac4dae183fb584058ca0d5b3503caa9d5eb684a7c7cd6408848554de7" Dec 02 18:16:50 crc kubenswrapper[4878]: I1202 18:16:50.930817 4878 scope.go:117] "RemoveContainer" containerID="e529e02dd0b8f5c8da36f38bbb8a040fb48a28444da477652706b9d07793878c" Dec 02 18:16:50 crc kubenswrapper[4878]: E1202 18:16:50.930989 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-6cm9t_openshift-multus(e79a8cec-20ba-4862-ba25-7de014466668)\"" pod="openshift-multus/multus-6cm9t" podUID="e79a8cec-20ba-4862-ba25-7de014466668" Dec 02 18:16:50 crc kubenswrapper[4878]: I1202 18:16:50.938871 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:50 crc kubenswrapper[4878]: I1202 18:16:50.938966 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:50 crc kubenswrapper[4878]: E1202 18:16:50.938992 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:50 crc kubenswrapper[4878]: I1202 18:16:50.939102 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:50 crc kubenswrapper[4878]: E1202 18:16:50.939197 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:50 crc kubenswrapper[4878]: E1202 18:16:50.939379 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:50 crc kubenswrapper[4878]: I1202 18:16:50.939826 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:50 crc kubenswrapper[4878]: E1202 18:16:50.940366 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:50 crc kubenswrapper[4878]: I1202 18:16:50.962641 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q8mlt" podStartSLOduration=96.962620304 podStartE2EDuration="1m36.962620304s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:16:32.554502737 +0000 UTC m=+102.244121658" watchObservedRunningTime="2025-12-02 18:16:50.962620304 +0000 UTC m=+120.652239195" Dec 02 18:16:51 crc kubenswrapper[4878]: I1202 18:16:51.936359 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6cm9t_e79a8cec-20ba-4862-ba25-7de014466668/kube-multus/1.log" Dec 02 18:16:51 crc kubenswrapper[4878]: E1202 18:16:51.983097 4878 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 18:16:52 crc kubenswrapper[4878]: I1202 18:16:52.937215 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:52 crc kubenswrapper[4878]: I1202 18:16:52.937266 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:52 crc kubenswrapper[4878]: I1202 18:16:52.937336 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:52 crc kubenswrapper[4878]: E1202 18:16:52.937565 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:52 crc kubenswrapper[4878]: E1202 18:16:52.938430 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:52 crc kubenswrapper[4878]: I1202 18:16:52.938636 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:52 crc kubenswrapper[4878]: E1202 18:16:52.938677 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:52 crc kubenswrapper[4878]: E1202 18:16:52.938901 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:54 crc kubenswrapper[4878]: I1202 18:16:54.937435 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:54 crc kubenswrapper[4878]: I1202 18:16:54.937518 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:54 crc kubenswrapper[4878]: E1202 18:16:54.937619 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:54 crc kubenswrapper[4878]: I1202 18:16:54.937698 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:54 crc kubenswrapper[4878]: I1202 18:16:54.937891 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:54 crc kubenswrapper[4878]: E1202 18:16:54.937935 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:54 crc kubenswrapper[4878]: E1202 18:16:54.937995 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:54 crc kubenswrapper[4878]: E1202 18:16:54.938108 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:56 crc kubenswrapper[4878]: I1202 18:16:56.937713 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:56 crc kubenswrapper[4878]: I1202 18:16:56.937797 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:56 crc kubenswrapper[4878]: I1202 18:16:56.937730 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:56 crc kubenswrapper[4878]: I1202 18:16:56.937894 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:56 crc kubenswrapper[4878]: E1202 18:16:56.937953 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:16:56 crc kubenswrapper[4878]: E1202 18:16:56.938095 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:56 crc kubenswrapper[4878]: E1202 18:16:56.938178 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:56 crc kubenswrapper[4878]: E1202 18:16:56.938217 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:56 crc kubenswrapper[4878]: E1202 18:16:56.984791 4878 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 18:16:58 crc kubenswrapper[4878]: I1202 18:16:58.937694 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:16:58 crc kubenswrapper[4878]: I1202 18:16:58.937738 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:16:58 crc kubenswrapper[4878]: I1202 18:16:58.937808 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:16:58 crc kubenswrapper[4878]: E1202 18:16:58.937932 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:16:58 crc kubenswrapper[4878]: I1202 18:16:58.937947 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:16:58 crc kubenswrapper[4878]: E1202 18:16:58.938091 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:16:58 crc kubenswrapper[4878]: E1202 18:16:58.938134 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:16:58 crc kubenswrapper[4878]: E1202 18:16:58.938342 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:17:00 crc kubenswrapper[4878]: I1202 18:17:00.937131 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:17:00 crc kubenswrapper[4878]: I1202 18:17:00.937262 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:17:00 crc kubenswrapper[4878]: E1202 18:17:00.939120 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:17:00 crc kubenswrapper[4878]: I1202 18:17:00.939160 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:17:00 crc kubenswrapper[4878]: I1202 18:17:00.939138 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:17:00 crc kubenswrapper[4878]: E1202 18:17:00.939299 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:17:00 crc kubenswrapper[4878]: E1202 18:17:00.939393 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:17:00 crc kubenswrapper[4878]: E1202 18:17:00.939502 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:17:01 crc kubenswrapper[4878]: E1202 18:17:01.987503 4878 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 18:17:02 crc kubenswrapper[4878]: I1202 18:17:02.937710 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:17:02 crc kubenswrapper[4878]: I1202 18:17:02.937834 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:17:02 crc kubenswrapper[4878]: E1202 18:17:02.937997 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:17:02 crc kubenswrapper[4878]: I1202 18:17:02.938035 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:17:02 crc kubenswrapper[4878]: I1202 18:17:02.938075 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:17:02 crc kubenswrapper[4878]: E1202 18:17:02.938104 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:17:02 crc kubenswrapper[4878]: E1202 18:17:02.938262 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:17:02 crc kubenswrapper[4878]: E1202 18:17:02.938809 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:17:02 crc kubenswrapper[4878]: I1202 18:17:02.939733 4878 scope.go:117] "RemoveContainer" containerID="91c9c1ddbfb74122a2685e40a0839a5f223260393bdb0c49fed0d027d54c7d01" Dec 02 18:17:03 crc kubenswrapper[4878]: I1202 18:17:03.864506 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dlwt8"] Dec 02 18:17:03 crc kubenswrapper[4878]: I1202 18:17:03.865067 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:17:03 crc kubenswrapper[4878]: E1202 18:17:03.865266 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:17:03 crc kubenswrapper[4878]: I1202 18:17:03.986772 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovnkube-controller/3.log" Dec 02 18:17:03 crc kubenswrapper[4878]: I1202 18:17:03.989590 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerStarted","Data":"551a445d977d16962b6d88d0bcf99fbf68903b4492ae7e4f5bc0ca2e1b27fb7c"} Dec 02 18:17:03 crc kubenswrapper[4878]: I1202 18:17:03.989982 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:17:04 crc kubenswrapper[4878]: I1202 18:17:04.017757 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" podStartSLOduration=110.017730547 podStartE2EDuration="1m50.017730547s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:04.014461855 +0000 UTC m=+133.704080766" watchObservedRunningTime="2025-12-02 18:17:04.017730547 +0000 UTC m=+133.707349458" Dec 02 18:17:04 crc kubenswrapper[4878]: I1202 18:17:04.937399 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:17:04 crc kubenswrapper[4878]: I1202 18:17:04.937446 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:17:04 crc kubenswrapper[4878]: E1202 18:17:04.937509 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:17:04 crc kubenswrapper[4878]: E1202 18:17:04.937565 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:17:04 crc kubenswrapper[4878]: I1202 18:17:04.937606 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:17:04 crc kubenswrapper[4878]: E1202 18:17:04.937745 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:17:05 crc kubenswrapper[4878]: I1202 18:17:05.937144 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:17:05 crc kubenswrapper[4878]: E1202 18:17:05.937460 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:17:05 crc kubenswrapper[4878]: I1202 18:17:05.937758 4878 scope.go:117] "RemoveContainer" containerID="e529e02dd0b8f5c8da36f38bbb8a040fb48a28444da477652706b9d07793878c" Dec 02 18:17:06 crc kubenswrapper[4878]: I1202 18:17:06.937786 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:17:06 crc kubenswrapper[4878]: I1202 18:17:06.937836 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:17:06 crc kubenswrapper[4878]: E1202 18:17:06.938578 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:17:06 crc kubenswrapper[4878]: I1202 18:17:06.937958 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:17:06 crc kubenswrapper[4878]: E1202 18:17:06.938737 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:17:06 crc kubenswrapper[4878]: E1202 18:17:06.938897 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:17:06 crc kubenswrapper[4878]: E1202 18:17:06.989293 4878 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 18:17:07 crc kubenswrapper[4878]: I1202 18:17:07.003432 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6cm9t_e79a8cec-20ba-4862-ba25-7de014466668/kube-multus/1.log" Dec 02 18:17:07 crc kubenswrapper[4878]: I1202 18:17:07.003524 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6cm9t" event={"ID":"e79a8cec-20ba-4862-ba25-7de014466668","Type":"ContainerStarted","Data":"8d8e064c8177248bf254025158f61f6dfa81e6a00b21ef6624c736c7a6a8fdaf"} Dec 02 18:17:07 crc kubenswrapper[4878]: I1202 18:17:07.937702 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:17:07 crc kubenswrapper[4878]: E1202 18:17:07.937835 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:17:08 crc kubenswrapper[4878]: I1202 18:17:08.937320 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:17:08 crc kubenswrapper[4878]: I1202 18:17:08.937459 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:17:08 crc kubenswrapper[4878]: E1202 18:17:08.937492 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:17:08 crc kubenswrapper[4878]: I1202 18:17:08.937605 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:17:08 crc kubenswrapper[4878]: E1202 18:17:08.937663 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:17:08 crc kubenswrapper[4878]: E1202 18:17:08.937831 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:17:09 crc kubenswrapper[4878]: I1202 18:17:09.937755 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:17:09 crc kubenswrapper[4878]: E1202 18:17:09.938003 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:17:10 crc kubenswrapper[4878]: I1202 18:17:10.937168 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:17:10 crc kubenswrapper[4878]: I1202 18:17:10.937304 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:17:10 crc kubenswrapper[4878]: I1202 18:17:10.937304 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:17:10 crc kubenswrapper[4878]: E1202 18:17:10.938414 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 18:17:10 crc kubenswrapper[4878]: E1202 18:17:10.938564 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 18:17:10 crc kubenswrapper[4878]: E1202 18:17:10.938749 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 18:17:11 crc kubenswrapper[4878]: I1202 18:17:11.937754 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:17:11 crc kubenswrapper[4878]: E1202 18:17:11.937959 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlwt8" podUID="09adc15b-14dd-4a05-b569-4168b9ced169" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.675719 4878 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.724261 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wdjfj"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.725135 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wdjfj" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.725337 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-w6mf2"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.726705 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zqt5l"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.726851 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.727904 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.728418 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.729040 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.729283 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zqt5l" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.729867 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.730176 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t7x7s"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.730394 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.730525 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ffktl"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.732326 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.732572 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.732659 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.733138 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t7x7s" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.747512 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.747768 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.750765 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-w52jf"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.767835 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.772689 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ngj62"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.773085 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g4km7"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.773313 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.773423 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g4km7" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.773933 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w52jf" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.774200 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.774458 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.774756 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.774885 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ngj62" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.775523 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.775568 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.775700 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.775745 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.775902 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.775977 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bw4k8"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.776049 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.776146 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.776626 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.776766 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bw4k8" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.776884 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.776915 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.776942 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.777769 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.778107 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.784566 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wvrhn"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.785141 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.785719 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.790192 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-zbpll"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.790981 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-zbpll" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.791488 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.791995 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.795606 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wkwj9"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.796370 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fvq9v"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.796799 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rmqxk"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.797324 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rmqxk" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.797818 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wkwj9" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.798003 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.802062 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.805519 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.802262 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.802300 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.805978 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.806196 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.806348 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.802543 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.806590 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.802578 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.802659 4878 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.802671 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.802695 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.802709 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.802716 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.802724 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.802766 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.802782 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.802836 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.802939 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.803040 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 02 18:17:12 crc 
kubenswrapper[4878]: I1202 18:17:12.803935 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.803977 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.804375 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.804758 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.804967 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.805002 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.805044 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.805079 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.805115 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.805149 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.805187 4878 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-root-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.805219 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.805272 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.805320 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.805356 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.805350 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.805399 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.805440 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.811668 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.811881 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.812119 4878 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.814311 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.814620 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.814720 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.815131 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.815143 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.815380 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.815494 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.815738 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.815858 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.815929 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 02 
18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.815936 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.816082 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.816106 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.816270 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.815859 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.816536 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.817301 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.817374 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.819335 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6hxvm"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.839985 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-s5t58"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.856050 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-s5t58" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.856640 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.860408 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5vnm"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876028 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdf471eb-c41c-40c0-b038-5ba883d2154a-serving-cert\") pod \"route-controller-manager-6576b87f9c-l8z4g\" (UID: \"bdf471eb-c41c-40c0-b038-5ba883d2154a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876075 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c1480df-aed9-4266-802c-d217699bd9ad-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-g4km7\" (UID: \"7c1480df-aed9-4266-802c-d217699bd9ad\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g4km7" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876103 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6951ee33-1345-4ae2-906d-fdc7cea4dc64-config\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876130 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/7c1480df-aed9-4266-802c-d217699bd9ad-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-g4km7\" (UID: \"7c1480df-aed9-4266-802c-d217699bd9ad\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g4km7" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876155 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876177 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876204 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkzsx\" (UniqueName: \"kubernetes.io/projected/0e8bdc92-b1ec-4300-8895-fc8e804455da-kube-api-access-zkzsx\") pod \"openshift-config-operator-7777fb866f-zqt5l\" (UID: \"0e8bdc92-b1ec-4300-8895-fc8e804455da\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zqt5l" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876225 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-trusted-ca-bundle\") pod \"console-f9d7485db-w6mf2\" 
(UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876270 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-oauth-serving-cert\") pod \"console-f9d7485db-w6mf2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876292 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08b0e33c-5604-46a5-b0fe-139feb379df8-serving-cert\") pod \"console-operator-58897d9998-wdjfj\" (UID: \"08b0e33c-5604-46a5-b0fe-139feb379df8\") " pod="openshift-console-operator/console-operator-58897d9998-wdjfj" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876315 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwnv9\" (UniqueName: \"kubernetes.io/projected/08b0e33c-5604-46a5-b0fe-139feb379df8-kube-api-access-nwnv9\") pod \"console-operator-58897d9998-wdjfj\" (UID: \"08b0e33c-5604-46a5-b0fe-139feb379df8\") " pod="openshift-console-operator/console-operator-58897d9998-wdjfj" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876338 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6951ee33-1345-4ae2-906d-fdc7cea4dc64-audit-dir\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876691 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/c699a1f9-6c07-4c4d-8605-3988308d6914-machine-approver-tls\") pod \"machine-approver-56656f9798-w52jf\" (UID: \"c699a1f9-6c07-4c4d-8605-3988308d6914\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w52jf" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876754 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2249a89f-5c41-4f17-a308-fe106e91ece9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-bw4k8\" (UID: \"2249a89f-5c41-4f17-a308-fe106e91ece9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bw4k8" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876808 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2249a89f-5c41-4f17-a308-fe106e91ece9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bw4k8\" (UID: \"2249a89f-5c41-4f17-a308-fe106e91ece9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bw4k8" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876853 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08b0e33c-5604-46a5-b0fe-139feb379df8-trusted-ca\") pod \"console-operator-58897d9998-wdjfj\" (UID: \"08b0e33c-5604-46a5-b0fe-139feb379df8\") " pod="openshift-console-operator/console-operator-58897d9998-wdjfj" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876880 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54w2d\" (UniqueName: \"kubernetes.io/projected/c699a1f9-6c07-4c4d-8605-3988308d6914-kube-api-access-54w2d\") pod \"machine-approver-56656f9798-w52jf\" (UID: 
\"c699a1f9-6c07-4c4d-8605-3988308d6914\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w52jf" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876904 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-console-serving-cert\") pod \"console-f9d7485db-w6mf2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876927 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j9m9\" (UniqueName: \"kubernetes.io/projected/3a9d4a2c-13b3-458c-92e8-058e9f1206dd-kube-api-access-9j9m9\") pod \"cluster-samples-operator-665b6dd947-t7x7s\" (UID: \"3a9d4a2c-13b3-458c-92e8-058e9f1206dd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t7x7s" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876952 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdf471eb-c41c-40c0-b038-5ba883d2154a-config\") pod \"route-controller-manager-6576b87f9c-l8z4g\" (UID: \"bdf471eb-c41c-40c0-b038-5ba883d2154a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.876988 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877015 4878 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ea820fcf-7d34-4381-bafa-cbc53d3f7c86-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ngj62\" (UID: \"ea820fcf-7d34-4381-bafa-cbc53d3f7c86\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ngj62" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877038 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877068 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg4fw\" (UniqueName: \"kubernetes.io/projected/bdf471eb-c41c-40c0-b038-5ba883d2154a-kube-api-access-sg4fw\") pod \"route-controller-manager-6576b87f9c-l8z4g\" (UID: \"bdf471eb-c41c-40c0-b038-5ba883d2154a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877092 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c699a1f9-6c07-4c4d-8605-3988308d6914-auth-proxy-config\") pod \"machine-approver-56656f9798-w52jf\" (UID: \"c699a1f9-6c07-4c4d-8605-3988308d6914\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w52jf" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877132 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877157 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877181 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-audit-dir\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877206 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-console-oauth-config\") pod \"console-f9d7485db-w6mf2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877229 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp7tg\" (UniqueName: \"kubernetes.io/projected/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-kube-api-access-fp7tg\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" 
Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877270 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6951ee33-1345-4ae2-906d-fdc7cea4dc64-node-pullsecrets\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877293 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6951ee33-1345-4ae2-906d-fdc7cea4dc64-audit\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877316 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0e8bdc92-b1ec-4300-8895-fc8e804455da-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zqt5l\" (UID: \"0e8bdc92-b1ec-4300-8895-fc8e804455da\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zqt5l" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877344 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bdf471eb-c41c-40c0-b038-5ba883d2154a-client-ca\") pod \"route-controller-manager-6576b87f9c-l8z4g\" (UID: \"bdf471eb-c41c-40c0-b038-5ba883d2154a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877364 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ea820fcf-7d34-4381-bafa-cbc53d3f7c86-images\") pod 
\"machine-api-operator-5694c8668f-ngj62\" (UID: \"ea820fcf-7d34-4381-bafa-cbc53d3f7c86\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ngj62" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877395 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877417 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6951ee33-1345-4ae2-906d-fdc7cea4dc64-encryption-config\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877443 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e8bdc92-b1ec-4300-8895-fc8e804455da-serving-cert\") pod \"openshift-config-operator-7777fb866f-zqt5l\" (UID: \"0e8bdc92-b1ec-4300-8895-fc8e804455da\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zqt5l" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877468 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c699a1f9-6c07-4c4d-8605-3988308d6914-config\") pod \"machine-approver-56656f9798-w52jf\" (UID: \"c699a1f9-6c07-4c4d-8605-3988308d6914\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w52jf" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877549 4878 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea820fcf-7d34-4381-bafa-cbc53d3f7c86-config\") pod \"machine-api-operator-5694c8668f-ngj62\" (UID: \"ea820fcf-7d34-4381-bafa-cbc53d3f7c86\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ngj62" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877585 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-audit-policies\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877627 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6951ee33-1345-4ae2-906d-fdc7cea4dc64-serving-cert\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877657 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mzj5\" (UniqueName: \"kubernetes.io/projected/2249a89f-5c41-4f17-a308-fe106e91ece9-kube-api-access-8mzj5\") pod \"cluster-image-registry-operator-dc59b4c8b-bw4k8\" (UID: \"2249a89f-5c41-4f17-a308-fe106e91ece9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bw4k8" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877703 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877745 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-447rt\" (UniqueName: \"kubernetes.io/projected/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-kube-api-access-447rt\") pod \"console-f9d7485db-w6mf2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877782 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-897k9\" (UniqueName: \"kubernetes.io/projected/7c1480df-aed9-4266-802c-d217699bd9ad-kube-api-access-897k9\") pod \"openshift-controller-manager-operator-756b6f6bc6-g4km7\" (UID: \"7c1480df-aed9-4266-802c-d217699bd9ad\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g4km7" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877825 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877871 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-service-ca\") pod \"console-f9d7485db-w6mf2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 
18:17:12.877961 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6951ee33-1345-4ae2-906d-fdc7cea4dc64-etcd-serving-ca\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.877995 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6951ee33-1345-4ae2-906d-fdc7cea4dc64-image-import-ca\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.878026 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6951ee33-1345-4ae2-906d-fdc7cea4dc64-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.878218 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.878262 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsph4\" (UniqueName: \"kubernetes.io/projected/6951ee33-1345-4ae2-906d-fdc7cea4dc64-kube-api-access-fsph4\") pod \"apiserver-76f77b778f-ffktl\" (UID: 
\"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.878288 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bszbt\" (UniqueName: \"kubernetes.io/projected/44e99cf7-e103-451a-91cc-0d610e4190a9-kube-api-access-bszbt\") pod \"downloads-7954f5f757-zbpll\" (UID: \"44e99cf7-e103-451a-91cc-0d610e4190a9\") " pod="openshift-console/downloads-7954f5f757-zbpll" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.878310 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6951ee33-1345-4ae2-906d-fdc7cea4dc64-etcd-client\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.878339 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.878369 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4zlk\" (UniqueName: \"kubernetes.io/projected/ea820fcf-7d34-4381-bafa-cbc53d3f7c86-kube-api-access-n4zlk\") pod \"machine-api-operator-5694c8668f-ngj62\" (UID: \"ea820fcf-7d34-4381-bafa-cbc53d3f7c86\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ngj62" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.878395 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-console-config\") pod \"console-f9d7485db-w6mf2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.878420 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08b0e33c-5604-46a5-b0fe-139feb379df8-config\") pod \"console-operator-58897d9998-wdjfj\" (UID: \"08b0e33c-5604-46a5-b0fe-139feb379df8\") " pod="openshift-console-operator/console-operator-58897d9998-wdjfj" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.878442 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a9d4a2c-13b3-458c-92e8-058e9f1206dd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-t7x7s\" (UID: \"3a9d4a2c-13b3-458c-92e8-058e9f1206dd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t7x7s" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.878466 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2249a89f-5c41-4f17-a308-fe106e91ece9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bw4k8\" (UID: \"2249a89f-5c41-4f17-a308-fe106e91ece9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bw4k8" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.879851 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.881464 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 02 18:17:12 crc 
kubenswrapper[4878]: I1202 18:17:12.881552 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.884772 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.885226 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.885305 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.886661 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.887049 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.887183 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.887706 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.887845 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.891607 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.897412 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg"] Dec 
02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.897615 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.898087 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tjwcf"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.898416 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwtvf"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.898700 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fbkfc"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.898984 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m6f2h"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.899097 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.899255 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.899330 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5vnm" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.899440 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m6f2h" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.899341 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wpxp8"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.900304 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.900486 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwtvf" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.900371 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tjwcf" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.901160 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6fbw"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.901725 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5p678"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.902026 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6c7h4"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.902313 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvddh"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.903895 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6fbw" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.910345 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5p678" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.911263 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpxp8" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.912519 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6c7h4" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.912745 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9vv2f"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.917674 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dtnr7"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.912898 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.919243 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dtnr7" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.920537 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.919287 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9vv2f" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.912933 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvddh" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.926770 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-smztw"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.927081 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.928834 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.929182 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-smztw" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.929599 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.930123 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.931627 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.932257 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-28qqz"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.933297 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.933648 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.934337 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.934798 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wkwj9"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.936255 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-th582"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.936872 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-th582" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.937940 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.938182 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.938407 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.944341 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bv27h"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.945250 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.945457 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bv27h" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.946075 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7hvht"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.946142 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.946892 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wdjfj"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.947135 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fbkfc"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.947252 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t7x7s"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.946993 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7hvht" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.947799 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wvrhn"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.948276 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.950076 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ngj62"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.952102 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-w6mf2"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.953587 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bw4k8"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.953617 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwtvf"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.955196 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zqt5l"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.956048 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tjwcf"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.957784 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wpxp8"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.958282 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-s5t58"] Dec 02 18:17:12 crc 
kubenswrapper[4878]: I1202 18:17:12.959374 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvddh"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.961951 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g4km7"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.963191 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fvq9v"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.965116 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ffktl"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.966265 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.967401 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zbpll"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.967903 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.968407 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6hxvm"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.970373 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9vv2f"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.971409 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5vnm"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.973140 4878 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-9zqsp"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.974072 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9zqsp" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.974496 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rmqxk"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.975589 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-f59ml"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.976626 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6fbw"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.976671 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-f59ml" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.977891 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bv27h"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979009 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979083 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979121 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979146 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-console-oauth-config\") pod \"console-f9d7485db-w6mf2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979165 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-audit-dir\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979188 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bcb33a3-bcba-4acb-872e-e1676fdaa584-config\") pod \"authentication-operator-69f744f599-wkwj9\" (UID: \"4bcb33a3-bcba-4acb-872e-e1676fdaa584\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wkwj9" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979206 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6951ee33-1345-4ae2-906d-fdc7cea4dc64-node-pullsecrets\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979220 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6951ee33-1345-4ae2-906d-fdc7cea4dc64-audit\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979251 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp7tg\" (UniqueName: \"kubernetes.io/projected/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-kube-api-access-fp7tg\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979271 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bcb33a3-bcba-4acb-872e-e1676fdaa584-serving-cert\") pod \"authentication-operator-69f744f599-wkwj9\" (UID: \"4bcb33a3-bcba-4acb-872e-e1676fdaa584\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wkwj9" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979288 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bdf471eb-c41c-40c0-b038-5ba883d2154a-client-ca\") pod \"route-controller-manager-6576b87f9c-l8z4g\" (UID: \"bdf471eb-c41c-40c0-b038-5ba883d2154a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979305 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ea820fcf-7d34-4381-bafa-cbc53d3f7c86-images\") pod \"machine-api-operator-5694c8668f-ngj62\" (UID: \"ea820fcf-7d34-4381-bafa-cbc53d3f7c86\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ngj62" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979321 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0e8bdc92-b1ec-4300-8895-fc8e804455da-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zqt5l\" (UID: \"0e8bdc92-b1ec-4300-8895-fc8e804455da\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zqt5l" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979346 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: 
\"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979364 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6951ee33-1345-4ae2-906d-fdc7cea4dc64-encryption-config\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979384 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e8bdc92-b1ec-4300-8895-fc8e804455da-serving-cert\") pod \"openshift-config-operator-7777fb866f-zqt5l\" (UID: \"0e8bdc92-b1ec-4300-8895-fc8e804455da\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zqt5l" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979399 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea820fcf-7d34-4381-bafa-cbc53d3f7c86-config\") pod \"machine-api-operator-5694c8668f-ngj62\" (UID: \"ea820fcf-7d34-4381-bafa-cbc53d3f7c86\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ngj62" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979418 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-audit-policies\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979435 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c699a1f9-6c07-4c4d-8605-3988308d6914-config\") pod 
\"machine-approver-56656f9798-w52jf\" (UID: \"c699a1f9-6c07-4c4d-8605-3988308d6914\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w52jf" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979433 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-audit-dir\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979804 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mzj5\" (UniqueName: \"kubernetes.io/projected/2249a89f-5c41-4f17-a308-fe106e91ece9-kube-api-access-8mzj5\") pod \"cluster-image-registry-operator-dc59b4c8b-bw4k8\" (UID: \"2249a89f-5c41-4f17-a308-fe106e91ece9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bw4k8" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979833 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979852 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-447rt\" (UniqueName: \"kubernetes.io/projected/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-kube-api-access-447rt\") pod \"console-f9d7485db-w6mf2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979870 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/6951ee33-1345-4ae2-906d-fdc7cea4dc64-serving-cert\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979889 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979910 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-service-ca\") pod \"console-f9d7485db-w6mf2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979928 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-897k9\" (UniqueName: \"kubernetes.io/projected/7c1480df-aed9-4266-802c-d217699bd9ad-kube-api-access-897k9\") pod \"openshift-controller-manager-operator-756b6f6bc6-g4km7\" (UID: \"7c1480df-aed9-4266-802c-d217699bd9ad\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g4km7" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979958 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0e8bdc92-b1ec-4300-8895-fc8e804455da-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zqt5l\" (UID: \"0e8bdc92-b1ec-4300-8895-fc8e804455da\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-zqt5l" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.979985 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smfh9\" (UniqueName: \"kubernetes.io/projected/4bcb33a3-bcba-4acb-872e-e1676fdaa584-kube-api-access-smfh9\") pod \"authentication-operator-69f744f599-wkwj9\" (UID: \"4bcb33a3-bcba-4acb-872e-e1676fdaa584\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wkwj9" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980009 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6951ee33-1345-4ae2-906d-fdc7cea4dc64-image-import-ca\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980028 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6951ee33-1345-4ae2-906d-fdc7cea4dc64-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980042 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6951ee33-1345-4ae2-906d-fdc7cea4dc64-node-pullsecrets\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980048 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6951ee33-1345-4ae2-906d-fdc7cea4dc64-etcd-serving-ca\") pod 
\"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980068 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980085 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bcb33a3-bcba-4acb-872e-e1676fdaa584-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wkwj9\" (UID: \"4bcb33a3-bcba-4acb-872e-e1676fdaa584\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wkwj9" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980106 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsph4\" (UniqueName: \"kubernetes.io/projected/6951ee33-1345-4ae2-906d-fdc7cea4dc64-kube-api-access-fsph4\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980124 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bszbt\" (UniqueName: \"kubernetes.io/projected/44e99cf7-e103-451a-91cc-0d610e4190a9-kube-api-access-bszbt\") pod \"downloads-7954f5f757-zbpll\" (UID: \"44e99cf7-e103-451a-91cc-0d610e4190a9\") " pod="openshift-console/downloads-7954f5f757-zbpll" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980140 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/6951ee33-1345-4ae2-906d-fdc7cea4dc64-etcd-client\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980156 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980173 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-console-config\") pod \"console-f9d7485db-w6mf2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980192 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08b0e33c-5604-46a5-b0fe-139feb379df8-config\") pod \"console-operator-58897d9998-wdjfj\" (UID: \"08b0e33c-5604-46a5-b0fe-139feb379df8\") " pod="openshift-console-operator/console-operator-58897d9998-wdjfj" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980211 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a9d4a2c-13b3-458c-92e8-058e9f1206dd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-t7x7s\" (UID: \"3a9d4a2c-13b3-458c-92e8-058e9f1206dd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t7x7s" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980228 4878 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4zlk\" (UniqueName: \"kubernetes.io/projected/ea820fcf-7d34-4381-bafa-cbc53d3f7c86-kube-api-access-n4zlk\") pod \"machine-api-operator-5694c8668f-ngj62\" (UID: \"ea820fcf-7d34-4381-bafa-cbc53d3f7c86\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ngj62" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980264 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2249a89f-5c41-4f17-a308-fe106e91ece9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bw4k8\" (UID: \"2249a89f-5c41-4f17-a308-fe106e91ece9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bw4k8" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980279 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdf471eb-c41c-40c0-b038-5ba883d2154a-serving-cert\") pod \"route-controller-manager-6576b87f9c-l8z4g\" (UID: \"bdf471eb-c41c-40c0-b038-5ba883d2154a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980293 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6951ee33-1345-4ae2-906d-fdc7cea4dc64-config\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980308 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c1480df-aed9-4266-802c-d217699bd9ad-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-g4km7\" (UID: \"7c1480df-aed9-4266-802c-d217699bd9ad\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g4km7" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980323 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c1480df-aed9-4266-802c-d217699bd9ad-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-g4km7\" (UID: \"7c1480df-aed9-4266-802c-d217699bd9ad\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g4km7" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980338 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980353 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980369 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkzsx\" (UniqueName: \"kubernetes.io/projected/0e8bdc92-b1ec-4300-8895-fc8e804455da-kube-api-access-zkzsx\") pod \"openshift-config-operator-7777fb866f-zqt5l\" (UID: \"0e8bdc92-b1ec-4300-8895-fc8e804455da\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zqt5l" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980384 4878 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-trusted-ca-bundle\") pod \"console-f9d7485db-w6mf2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980399 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-oauth-serving-cert\") pod \"console-f9d7485db-w6mf2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980413 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwnv9\" (UniqueName: \"kubernetes.io/projected/08b0e33c-5604-46a5-b0fe-139feb379df8-kube-api-access-nwnv9\") pod \"console-operator-58897d9998-wdjfj\" (UID: \"08b0e33c-5604-46a5-b0fe-139feb379df8\") " pod="openshift-console-operator/console-operator-58897d9998-wdjfj" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980427 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6951ee33-1345-4ae2-906d-fdc7cea4dc64-audit-dir\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980449 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08b0e33c-5604-46a5-b0fe-139feb379df8-serving-cert\") pod \"console-operator-58897d9998-wdjfj\" (UID: \"08b0e33c-5604-46a5-b0fe-139feb379df8\") " pod="openshift-console-operator/console-operator-58897d9998-wdjfj" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980463 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c699a1f9-6c07-4c4d-8605-3988308d6914-machine-approver-tls\") pod \"machine-approver-56656f9798-w52jf\" (UID: \"c699a1f9-6c07-4c4d-8605-3988308d6914\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w52jf" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980478 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2249a89f-5c41-4f17-a308-fe106e91ece9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bw4k8\" (UID: \"2249a89f-5c41-4f17-a308-fe106e91ece9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bw4k8" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980495 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08b0e33c-5604-46a5-b0fe-139feb379df8-trusted-ca\") pod \"console-operator-58897d9998-wdjfj\" (UID: \"08b0e33c-5604-46a5-b0fe-139feb379df8\") " pod="openshift-console-operator/console-operator-58897d9998-wdjfj" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980511 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2249a89f-5c41-4f17-a308-fe106e91ece9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-bw4k8\" (UID: \"2249a89f-5c41-4f17-a308-fe106e91ece9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bw4k8" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980528 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54w2d\" (UniqueName: \"kubernetes.io/projected/c699a1f9-6c07-4c4d-8605-3988308d6914-kube-api-access-54w2d\") pod \"machine-approver-56656f9798-w52jf\" (UID: 
\"c699a1f9-6c07-4c4d-8605-3988308d6914\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w52jf" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980546 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-console-serving-cert\") pod \"console-f9d7485db-w6mf2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980560 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j9m9\" (UniqueName: \"kubernetes.io/projected/3a9d4a2c-13b3-458c-92e8-058e9f1206dd-kube-api-access-9j9m9\") pod \"cluster-samples-operator-665b6dd947-t7x7s\" (UID: \"3a9d4a2c-13b3-458c-92e8-058e9f1206dd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t7x7s" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980574 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdf471eb-c41c-40c0-b038-5ba883d2154a-config\") pod \"route-controller-manager-6576b87f9c-l8z4g\" (UID: \"bdf471eb-c41c-40c0-b038-5ba883d2154a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980589 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980605 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980623 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ea820fcf-7d34-4381-bafa-cbc53d3f7c86-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ngj62\" (UID: \"ea820fcf-7d34-4381-bafa-cbc53d3f7c86\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ngj62" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980641 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg4fw\" (UniqueName: \"kubernetes.io/projected/bdf471eb-c41c-40c0-b038-5ba883d2154a-kube-api-access-sg4fw\") pod \"route-controller-manager-6576b87f9c-l8z4g\" (UID: \"bdf471eb-c41c-40c0-b038-5ba883d2154a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980656 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c699a1f9-6c07-4c4d-8605-3988308d6914-auth-proxy-config\") pod \"machine-approver-56656f9798-w52jf\" (UID: \"c699a1f9-6c07-4c4d-8605-3988308d6914\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w52jf" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.980672 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bcb33a3-bcba-4acb-872e-e1676fdaa584-service-ca-bundle\") pod \"authentication-operator-69f744f599-wkwj9\" (UID: \"4bcb33a3-bcba-4acb-872e-e1676fdaa584\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-wkwj9" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.981137 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6951ee33-1345-4ae2-906d-fdc7cea4dc64-audit\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.982616 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-service-ca\") pod \"console-f9d7485db-w6mf2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.982636 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c699a1f9-6c07-4c4d-8605-3988308d6914-config\") pod \"machine-approver-56656f9798-w52jf\" (UID: \"c699a1f9-6c07-4c4d-8605-3988308d6914\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w52jf" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.983198 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-smztw"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.983397 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bdf471eb-c41c-40c0-b038-5ba883d2154a-client-ca\") pod \"route-controller-manager-6576b87f9c-l8z4g\" (UID: \"bdf471eb-c41c-40c0-b038-5ba883d2154a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.983511 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-5p678"] Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.985346 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-audit-policies\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.985970 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ea820fcf-7d34-4381-bafa-cbc53d3f7c86-images\") pod \"machine-api-operator-5694c8668f-ngj62\" (UID: \"ea820fcf-7d34-4381-bafa-cbc53d3f7c86\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ngj62" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.986017 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6951ee33-1345-4ae2-906d-fdc7cea4dc64-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.986138 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea820fcf-7d34-4381-bafa-cbc53d3f7c86-config\") pod \"machine-api-operator-5694c8668f-ngj62\" (UID: \"ea820fcf-7d34-4381-bafa-cbc53d3f7c86\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ngj62" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.988180 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6951ee33-1345-4ae2-906d-fdc7cea4dc64-image-import-ca\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " 
pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.988577 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.990182 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdf471eb-c41c-40c0-b038-5ba883d2154a-config\") pod \"route-controller-manager-6576b87f9c-l8z4g\" (UID: \"bdf471eb-c41c-40c0-b038-5ba883d2154a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.991733 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-oauth-serving-cert\") pod \"console-f9d7485db-w6mf2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.992381 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6951ee33-1345-4ae2-906d-fdc7cea4dc64-etcd-serving-ca\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.992548 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6951ee33-1345-4ae2-906d-fdc7cea4dc64-serving-cert\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") 
" pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.993161 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.995377 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c699a1f9-6c07-4c4d-8605-3988308d6914-auth-proxy-config\") pod \"machine-approver-56656f9798-w52jf\" (UID: \"c699a1f9-6c07-4c4d-8605-3988308d6914\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w52jf" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.996125 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08b0e33c-5604-46a5-b0fe-139feb379df8-trusted-ca\") pod \"console-operator-58897d9998-wdjfj\" (UID: \"08b0e33c-5604-46a5-b0fe-139feb379df8\") " pod="openshift-console-operator/console-operator-58897d9998-wdjfj" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.996497 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.996944 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-trusted-ca-bundle\") pod \"console-f9d7485db-w6mf2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.997025 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2249a89f-5c41-4f17-a308-fe106e91ece9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bw4k8\" (UID: \"2249a89f-5c41-4f17-a308-fe106e91ece9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bw4k8" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.997250 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-console-config\") pod \"console-f9d7485db-w6mf2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.997899 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08b0e33c-5604-46a5-b0fe-139feb379df8-config\") pod \"console-operator-58897d9998-wdjfj\" (UID: \"08b0e33c-5604-46a5-b0fe-139feb379df8\") " pod="openshift-console-operator/console-operator-58897d9998-wdjfj" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.998057 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6951ee33-1345-4ae2-906d-fdc7cea4dc64-config\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.998333 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.999050 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e8bdc92-b1ec-4300-8895-fc8e804455da-serving-cert\") pod \"openshift-config-operator-7777fb866f-zqt5l\" (UID: \"0e8bdc92-b1ec-4300-8895-fc8e804455da\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zqt5l" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.999103 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.999211 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.999536 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6951ee33-1345-4ae2-906d-fdc7cea4dc64-audit-dir\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:12 crc kubenswrapper[4878]: I1202 18:17:12.999909 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.000642 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6951ee33-1345-4ae2-906d-fdc7cea4dc64-encryption-config\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.000782 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f"] Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.000866 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dtnr7"] Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.000882 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m6f2h"] Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.001015 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdf471eb-c41c-40c0-b038-5ba883d2154a-serving-cert\") pod \"route-controller-manager-6576b87f9c-l8z4g\" (UID: \"bdf471eb-c41c-40c0-b038-5ba883d2154a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.001258 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.001739 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.001839 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c699a1f9-6c07-4c4d-8605-3988308d6914-machine-approver-tls\") pod \"machine-approver-56656f9798-w52jf\" (UID: \"c699a1f9-6c07-4c4d-8605-3988308d6914\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w52jf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.001903 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.002146 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6951ee33-1345-4ae2-906d-fdc7cea4dc64-etcd-client\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.002706 
4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.002770 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2249a89f-5c41-4f17-a308-fe106e91ece9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-bw4k8\" (UID: \"2249a89f-5c41-4f17-a308-fe106e91ece9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bw4k8" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.002840 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a9d4a2c-13b3-458c-92e8-058e9f1206dd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-t7x7s\" (UID: \"3a9d4a2c-13b3-458c-92e8-058e9f1206dd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t7x7s" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.003038 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c1480df-aed9-4266-802c-d217699bd9ad-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-g4km7\" (UID: \"7c1480df-aed9-4266-802c-d217699bd9ad\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g4km7" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.003610 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-console-oauth-config\") pod 
\"console-f9d7485db-w6mf2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.003900 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08b0e33c-5604-46a5-b0fe-139feb379df8-serving-cert\") pod \"console-operator-58897d9998-wdjfj\" (UID: \"08b0e33c-5604-46a5-b0fe-139feb379df8\") " pod="openshift-console-operator/console-operator-58897d9998-wdjfj" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.006204 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ea820fcf-7d34-4381-bafa-cbc53d3f7c86-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ngj62\" (UID: \"ea820fcf-7d34-4381-bafa-cbc53d3f7c86\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ngj62" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.007887 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.008360 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-console-serving-cert\") pod \"console-f9d7485db-w6mf2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.010053 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c1480df-aed9-4266-802c-d217699bd9ad-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-g4km7\" (UID: \"7c1480df-aed9-4266-802c-d217699bd9ad\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g4km7" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.010658 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6c7h4"] Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.011473 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.012920 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t"] Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.014444 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-f59ml"] Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.015567 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-th582"] Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.017067 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7hvht"] Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.019409 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc"] Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.020663 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf"] Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.021998 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dmpvh"] Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.023226 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-dmpvh" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.023438 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dmpvh"] Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.029158 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.048496 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.068781 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.081212 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bcb33a3-bcba-4acb-872e-e1676fdaa584-service-ca-bundle\") pod \"authentication-operator-69f744f599-wkwj9\" (UID: \"4bcb33a3-bcba-4acb-872e-e1676fdaa584\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wkwj9" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.081276 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bcb33a3-bcba-4acb-872e-e1676fdaa584-config\") pod \"authentication-operator-69f744f599-wkwj9\" (UID: \"4bcb33a3-bcba-4acb-872e-e1676fdaa584\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wkwj9" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.081297 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bcb33a3-bcba-4acb-872e-e1676fdaa584-serving-cert\") pod 
\"authentication-operator-69f744f599-wkwj9\" (UID: \"4bcb33a3-bcba-4acb-872e-e1676fdaa584\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wkwj9" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.081354 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smfh9\" (UniqueName: \"kubernetes.io/projected/4bcb33a3-bcba-4acb-872e-e1676fdaa584-kube-api-access-smfh9\") pod \"authentication-operator-69f744f599-wkwj9\" (UID: \"4bcb33a3-bcba-4acb-872e-e1676fdaa584\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wkwj9" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.081382 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bcb33a3-bcba-4acb-872e-e1676fdaa584-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wkwj9\" (UID: \"4bcb33a3-bcba-4acb-872e-e1676fdaa584\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wkwj9" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.082517 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bcb33a3-bcba-4acb-872e-e1676fdaa584-service-ca-bundle\") pod \"authentication-operator-69f744f599-wkwj9\" (UID: \"4bcb33a3-bcba-4acb-872e-e1676fdaa584\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wkwj9" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.082539 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bcb33a3-bcba-4acb-872e-e1676fdaa584-config\") pod \"authentication-operator-69f744f599-wkwj9\" (UID: \"4bcb33a3-bcba-4acb-872e-e1676fdaa584\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wkwj9" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.082939 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bcb33a3-bcba-4acb-872e-e1676fdaa584-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wkwj9\" (UID: \"4bcb33a3-bcba-4acb-872e-e1676fdaa584\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wkwj9" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.084983 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bcb33a3-bcba-4acb-872e-e1676fdaa584-serving-cert\") pod \"authentication-operator-69f744f599-wkwj9\" (UID: \"4bcb33a3-bcba-4acb-872e-e1676fdaa584\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wkwj9" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.088145 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.127759 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.147113 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.168418 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.182159 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12392f4b-d1c0-4c2d-875a-cbebc20ae6d7-config\") pod \"kube-controller-manager-operator-78b949d7b-r5vnm\" (UID: \"12392f4b-d1c0-4c2d-875a-cbebc20ae6d7\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5vnm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.182316 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f324597d-0fdb-49dd-aaa2-9f71b2759bcb-metrics-tls\") pod \"dns-operator-744455d44c-s5t58\" (UID: \"f324597d-0fdb-49dd-aaa2-9f71b2759bcb\") " pod="openshift-dns-operator/dns-operator-744455d44c-s5t58" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.182395 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/763bb008-97f2-4e90-965c-5a7537ff0a57-registry-certificates\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.182435 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12392f4b-d1c0-4c2d-875a-cbebc20ae6d7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r5vnm\" (UID: \"12392f4b-d1c0-4c2d-875a-cbebc20ae6d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5vnm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.182487 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2x2m\" (UniqueName: \"kubernetes.io/projected/f324597d-0fdb-49dd-aaa2-9f71b2759bcb-kube-api-access-p2x2m\") pod \"dns-operator-744455d44c-s5t58\" (UID: \"f324597d-0fdb-49dd-aaa2-9f71b2759bcb\") " pod="openshift-dns-operator/dns-operator-744455d44c-s5t58" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.182585 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/435af086-d5fb-4f55-9c52-bfab176ee753-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fvq9v\" (UID: \"435af086-d5fb-4f55-9c52-bfab176ee753\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.182650 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/763bb008-97f2-4e90-965c-5a7537ff0a57-registry-tls\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.182678 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12392f4b-d1c0-4c2d-875a-cbebc20ae6d7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r5vnm\" (UID: \"12392f4b-d1c0-4c2d-875a-cbebc20ae6d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5vnm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.182735 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/553264d0-196d-4174-816e-ba4803d6a893-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rmqxk\" (UID: \"553264d0-196d-4174-816e-ba4803d6a893\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rmqxk" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.182798 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/763bb008-97f2-4e90-965c-5a7537ff0a57-installation-pull-secrets\") pod 
\"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.182890 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/763bb008-97f2-4e90-965c-5a7537ff0a57-bound-sa-token\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.182965 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.183034 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4gg7\" (UniqueName: \"kubernetes.io/projected/435af086-d5fb-4f55-9c52-bfab176ee753-kube-api-access-j4gg7\") pod \"controller-manager-879f6c89f-fvq9v\" (UID: \"435af086-d5fb-4f55-9c52-bfab176ee753\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.183062 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/763bb008-97f2-4e90-965c-5a7537ff0a57-trusted-ca\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.183148 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/435af086-d5fb-4f55-9c52-bfab176ee753-serving-cert\") pod \"controller-manager-879f6c89f-fvq9v\" (UID: \"435af086-d5fb-4f55-9c52-bfab176ee753\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.183219 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/553264d0-196d-4174-816e-ba4803d6a893-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rmqxk\" (UID: \"553264d0-196d-4174-816e-ba4803d6a893\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rmqxk" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.183294 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/435af086-d5fb-4f55-9c52-bfab176ee753-client-ca\") pod \"controller-manager-879f6c89f-fvq9v\" (UID: \"435af086-d5fb-4f55-9c52-bfab176ee753\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:17:13 crc kubenswrapper[4878]: E1202 18:17:13.183977 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:13.683958596 +0000 UTC m=+143.373577477 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.184332 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/763bb008-97f2-4e90-965c-5a7537ff0a57-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.184376 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgcqx\" (UniqueName: \"kubernetes.io/projected/763bb008-97f2-4e90-965c-5a7537ff0a57-kube-api-access-jgcqx\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.184405 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwwfx\" (UniqueName: \"kubernetes.io/projected/553264d0-196d-4174-816e-ba4803d6a893-kube-api-access-zwwfx\") pod \"openshift-apiserver-operator-796bbdcf4f-rmqxk\" (UID: \"553264d0-196d-4174-816e-ba4803d6a893\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rmqxk" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.184786 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/435af086-d5fb-4f55-9c52-bfab176ee753-config\") pod \"controller-manager-879f6c89f-fvq9v\" (UID: \"435af086-d5fb-4f55-9c52-bfab176ee753\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.188209 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.207508 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.228194 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.247986 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.268677 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.285782 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.285979 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/763bb008-97f2-4e90-965c-5a7537ff0a57-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286014 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwwfx\" (UniqueName: \"kubernetes.io/projected/553264d0-196d-4174-816e-ba4803d6a893-kube-api-access-zwwfx\") pod \"openshift-apiserver-operator-796bbdcf4f-rmqxk\" (UID: \"553264d0-196d-4174-816e-ba4803d6a893\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rmqxk" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286055 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b32f0172-888e-4c74-9e70-becd273d49d8-serving-cert\") pod \"etcd-operator-b45778765-fbkfc\" (UID: \"b32f0172-888e-4c74-9e70-becd273d49d8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" Dec 02 18:17:13 crc kubenswrapper[4878]: E1202 18:17:13.286071 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:13.786048971 +0000 UTC m=+143.475667852 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286095 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/54ef3152-6020-4822-ab68-b4c3ba44dc1c-proxy-tls\") pod \"machine-config-operator-74547568cd-d2f8f\" (UID: \"54ef3152-6020-4822-ab68-b4c3ba44dc1c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286129 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14702b8a-c457-4730-919c-7340aea9738e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvddh\" (UID: \"14702b8a-c457-4730-919c-7340aea9738e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvddh" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286155 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f047f11c-e08f-47b0-965b-0cc2e8d2bd38-config\") pod \"service-ca-operator-777779d784-bv27h\" (UID: \"f047f11c-e08f-47b0-965b-0cc2e8d2bd38\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bv27h" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286179 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" 
(UniqueName: \"kubernetes.io/host-path/37ed9429-a67d-4168-b5e9-211eddd1abb1-mountpoint-dir\") pod \"csi-hostpathplugin-f59ml\" (UID: \"37ed9429-a67d-4168-b5e9-211eddd1abb1\") " pod="hostpath-provisioner/csi-hostpathplugin-f59ml" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286218 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npjc8\" (UniqueName: \"kubernetes.io/projected/a8e39ea4-f7e1-4df7-997b-2c3923f9ad6b-kube-api-access-npjc8\") pod \"control-plane-machine-set-operator-78cbb6b69f-b6fbw\" (UID: \"a8e39ea4-f7e1-4df7-997b-2c3923f9ad6b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6fbw" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286260 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8025c62a-fb3f-4566-8fd6-571ddb400da3-metrics-tls\") pod \"dns-default-dmpvh\" (UID: \"8025c62a-fb3f-4566-8fd6-571ddb400da3\") " pod="openshift-dns/dns-default-dmpvh" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286299 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c0a71e8-b6e5-44c8-a813-9177684ab97e-metrics-certs\") pod \"router-default-5444994796-28qqz\" (UID: \"4c0a71e8-b6e5-44c8-a813-9177684ab97e\") " pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286393 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12392f4b-d1c0-4c2d-875a-cbebc20ae6d7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r5vnm\" (UID: \"12392f4b-d1c0-4c2d-875a-cbebc20ae6d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5vnm" Dec 02 18:17:13 crc 
kubenswrapper[4878]: I1202 18:17:13.286426 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn48b\" (UniqueName: \"kubernetes.io/projected/3b6800a2-0b99-4c3b-9e5e-833a245bd7be-kube-api-access-kn48b\") pod \"migrator-59844c95c7-dtnr7\" (UID: \"3b6800a2-0b99-4c3b-9e5e-833a245bd7be\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dtnr7" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286455 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d2265b6f-d33f-4bea-9934-1ece55c51d35-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m6f2h\" (UID: \"d2265b6f-d33f-4bea-9934-1ece55c51d35\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m6f2h" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286481 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f45d88bc-0c44-4669-8394-081c7e4a8035-encryption-config\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286491 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/763bb008-97f2-4e90-965c-5a7537ff0a57-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286543 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/298c0078-6d02-4963-a1fb-0f6713e6d369-config-volume\") pod 
\"collect-profiles-29411655-62mmf\" (UID: \"298c0078-6d02-4963-a1fb-0f6713e6d369\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286567 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/37ed9429-a67d-4168-b5e9-211eddd1abb1-socket-dir\") pod \"csi-hostpathplugin-f59ml\" (UID: \"37ed9429-a67d-4168-b5e9-211eddd1abb1\") " pod="hostpath-provisioner/csi-hostpathplugin-f59ml" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286607 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/38d2716d-f5de-4242-a170-624490092b98-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5p678\" (UID: \"38d2716d-f5de-4242-a170-624490092b98\") " pod="openshift-marketplace/marketplace-operator-79b997595-5p678" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286660 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/763bb008-97f2-4e90-965c-5a7537ff0a57-registry-tls\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286686 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sw9s\" (UniqueName: \"kubernetes.io/projected/f047f11c-e08f-47b0-965b-0cc2e8d2bd38-kube-api-access-4sw9s\") pod \"service-ca-operator-777779d784-bv27h\" (UID: \"f047f11c-e08f-47b0-965b-0cc2e8d2bd38\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bv27h" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286763 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4ht8\" (UniqueName: \"kubernetes.io/projected/298c0078-6d02-4963-a1fb-0f6713e6d369-kube-api-access-h4ht8\") pod \"collect-profiles-29411655-62mmf\" (UID: \"298c0078-6d02-4963-a1fb-0f6713e6d369\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286796 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9878\" (UniqueName: \"kubernetes.io/projected/e5945b22-4fa1-4d4e-aa84-990d28f1e423-kube-api-access-f9878\") pod \"ingress-canary-7hvht\" (UID: \"e5945b22-4fa1-4d4e-aa84-990d28f1e423\") " pod="openshift-ingress-canary/ingress-canary-7hvht" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286838 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6eb41b93-54fd-4ebe-b7ee-c8a1f1a5f6ab-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-smztw\" (UID: \"6eb41b93-54fd-4ebe-b7ee-c8a1f1a5f6ab\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-smztw" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286867 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03b22451-6028-4e14-bd65-30d12b0242b3-config\") pod \"kube-apiserver-operator-766d6c64bb-zwtvf\" (UID: \"03b22451-6028-4e14-bd65-30d12b0242b3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwtvf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286895 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/883f2381-011d-410d-84cd-7ed27e099ebf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wpxp8\" (UID: \"883f2381-011d-410d-84cd-7ed27e099ebf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpxp8" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286942 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8025c62a-fb3f-4566-8fd6-571ddb400da3-config-volume\") pod \"dns-default-dmpvh\" (UID: \"8025c62a-fb3f-4566-8fd6-571ddb400da3\") " pod="openshift-dns/dns-default-dmpvh" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.286968 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/37ed9429-a67d-4168-b5e9-211eddd1abb1-registration-dir\") pod \"csi-hostpathplugin-f59ml\" (UID: \"37ed9429-a67d-4168-b5e9-211eddd1abb1\") " pod="hostpath-provisioner/csi-hostpathplugin-f59ml" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.287090 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw8z6\" (UniqueName: \"kubernetes.io/projected/bfed419b-bf76-4045-9056-50ca33e1686b-kube-api-access-xw8z6\") pod \"service-ca-9c57cc56f-th582\" (UID: \"bfed419b-bf76-4045-9056-50ca33e1686b\") " pod="openshift-service-ca/service-ca-9c57cc56f-th582" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.287318 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c0a71e8-b6e5-44c8-a813-9177684ab97e-service-ca-bundle\") pod \"router-default-5444994796-28qqz\" (UID: \"4c0a71e8-b6e5-44c8-a813-9177684ab97e\") " pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.287343 4878 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/54ef3152-6020-4822-ab68-b4c3ba44dc1c-images\") pod \"machine-config-operator-74547568cd-d2f8f\" (UID: \"54ef3152-6020-4822-ab68-b4c3ba44dc1c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.287432 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4gg7\" (UniqueName: \"kubernetes.io/projected/435af086-d5fb-4f55-9c52-bfab176ee753-kube-api-access-j4gg7\") pod \"controller-manager-879f6c89f-fvq9v\" (UID: \"435af086-d5fb-4f55-9c52-bfab176ee753\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.287548 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blfmq\" (UniqueName: \"kubernetes.io/projected/37ed9429-a67d-4168-b5e9-211eddd1abb1-kube-api-access-blfmq\") pod \"csi-hostpathplugin-f59ml\" (UID: \"37ed9429-a67d-4168-b5e9-211eddd1abb1\") " pod="hostpath-provisioner/csi-hostpathplugin-f59ml" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.287569 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38d2716d-f5de-4242-a170-624490092b98-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5p678\" (UID: \"38d2716d-f5de-4242-a170-624490092b98\") " pod="openshift-marketplace/marketplace-operator-79b997595-5p678" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.287662 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f45d88bc-0c44-4669-8394-081c7e4a8035-serving-cert\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: 
\"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.287722 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4c0a71e8-b6e5-44c8-a813-9177684ab97e-default-certificate\") pod \"router-default-5444994796-28qqz\" (UID: \"4c0a71e8-b6e5-44c8-a813-9177684ab97e\") " pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.287734 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.287741 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f45d88bc-0c44-4669-8394-081c7e4a8035-audit-dir\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.288145 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kswcs\" (UniqueName: \"kubernetes.io/projected/883f2381-011d-410d-84cd-7ed27e099ebf-kube-api-access-kswcs\") pod \"machine-config-controller-84d6567774-wpxp8\" (UID: \"883f2381-011d-410d-84cd-7ed27e099ebf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpxp8" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.288272 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b32f0172-888e-4c74-9e70-becd273d49d8-etcd-service-ca\") pod \"etcd-operator-b45778765-fbkfc\" (UID: \"b32f0172-888e-4c74-9e70-becd273d49d8\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.288668 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fe86a337-8c96-4b07-b3b6-a97315fa1029-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9vv2f\" (UID: \"fe86a337-8c96-4b07-b3b6-a97315fa1029\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9vv2f" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.288742 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5945b22-4fa1-4d4e-aa84-990d28f1e423-cert\") pod \"ingress-canary-7hvht\" (UID: \"e5945b22-4fa1-4d4e-aa84-990d28f1e423\") " pod="openshift-ingress-canary/ingress-canary-7hvht" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.288771 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5ptm\" (UniqueName: \"kubernetes.io/projected/cbe20149-0391-43f7-957b-ad3b18f54736-kube-api-access-h5ptm\") pod \"packageserver-d55dfcdfc-qprtc\" (UID: \"cbe20149-0391-43f7-957b-ad3b18f54736\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.288875 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vg5k\" (UniqueName: \"kubernetes.io/projected/4c0a71e8-b6e5-44c8-a813-9177684ab97e-kube-api-access-4vg5k\") pod \"router-default-5444994796-28qqz\" (UID: \"4c0a71e8-b6e5-44c8-a813-9177684ab97e\") " pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.288904 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/b32f0172-888e-4c74-9e70-becd273d49d8-etcd-client\") pod \"etcd-operator-b45778765-fbkfc\" (UID: \"b32f0172-888e-4c74-9e70-becd273d49d8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.288938 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgcqx\" (UniqueName: \"kubernetes.io/projected/763bb008-97f2-4e90-965c-5a7537ff0a57-kube-api-access-jgcqx\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.288962 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25f1a12b-b1f6-49b1-ade9-018684cdd6f3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tjwcf\" (UID: \"25f1a12b-b1f6-49b1-ade9-018684cdd6f3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tjwcf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.289114 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb9jb\" (UniqueName: \"kubernetes.io/projected/d2265b6f-d33f-4bea-9934-1ece55c51d35-kube-api-access-zb9jb\") pod \"olm-operator-6b444d44fb-m6f2h\" (UID: \"d2265b6f-d33f-4bea-9934-1ece55c51d35\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m6f2h" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.289162 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f34c5862-2f1c-4e77-b3e2-b30852607129-node-bootstrap-token\") pod \"machine-config-server-9zqsp\" (UID: \"f34c5862-2f1c-4e77-b3e2-b30852607129\") " pod="openshift-machine-config-operator/machine-config-server-9zqsp" 
Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.289287 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4c0a71e8-b6e5-44c8-a813-9177684ab97e-stats-auth\") pod \"router-default-5444994796-28qqz\" (UID: \"4c0a71e8-b6e5-44c8-a813-9177684ab97e\") " pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.289343 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f45d88bc-0c44-4669-8394-081c7e4a8035-etcd-client\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.289394 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/435af086-d5fb-4f55-9c52-bfab176ee753-config\") pod \"controller-manager-879f6c89f-fvq9v\" (UID: \"435af086-d5fb-4f55-9c52-bfab176ee753\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.289480 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbe20149-0391-43f7-957b-ad3b18f54736-webhook-cert\") pod \"packageserver-d55dfcdfc-qprtc\" (UID: \"cbe20149-0391-43f7-957b-ad3b18f54736\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.289536 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/298c0078-6d02-4963-a1fb-0f6713e6d369-secret-volume\") pod \"collect-profiles-29411655-62mmf\" (UID: 
\"298c0078-6d02-4963-a1fb-0f6713e6d369\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.289635 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/37ed9429-a67d-4168-b5e9-211eddd1abb1-plugins-dir\") pod \"csi-hostpathplugin-f59ml\" (UID: \"37ed9429-a67d-4168-b5e9-211eddd1abb1\") " pod="hostpath-provisioner/csi-hostpathplugin-f59ml" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.289789 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12392f4b-d1c0-4c2d-875a-cbebc20ae6d7-config\") pod \"kube-controller-manager-operator-78b949d7b-r5vnm\" (UID: \"12392f4b-d1c0-4c2d-875a-cbebc20ae6d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5vnm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.292458 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhjdc\" (UniqueName: \"kubernetes.io/projected/1ba68bd9-d567-48e5-b296-0a95fdf406b5-kube-api-access-vhjdc\") pod \"ingress-operator-5b745b69d9-mtlxg\" (UID: \"1ba68bd9-d567-48e5-b296-0a95fdf406b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.292552 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmdjj\" (UniqueName: \"kubernetes.io/projected/14702b8a-c457-4730-919c-7340aea9738e-kube-api-access-nmdjj\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvddh\" (UID: \"14702b8a-c457-4730-919c-7340aea9738e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvddh" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.291303 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/435af086-d5fb-4f55-9c52-bfab176ee753-config\") pod \"controller-manager-879f6c89f-fvq9v\" (UID: \"435af086-d5fb-4f55-9c52-bfab176ee753\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.290527 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12392f4b-d1c0-4c2d-875a-cbebc20ae6d7-config\") pod \"kube-controller-manager-operator-78b949d7b-r5vnm\" (UID: \"12392f4b-d1c0-4c2d-875a-cbebc20ae6d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5vnm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.292352 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/763bb008-97f2-4e90-965c-5a7537ff0a57-registry-tls\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.292714 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f324597d-0fdb-49dd-aaa2-9f71b2759bcb-metrics-tls\") pod \"dns-operator-744455d44c-s5t58\" (UID: \"f324597d-0fdb-49dd-aaa2-9f71b2759bcb\") " pod="openshift-dns-operator/dns-operator-744455d44c-s5t58" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.292779 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzx7r\" (UniqueName: \"kubernetes.io/projected/6eb41b93-54fd-4ebe-b7ee-c8a1f1a5f6ab-kube-api-access-wzx7r\") pod \"package-server-manager-789f6589d5-smztw\" (UID: \"6eb41b93-54fd-4ebe-b7ee-c8a1f1a5f6ab\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-smztw" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.292808 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/763bb008-97f2-4e90-965c-5a7537ff0a57-registry-certificates\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.292852 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ba68bd9-d567-48e5-b296-0a95fdf406b5-trusted-ca\") pod \"ingress-operator-5b745b69d9-mtlxg\" (UID: \"1ba68bd9-d567-48e5-b296-0a95fdf406b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.292875 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f047f11c-e08f-47b0-965b-0cc2e8d2bd38-serving-cert\") pod \"service-ca-operator-777779d784-bv27h\" (UID: \"f047f11c-e08f-47b0-965b-0cc2e8d2bd38\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bv27h" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.293875 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/763bb008-97f2-4e90-965c-5a7537ff0a57-registry-certificates\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.293945 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/bfed419b-bf76-4045-9056-50ca33e1686b-signing-cabundle\") pod \"service-ca-9c57cc56f-th582\" (UID: \"bfed419b-bf76-4045-9056-50ca33e1686b\") " pod="openshift-service-ca/service-ca-9c57cc56f-th582" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.293976 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2x2m\" (UniqueName: \"kubernetes.io/projected/f324597d-0fdb-49dd-aaa2-9f71b2759bcb-kube-api-access-p2x2m\") pod \"dns-operator-744455d44c-s5t58\" (UID: \"f324597d-0fdb-49dd-aaa2-9f71b2759bcb\") " pod="openshift-dns-operator/dns-operator-744455d44c-s5t58" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.294025 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f34c5862-2f1c-4e77-b3e2-b30852607129-certs\") pod \"machine-config-server-9zqsp\" (UID: \"f34c5862-2f1c-4e77-b3e2-b30852607129\") " pod="openshift-machine-config-operator/machine-config-server-9zqsp" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.294044 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkmws\" (UniqueName: \"kubernetes.io/projected/38d2716d-f5de-4242-a170-624490092b98-kube-api-access-fkmws\") pod \"marketplace-operator-79b997595-5p678\" (UID: \"38d2716d-f5de-4242-a170-624490092b98\") " pod="openshift-marketplace/marketplace-operator-79b997595-5p678" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.294191 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cc0cb11b-bb32-413e-bbc3-179943e19f88-profile-collector-cert\") pod \"catalog-operator-68c6474976-6c7h4\" (UID: \"cc0cb11b-bb32-413e-bbc3-179943e19f88\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6c7h4" Dec 02 18:17:13 crc kubenswrapper[4878]: 
I1202 18:17:13.294278 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14702b8a-c457-4730-919c-7340aea9738e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvddh\" (UID: \"14702b8a-c457-4730-919c-7340aea9738e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvddh" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.294312 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/435af086-d5fb-4f55-9c52-bfab176ee753-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fvq9v\" (UID: \"435af086-d5fb-4f55-9c52-bfab176ee753\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.294444 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8e39ea4-f7e1-4df7-997b-2c3923f9ad6b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-b6fbw\" (UID: \"a8e39ea4-f7e1-4df7-997b-2c3923f9ad6b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6fbw" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.294552 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/37ed9429-a67d-4168-b5e9-211eddd1abb1-csi-data-dir\") pod \"csi-hostpathplugin-f59ml\" (UID: \"37ed9429-a67d-4168-b5e9-211eddd1abb1\") " pod="hostpath-provisioner/csi-hostpathplugin-f59ml" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.294649 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/cbe20149-0391-43f7-957b-ad3b18f54736-tmpfs\") pod \"packageserver-d55dfcdfc-qprtc\" (UID: \"cbe20149-0391-43f7-957b-ad3b18f54736\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.294765 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b32f0172-888e-4c74-9e70-becd273d49d8-etcd-ca\") pod \"etcd-operator-b45778765-fbkfc\" (UID: \"b32f0172-888e-4c74-9e70-becd273d49d8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.294852 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25f1a12b-b1f6-49b1-ade9-018684cdd6f3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tjwcf\" (UID: \"25f1a12b-b1f6-49b1-ade9-018684cdd6f3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tjwcf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.294958 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12392f4b-d1c0-4c2d-875a-cbebc20ae6d7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r5vnm\" (UID: \"12392f4b-d1c0-4c2d-875a-cbebc20ae6d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5vnm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.295045 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1ba68bd9-d567-48e5-b296-0a95fdf406b5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mtlxg\" (UID: \"1ba68bd9-d567-48e5-b296-0a95fdf406b5\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.295158 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/553264d0-196d-4174-816e-ba4803d6a893-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rmqxk\" (UID: \"553264d0-196d-4174-816e-ba4803d6a893\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rmqxk" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.295278 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b32f0172-888e-4c74-9e70-becd273d49d8-config\") pod \"etcd-operator-b45778765-fbkfc\" (UID: \"b32f0172-888e-4c74-9e70-becd273d49d8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.295394 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/763bb008-97f2-4e90-965c-5a7537ff0a57-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.295501 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fst5\" (UniqueName: \"kubernetes.io/projected/f45d88bc-0c44-4669-8394-081c7e4a8035-kube-api-access-6fst5\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.295597 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6gjt\" (UniqueName: 
\"kubernetes.io/projected/54ef3152-6020-4822-ab68-b4c3ba44dc1c-kube-api-access-d6gjt\") pod \"machine-config-operator-74547568cd-d2f8f\" (UID: \"54ef3152-6020-4822-ab68-b4c3ba44dc1c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.295683 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ba68bd9-d567-48e5-b296-0a95fdf406b5-metrics-tls\") pod \"ingress-operator-5b745b69d9-mtlxg\" (UID: \"1ba68bd9-d567-48e5-b296-0a95fdf406b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.295541 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/435af086-d5fb-4f55-9c52-bfab176ee753-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fvq9v\" (UID: \"435af086-d5fb-4f55-9c52-bfab176ee753\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.295757 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25f1a12b-b1f6-49b1-ade9-018684cdd6f3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tjwcf\" (UID: \"25f1a12b-b1f6-49b1-ade9-018684cdd6f3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tjwcf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.295929 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn952\" (UniqueName: \"kubernetes.io/projected/cc0cb11b-bb32-413e-bbc3-179943e19f88-kube-api-access-dn952\") pod \"catalog-operator-68c6474976-6c7h4\" (UID: \"cc0cb11b-bb32-413e-bbc3-179943e19f88\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6c7h4" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296052 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/763bb008-97f2-4e90-965c-5a7537ff0a57-bound-sa-token\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.295860 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/553264d0-196d-4174-816e-ba4803d6a893-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rmqxk\" (UID: \"553264d0-196d-4174-816e-ba4803d6a893\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rmqxk" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296164 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5tgz\" (UniqueName: \"kubernetes.io/projected/f34c5862-2f1c-4e77-b3e2-b30852607129-kube-api-access-n5tgz\") pod \"machine-config-server-9zqsp\" (UID: \"f34c5862-2f1c-4e77-b3e2-b30852607129\") " pod="openshift-machine-config-operator/machine-config-server-9zqsp" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296171 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f324597d-0fdb-49dd-aaa2-9f71b2759bcb-metrics-tls\") pod \"dns-operator-744455d44c-s5t58\" (UID: \"f324597d-0fdb-49dd-aaa2-9f71b2759bcb\") " pod="openshift-dns-operator/dns-operator-744455d44c-s5t58" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296248 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggg7z\" (UniqueName: 
\"kubernetes.io/projected/8025c62a-fb3f-4566-8fd6-571ddb400da3-kube-api-access-ggg7z\") pod \"dns-default-dmpvh\" (UID: \"8025c62a-fb3f-4566-8fd6-571ddb400da3\") " pod="openshift-dns/dns-default-dmpvh" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296293 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f45d88bc-0c44-4669-8394-081c7e4a8035-audit-policies\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296327 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296354 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d2265b6f-d33f-4bea-9934-1ece55c51d35-srv-cert\") pod \"olm-operator-6b444d44fb-m6f2h\" (UID: \"d2265b6f-d33f-4bea-9934-1ece55c51d35\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m6f2h" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296387 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/763bb008-97f2-4e90-965c-5a7537ff0a57-trusted-ca\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296419 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ghd9\" (UniqueName: \"kubernetes.io/projected/fe86a337-8c96-4b07-b3b6-a97315fa1029-kube-api-access-8ghd9\") pod \"multus-admission-controller-857f4d67dd-9vv2f\" (UID: \"fe86a337-8c96-4b07-b3b6-a97315fa1029\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9vv2f" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296444 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbe20149-0391-43f7-957b-ad3b18f54736-apiservice-cert\") pod \"packageserver-d55dfcdfc-qprtc\" (UID: \"cbe20149-0391-43f7-957b-ad3b18f54736\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296486 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bfed419b-bf76-4045-9056-50ca33e1686b-signing-key\") pod \"service-ca-9c57cc56f-th582\" (UID: \"bfed419b-bf76-4045-9056-50ca33e1686b\") " pod="openshift-service-ca/service-ca-9c57cc56f-th582" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296514 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/435af086-d5fb-4f55-9c52-bfab176ee753-serving-cert\") pod \"controller-manager-879f6c89f-fvq9v\" (UID: \"435af086-d5fb-4f55-9c52-bfab176ee753\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:17:13 crc kubenswrapper[4878]: E1202 18:17:13.296543 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:13.796530299 +0000 UTC m=+143.486149180 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296578 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/883f2381-011d-410d-84cd-7ed27e099ebf-proxy-tls\") pod \"machine-config-controller-84d6567774-wpxp8\" (UID: \"883f2381-011d-410d-84cd-7ed27e099ebf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpxp8" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296693 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03b22451-6028-4e14-bd65-30d12b0242b3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zwtvf\" (UID: \"03b22451-6028-4e14-bd65-30d12b0242b3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwtvf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296766 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/54ef3152-6020-4822-ab68-b4c3ba44dc1c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d2f8f\" (UID: \"54ef3152-6020-4822-ab68-b4c3ba44dc1c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296798 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/553264d0-196d-4174-816e-ba4803d6a893-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rmqxk\" (UID: \"553264d0-196d-4174-816e-ba4803d6a893\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rmqxk" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296825 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/435af086-d5fb-4f55-9c52-bfab176ee753-client-ca\") pod \"controller-manager-879f6c89f-fvq9v\" (UID: \"435af086-d5fb-4f55-9c52-bfab176ee753\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296849 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hsnm\" (UniqueName: \"kubernetes.io/projected/b32f0172-888e-4c74-9e70-becd273d49d8-kube-api-access-2hsnm\") pod \"etcd-operator-b45778765-fbkfc\" (UID: \"b32f0172-888e-4c74-9e70-becd273d49d8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296879 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f45d88bc-0c44-4669-8394-081c7e4a8035-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296902 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b22451-6028-4e14-bd65-30d12b0242b3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zwtvf\" (UID: \"03b22451-6028-4e14-bd65-30d12b0242b3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwtvf" Dec 02 18:17:13 crc 
kubenswrapper[4878]: I1202 18:17:13.296937 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f45d88bc-0c44-4669-8394-081c7e4a8035-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.296958 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cc0cb11b-bb32-413e-bbc3-179943e19f88-srv-cert\") pod \"catalog-operator-68c6474976-6c7h4\" (UID: \"cc0cb11b-bb32-413e-bbc3-179943e19f88\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6c7h4" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.297820 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/435af086-d5fb-4f55-9c52-bfab176ee753-client-ca\") pod \"controller-manager-879f6c89f-fvq9v\" (UID: \"435af086-d5fb-4f55-9c52-bfab176ee753\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.299305 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/435af086-d5fb-4f55-9c52-bfab176ee753-serving-cert\") pod \"controller-manager-879f6c89f-fvq9v\" (UID: \"435af086-d5fb-4f55-9c52-bfab176ee753\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.299700 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/763bb008-97f2-4e90-965c-5a7537ff0a57-trusted-ca\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.300512 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/553264d0-196d-4174-816e-ba4803d6a893-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rmqxk\" (UID: \"553264d0-196d-4174-816e-ba4803d6a893\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rmqxk" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.301156 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12392f4b-d1c0-4c2d-875a-cbebc20ae6d7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r5vnm\" (UID: \"12392f4b-d1c0-4c2d-875a-cbebc20ae6d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5vnm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.308967 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/763bb008-97f2-4e90-965c-5a7537ff0a57-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.309384 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.327618 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.353864 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 
18:17:13.368805 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.388626 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.398356 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:13 crc kubenswrapper[4878]: E1202 18:17:13.398511 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:13.89848127 +0000 UTC m=+143.588100171 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.398597 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f45d88bc-0c44-4669-8394-081c7e4a8035-audit-dir\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.398633 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kswcs\" (UniqueName: \"kubernetes.io/projected/883f2381-011d-410d-84cd-7ed27e099ebf-kube-api-access-kswcs\") pod \"machine-config-controller-84d6567774-wpxp8\" (UID: \"883f2381-011d-410d-84cd-7ed27e099ebf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpxp8" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.398671 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4c0a71e8-b6e5-44c8-a813-9177684ab97e-default-certificate\") pod \"router-default-5444994796-28qqz\" (UID: \"4c0a71e8-b6e5-44c8-a813-9177684ab97e\") " pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.398704 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b32f0172-888e-4c74-9e70-becd273d49d8-etcd-service-ca\") pod 
\"etcd-operator-b45778765-fbkfc\" (UID: \"b32f0172-888e-4c74-9e70-becd273d49d8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.398702 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f45d88bc-0c44-4669-8394-081c7e4a8035-audit-dir\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.398741 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fe86a337-8c96-4b07-b3b6-a97315fa1029-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9vv2f\" (UID: \"fe86a337-8c96-4b07-b3b6-a97315fa1029\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9vv2f" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.398764 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5945b22-4fa1-4d4e-aa84-990d28f1e423-cert\") pod \"ingress-canary-7hvht\" (UID: \"e5945b22-4fa1-4d4e-aa84-990d28f1e423\") " pod="openshift-ingress-canary/ingress-canary-7hvht" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.398789 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5ptm\" (UniqueName: \"kubernetes.io/projected/cbe20149-0391-43f7-957b-ad3b18f54736-kube-api-access-h5ptm\") pod \"packageserver-d55dfcdfc-qprtc\" (UID: \"cbe20149-0391-43f7-957b-ad3b18f54736\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.398814 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vg5k\" (UniqueName: 
\"kubernetes.io/projected/4c0a71e8-b6e5-44c8-a813-9177684ab97e-kube-api-access-4vg5k\") pod \"router-default-5444994796-28qqz\" (UID: \"4c0a71e8-b6e5-44c8-a813-9177684ab97e\") " pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.398834 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b32f0172-888e-4c74-9e70-becd273d49d8-etcd-client\") pod \"etcd-operator-b45778765-fbkfc\" (UID: \"b32f0172-888e-4c74-9e70-becd273d49d8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.398865 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25f1a12b-b1f6-49b1-ade9-018684cdd6f3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tjwcf\" (UID: \"25f1a12b-b1f6-49b1-ade9-018684cdd6f3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tjwcf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.398903 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb9jb\" (UniqueName: \"kubernetes.io/projected/d2265b6f-d33f-4bea-9934-1ece55c51d35-kube-api-access-zb9jb\") pod \"olm-operator-6b444d44fb-m6f2h\" (UID: \"d2265b6f-d33f-4bea-9934-1ece55c51d35\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m6f2h" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.398923 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f34c5862-2f1c-4e77-b3e2-b30852607129-node-bootstrap-token\") pod \"machine-config-server-9zqsp\" (UID: \"f34c5862-2f1c-4e77-b3e2-b30852607129\") " pod="openshift-machine-config-operator/machine-config-server-9zqsp" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.398949 4878 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4c0a71e8-b6e5-44c8-a813-9177684ab97e-stats-auth\") pod \"router-default-5444994796-28qqz\" (UID: \"4c0a71e8-b6e5-44c8-a813-9177684ab97e\") " pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.398970 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f45d88bc-0c44-4669-8394-081c7e4a8035-etcd-client\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.398998 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbe20149-0391-43f7-957b-ad3b18f54736-webhook-cert\") pod \"packageserver-d55dfcdfc-qprtc\" (UID: \"cbe20149-0391-43f7-957b-ad3b18f54736\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399024 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/298c0078-6d02-4963-a1fb-0f6713e6d369-secret-volume\") pod \"collect-profiles-29411655-62mmf\" (UID: \"298c0078-6d02-4963-a1fb-0f6713e6d369\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399047 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/37ed9429-a67d-4168-b5e9-211eddd1abb1-plugins-dir\") pod \"csi-hostpathplugin-f59ml\" (UID: \"37ed9429-a67d-4168-b5e9-211eddd1abb1\") " pod="hostpath-provisioner/csi-hostpathplugin-f59ml" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399082 4878 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhjdc\" (UniqueName: \"kubernetes.io/projected/1ba68bd9-d567-48e5-b296-0a95fdf406b5-kube-api-access-vhjdc\") pod \"ingress-operator-5b745b69d9-mtlxg\" (UID: \"1ba68bd9-d567-48e5-b296-0a95fdf406b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399110 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmdjj\" (UniqueName: \"kubernetes.io/projected/14702b8a-c457-4730-919c-7340aea9738e-kube-api-access-nmdjj\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvddh\" (UID: \"14702b8a-c457-4730-919c-7340aea9738e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvddh" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399144 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzx7r\" (UniqueName: \"kubernetes.io/projected/6eb41b93-54fd-4ebe-b7ee-c8a1f1a5f6ab-kube-api-access-wzx7r\") pod \"package-server-manager-789f6589d5-smztw\" (UID: \"6eb41b93-54fd-4ebe-b7ee-c8a1f1a5f6ab\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-smztw" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399167 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ba68bd9-d567-48e5-b296-0a95fdf406b5-trusted-ca\") pod \"ingress-operator-5b745b69d9-mtlxg\" (UID: \"1ba68bd9-d567-48e5-b296-0a95fdf406b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399190 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bfed419b-bf76-4045-9056-50ca33e1686b-signing-cabundle\") pod 
\"service-ca-9c57cc56f-th582\" (UID: \"bfed419b-bf76-4045-9056-50ca33e1686b\") " pod="openshift-service-ca/service-ca-9c57cc56f-th582" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399212 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f047f11c-e08f-47b0-965b-0cc2e8d2bd38-serving-cert\") pod \"service-ca-operator-777779d784-bv27h\" (UID: \"f047f11c-e08f-47b0-965b-0cc2e8d2bd38\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bv27h" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399288 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f34c5862-2f1c-4e77-b3e2-b30852607129-certs\") pod \"machine-config-server-9zqsp\" (UID: \"f34c5862-2f1c-4e77-b3e2-b30852607129\") " pod="openshift-machine-config-operator/machine-config-server-9zqsp" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399320 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkmws\" (UniqueName: \"kubernetes.io/projected/38d2716d-f5de-4242-a170-624490092b98-kube-api-access-fkmws\") pod \"marketplace-operator-79b997595-5p678\" (UID: \"38d2716d-f5de-4242-a170-624490092b98\") " pod="openshift-marketplace/marketplace-operator-79b997595-5p678" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399352 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cc0cb11b-bb32-413e-bbc3-179943e19f88-profile-collector-cert\") pod \"catalog-operator-68c6474976-6c7h4\" (UID: \"cc0cb11b-bb32-413e-bbc3-179943e19f88\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6c7h4" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399381 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/14702b8a-c457-4730-919c-7340aea9738e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvddh\" (UID: \"14702b8a-c457-4730-919c-7340aea9738e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvddh" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399413 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8e39ea4-f7e1-4df7-997b-2c3923f9ad6b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-b6fbw\" (UID: \"a8e39ea4-f7e1-4df7-997b-2c3923f9ad6b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6fbw" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399442 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/37ed9429-a67d-4168-b5e9-211eddd1abb1-csi-data-dir\") pod \"csi-hostpathplugin-f59ml\" (UID: \"37ed9429-a67d-4168-b5e9-211eddd1abb1\") " pod="hostpath-provisioner/csi-hostpathplugin-f59ml" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399463 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/cbe20149-0391-43f7-957b-ad3b18f54736-tmpfs\") pod \"packageserver-d55dfcdfc-qprtc\" (UID: \"cbe20149-0391-43f7-957b-ad3b18f54736\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399496 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b32f0172-888e-4c74-9e70-becd273d49d8-etcd-ca\") pod \"etcd-operator-b45778765-fbkfc\" (UID: \"b32f0172-888e-4c74-9e70-becd273d49d8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" Dec 02 18:17:13 crc 
kubenswrapper[4878]: I1202 18:17:13.399521 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25f1a12b-b1f6-49b1-ade9-018684cdd6f3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tjwcf\" (UID: \"25f1a12b-b1f6-49b1-ade9-018684cdd6f3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tjwcf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399544 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1ba68bd9-d567-48e5-b296-0a95fdf406b5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mtlxg\" (UID: \"1ba68bd9-d567-48e5-b296-0a95fdf406b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399576 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b32f0172-888e-4c74-9e70-becd273d49d8-config\") pod \"etcd-operator-b45778765-fbkfc\" (UID: \"b32f0172-888e-4c74-9e70-becd273d49d8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399611 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fst5\" (UniqueName: \"kubernetes.io/projected/f45d88bc-0c44-4669-8394-081c7e4a8035-kube-api-access-6fst5\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399646 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6gjt\" (UniqueName: \"kubernetes.io/projected/54ef3152-6020-4822-ab68-b4c3ba44dc1c-kube-api-access-d6gjt\") pod \"machine-config-operator-74547568cd-d2f8f\" (UID: 
\"54ef3152-6020-4822-ab68-b4c3ba44dc1c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399687 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ba68bd9-d567-48e5-b296-0a95fdf406b5-metrics-tls\") pod \"ingress-operator-5b745b69d9-mtlxg\" (UID: \"1ba68bd9-d567-48e5-b296-0a95fdf406b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399709 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25f1a12b-b1f6-49b1-ade9-018684cdd6f3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tjwcf\" (UID: \"25f1a12b-b1f6-49b1-ade9-018684cdd6f3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tjwcf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399718 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25f1a12b-b1f6-49b1-ade9-018684cdd6f3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tjwcf\" (UID: \"25f1a12b-b1f6-49b1-ade9-018684cdd6f3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tjwcf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399756 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn952\" (UniqueName: \"kubernetes.io/projected/cc0cb11b-bb32-413e-bbc3-179943e19f88-kube-api-access-dn952\") pod \"catalog-operator-68c6474976-6c7h4\" (UID: \"cc0cb11b-bb32-413e-bbc3-179943e19f88\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6c7h4" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399810 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n5tgz\" (UniqueName: \"kubernetes.io/projected/f34c5862-2f1c-4e77-b3e2-b30852607129-kube-api-access-n5tgz\") pod \"machine-config-server-9zqsp\" (UID: \"f34c5862-2f1c-4e77-b3e2-b30852607129\") " pod="openshift-machine-config-operator/machine-config-server-9zqsp" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399848 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggg7z\" (UniqueName: \"kubernetes.io/projected/8025c62a-fb3f-4566-8fd6-571ddb400da3-kube-api-access-ggg7z\") pod \"dns-default-dmpvh\" (UID: \"8025c62a-fb3f-4566-8fd6-571ddb400da3\") " pod="openshift-dns/dns-default-dmpvh" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399883 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f45d88bc-0c44-4669-8394-081c7e4a8035-audit-policies\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399904 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b32f0172-888e-4c74-9e70-becd273d49d8-etcd-service-ca\") pod \"etcd-operator-b45778765-fbkfc\" (UID: \"b32f0172-888e-4c74-9e70-becd273d49d8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399937 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/37ed9429-a67d-4168-b5e9-211eddd1abb1-plugins-dir\") pod \"csi-hostpathplugin-f59ml\" (UID: \"37ed9429-a67d-4168-b5e9-211eddd1abb1\") " pod="hostpath-provisioner/csi-hostpathplugin-f59ml" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.399948 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/d2265b6f-d33f-4bea-9934-1ece55c51d35-srv-cert\") pod \"olm-operator-6b444d44fb-m6f2h\" (UID: \"d2265b6f-d33f-4bea-9934-1ece55c51d35\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m6f2h" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.400224 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.400320 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ghd9\" (UniqueName: \"kubernetes.io/projected/fe86a337-8c96-4b07-b3b6-a97315fa1029-kube-api-access-8ghd9\") pod \"multus-admission-controller-857f4d67dd-9vv2f\" (UID: \"fe86a337-8c96-4b07-b3b6-a97315fa1029\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9vv2f" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.400360 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbe20149-0391-43f7-957b-ad3b18f54736-apiservice-cert\") pod \"packageserver-d55dfcdfc-qprtc\" (UID: \"cbe20149-0391-43f7-957b-ad3b18f54736\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.400404 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bfed419b-bf76-4045-9056-50ca33e1686b-signing-key\") pod \"service-ca-9c57cc56f-th582\" (UID: \"bfed419b-bf76-4045-9056-50ca33e1686b\") " pod="openshift-service-ca/service-ca-9c57cc56f-th582" Dec 02 18:17:13 crc kubenswrapper[4878]: 
I1202 18:17:13.400443 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/883f2381-011d-410d-84cd-7ed27e099ebf-proxy-tls\") pod \"machine-config-controller-84d6567774-wpxp8\" (UID: \"883f2381-011d-410d-84cd-7ed27e099ebf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpxp8" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.400481 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03b22451-6028-4e14-bd65-30d12b0242b3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zwtvf\" (UID: \"03b22451-6028-4e14-bd65-30d12b0242b3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwtvf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.400518 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/54ef3152-6020-4822-ab68-b4c3ba44dc1c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d2f8f\" (UID: \"54ef3152-6020-4822-ab68-b4c3ba44dc1c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.400555 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hsnm\" (UniqueName: \"kubernetes.io/projected/b32f0172-888e-4c74-9e70-becd273d49d8-kube-api-access-2hsnm\") pod \"etcd-operator-b45778765-fbkfc\" (UID: \"b32f0172-888e-4c74-9e70-becd273d49d8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.400592 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f45d88bc-0c44-4669-8394-081c7e4a8035-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: 
\"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.400626 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f45d88bc-0c44-4669-8394-081c7e4a8035-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.400659 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cc0cb11b-bb32-413e-bbc3-179943e19f88-srv-cert\") pod \"catalog-operator-68c6474976-6c7h4\" (UID: \"cc0cb11b-bb32-413e-bbc3-179943e19f88\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6c7h4" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.400691 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b22451-6028-4e14-bd65-30d12b0242b3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zwtvf\" (UID: \"03b22451-6028-4e14-bd65-30d12b0242b3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwtvf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.400758 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b32f0172-888e-4c74-9e70-becd273d49d8-serving-cert\") pod \"etcd-operator-b45778765-fbkfc\" (UID: \"b32f0172-888e-4c74-9e70-becd273d49d8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.400792 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/54ef3152-6020-4822-ab68-b4c3ba44dc1c-proxy-tls\") pod 
\"machine-config-operator-74547568cd-d2f8f\" (UID: \"54ef3152-6020-4822-ab68-b4c3ba44dc1c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.400832 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14702b8a-c457-4730-919c-7340aea9738e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvddh\" (UID: \"14702b8a-c457-4730-919c-7340aea9738e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvddh" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.400863 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f047f11c-e08f-47b0-965b-0cc2e8d2bd38-config\") pod \"service-ca-operator-777779d784-bv27h\" (UID: \"f047f11c-e08f-47b0-965b-0cc2e8d2bd38\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bv27h" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.400895 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/37ed9429-a67d-4168-b5e9-211eddd1abb1-mountpoint-dir\") pod \"csi-hostpathplugin-f59ml\" (UID: \"37ed9429-a67d-4168-b5e9-211eddd1abb1\") " pod="hostpath-provisioner/csi-hostpathplugin-f59ml" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.400964 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8025c62a-fb3f-4566-8fd6-571ddb400da3-metrics-tls\") pod \"dns-default-dmpvh\" (UID: \"8025c62a-fb3f-4566-8fd6-571ddb400da3\") " pod="openshift-dns/dns-default-dmpvh" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401013 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npjc8\" (UniqueName: 
\"kubernetes.io/projected/a8e39ea4-f7e1-4df7-997b-2c3923f9ad6b-kube-api-access-npjc8\") pod \"control-plane-machine-set-operator-78cbb6b69f-b6fbw\" (UID: \"a8e39ea4-f7e1-4df7-997b-2c3923f9ad6b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6fbw" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401052 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c0a71e8-b6e5-44c8-a813-9177684ab97e-metrics-certs\") pod \"router-default-5444994796-28qqz\" (UID: \"4c0a71e8-b6e5-44c8-a813-9177684ab97e\") " pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401126 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn48b\" (UniqueName: \"kubernetes.io/projected/3b6800a2-0b99-4c3b-9e5e-833a245bd7be-kube-api-access-kn48b\") pod \"migrator-59844c95c7-dtnr7\" (UID: \"3b6800a2-0b99-4c3b-9e5e-833a245bd7be\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dtnr7" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401165 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f45d88bc-0c44-4669-8394-081c7e4a8035-encryption-config\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401204 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d2265b6f-d33f-4bea-9934-1ece55c51d35-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m6f2h\" (UID: \"d2265b6f-d33f-4bea-9934-1ece55c51d35\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m6f2h" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 
18:17:13.401278 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/298c0078-6d02-4963-a1fb-0f6713e6d369-config-volume\") pod \"collect-profiles-29411655-62mmf\" (UID: \"298c0078-6d02-4963-a1fb-0f6713e6d369\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401310 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/37ed9429-a67d-4168-b5e9-211eddd1abb1-socket-dir\") pod \"csi-hostpathplugin-f59ml\" (UID: \"37ed9429-a67d-4168-b5e9-211eddd1abb1\") " pod="hostpath-provisioner/csi-hostpathplugin-f59ml" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401350 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/38d2716d-f5de-4242-a170-624490092b98-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5p678\" (UID: \"38d2716d-f5de-4242-a170-624490092b98\") " pod="openshift-marketplace/marketplace-operator-79b997595-5p678" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401389 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sw9s\" (UniqueName: \"kubernetes.io/projected/f047f11c-e08f-47b0-965b-0cc2e8d2bd38-kube-api-access-4sw9s\") pod \"service-ca-operator-777779d784-bv27h\" (UID: \"f047f11c-e08f-47b0-965b-0cc2e8d2bd38\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bv27h" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401425 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4ht8\" (UniqueName: \"kubernetes.io/projected/298c0078-6d02-4963-a1fb-0f6713e6d369-kube-api-access-h4ht8\") pod \"collect-profiles-29411655-62mmf\" (UID: \"298c0078-6d02-4963-a1fb-0f6713e6d369\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401459 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9878\" (UniqueName: \"kubernetes.io/projected/e5945b22-4fa1-4d4e-aa84-990d28f1e423-kube-api-access-f9878\") pod \"ingress-canary-7hvht\" (UID: \"e5945b22-4fa1-4d4e-aa84-990d28f1e423\") " pod="openshift-ingress-canary/ingress-canary-7hvht" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401493 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6eb41b93-54fd-4ebe-b7ee-c8a1f1a5f6ab-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-smztw\" (UID: \"6eb41b93-54fd-4ebe-b7ee-c8a1f1a5f6ab\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-smztw" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401547 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03b22451-6028-4e14-bd65-30d12b0242b3-config\") pod \"kube-apiserver-operator-766d6c64bb-zwtvf\" (UID: \"03b22451-6028-4e14-bd65-30d12b0242b3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwtvf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401578 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/883f2381-011d-410d-84cd-7ed27e099ebf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wpxp8\" (UID: \"883f2381-011d-410d-84cd-7ed27e099ebf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpxp8" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401642 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/8025c62a-fb3f-4566-8fd6-571ddb400da3-config-volume\") pod \"dns-default-dmpvh\" (UID: \"8025c62a-fb3f-4566-8fd6-571ddb400da3\") " pod="openshift-dns/dns-default-dmpvh" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401678 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/37ed9429-a67d-4168-b5e9-211eddd1abb1-registration-dir\") pod \"csi-hostpathplugin-f59ml\" (UID: \"37ed9429-a67d-4168-b5e9-211eddd1abb1\") " pod="hostpath-provisioner/csi-hostpathplugin-f59ml" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401734 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw8z6\" (UniqueName: \"kubernetes.io/projected/bfed419b-bf76-4045-9056-50ca33e1686b-kube-api-access-xw8z6\") pod \"service-ca-9c57cc56f-th582\" (UID: \"bfed419b-bf76-4045-9056-50ca33e1686b\") " pod="openshift-service-ca/service-ca-9c57cc56f-th582" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401766 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/54ef3152-6020-4822-ab68-b4c3ba44dc1c-images\") pod \"machine-config-operator-74547568cd-d2f8f\" (UID: \"54ef3152-6020-4822-ab68-b4c3ba44dc1c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401798 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c0a71e8-b6e5-44c8-a813-9177684ab97e-service-ca-bundle\") pod \"router-default-5444994796-28qqz\" (UID: \"4c0a71e8-b6e5-44c8-a813-9177684ab97e\") " pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401811 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/1ba68bd9-d567-48e5-b296-0a95fdf406b5-trusted-ca\") pod \"ingress-operator-5b745b69d9-mtlxg\" (UID: \"1ba68bd9-d567-48e5-b296-0a95fdf406b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401847 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blfmq\" (UniqueName: \"kubernetes.io/projected/37ed9429-a67d-4168-b5e9-211eddd1abb1-kube-api-access-blfmq\") pod \"csi-hostpathplugin-f59ml\" (UID: \"37ed9429-a67d-4168-b5e9-211eddd1abb1\") " pod="hostpath-provisioner/csi-hostpathplugin-f59ml" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401882 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38d2716d-f5de-4242-a170-624490092b98-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5p678\" (UID: \"38d2716d-f5de-4242-a170-624490092b98\") " pod="openshift-marketplace/marketplace-operator-79b997595-5p678" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.401945 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f45d88bc-0c44-4669-8394-081c7e4a8035-serving-cert\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:13 crc kubenswrapper[4878]: E1202 18:17:13.402551 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:13.902533088 +0000 UTC m=+143.592151979 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.402813 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b32f0172-888e-4c74-9e70-becd273d49d8-etcd-client\") pod \"etcd-operator-b45778765-fbkfc\" (UID: \"b32f0172-888e-4c74-9e70-becd273d49d8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.402964 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/37ed9429-a67d-4168-b5e9-211eddd1abb1-csi-data-dir\") pod \"csi-hostpathplugin-f59ml\" (UID: \"37ed9429-a67d-4168-b5e9-211eddd1abb1\") " pod="hostpath-provisioner/csi-hostpathplugin-f59ml" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.404134 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d2265b6f-d33f-4bea-9934-1ece55c51d35-srv-cert\") pod \"olm-operator-6b444d44fb-m6f2h\" (UID: \"d2265b6f-d33f-4bea-9934-1ece55c51d35\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m6f2h" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.404343 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/54ef3152-6020-4822-ab68-b4c3ba44dc1c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d2f8f\" (UID: \"54ef3152-6020-4822-ab68-b4c3ba44dc1c\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.404575 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/37ed9429-a67d-4168-b5e9-211eddd1abb1-registration-dir\") pod \"csi-hostpathplugin-f59ml\" (UID: \"37ed9429-a67d-4168-b5e9-211eddd1abb1\") " pod="hostpath-provisioner/csi-hostpathplugin-f59ml" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.404574 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/37ed9429-a67d-4168-b5e9-211eddd1abb1-mountpoint-dir\") pod \"csi-hostpathplugin-f59ml\" (UID: \"37ed9429-a67d-4168-b5e9-211eddd1abb1\") " pod="hostpath-provisioner/csi-hostpathplugin-f59ml" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.404925 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/37ed9429-a67d-4168-b5e9-211eddd1abb1-socket-dir\") pod \"csi-hostpathplugin-f59ml\" (UID: \"37ed9429-a67d-4168-b5e9-211eddd1abb1\") " pod="hostpath-provisioner/csi-hostpathplugin-f59ml" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.404970 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b32f0172-888e-4c74-9e70-becd273d49d8-config\") pod \"etcd-operator-b45778765-fbkfc\" (UID: \"b32f0172-888e-4c74-9e70-becd273d49d8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.405062 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/cbe20149-0391-43f7-957b-ad3b18f54736-tmpfs\") pod \"packageserver-d55dfcdfc-qprtc\" (UID: \"cbe20149-0391-43f7-957b-ad3b18f54736\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.405351 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b32f0172-888e-4c74-9e70-becd273d49d8-etcd-ca\") pod \"etcd-operator-b45778765-fbkfc\" (UID: \"b32f0172-888e-4c74-9e70-becd273d49d8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.406453 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cc0cb11b-bb32-413e-bbc3-179943e19f88-profile-collector-cert\") pod \"catalog-operator-68c6474976-6c7h4\" (UID: \"cc0cb11b-bb32-413e-bbc3-179943e19f88\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6c7h4" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.406878 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/883f2381-011d-410d-84cd-7ed27e099ebf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wpxp8\" (UID: \"883f2381-011d-410d-84cd-7ed27e099ebf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpxp8" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.407483 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.407642 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25f1a12b-b1f6-49b1-ade9-018684cdd6f3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tjwcf\" (UID: \"25f1a12b-b1f6-49b1-ade9-018684cdd6f3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tjwcf" Dec 02 18:17:13 
crc kubenswrapper[4878]: I1202 18:17:13.407940 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b32f0172-888e-4c74-9e70-becd273d49d8-serving-cert\") pod \"etcd-operator-b45778765-fbkfc\" (UID: \"b32f0172-888e-4c74-9e70-becd273d49d8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.409003 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ba68bd9-d567-48e5-b296-0a95fdf406b5-metrics-tls\") pod \"ingress-operator-5b745b69d9-mtlxg\" (UID: \"1ba68bd9-d567-48e5-b296-0a95fdf406b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.409324 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/298c0078-6d02-4963-a1fb-0f6713e6d369-secret-volume\") pod \"collect-profiles-29411655-62mmf\" (UID: \"298c0078-6d02-4963-a1fb-0f6713e6d369\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.410309 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d2265b6f-d33f-4bea-9934-1ece55c51d35-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m6f2h\" (UID: \"d2265b6f-d33f-4bea-9934-1ece55c51d35\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m6f2h" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.427954 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.450032 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" 
Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.458761 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b22451-6028-4e14-bd65-30d12b0242b3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zwtvf\" (UID: \"03b22451-6028-4e14-bd65-30d12b0242b3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwtvf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.468006 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.475470 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03b22451-6028-4e14-bd65-30d12b0242b3-config\") pod \"kube-apiserver-operator-766d6c64bb-zwtvf\" (UID: \"03b22451-6028-4e14-bd65-30d12b0242b3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwtvf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.489132 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.502542 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:13 crc kubenswrapper[4878]: E1202 18:17:13.502762 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 18:17:14.002731719 +0000 UTC m=+143.692350600 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.503165 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: E1202 18:17:13.503521 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.003508906 +0000 UTC m=+143.693127787 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.508317 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.517708 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8e39ea4-f7e1-4df7-997b-2c3923f9ad6b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-b6fbw\" (UID: \"a8e39ea4-f7e1-4df7-997b-2c3923f9ad6b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6fbw" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.528546 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.568307 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.588606 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.605403 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:13 crc kubenswrapper[4878]: E1202 18:17:13.605645 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.105617661 +0000 UTC m=+143.795236532 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.606304 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: E1202 18:17:13.606866 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.106849624 +0000 UTC m=+143.796468505 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.608064 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.616860 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/883f2381-011d-410d-84cd-7ed27e099ebf-proxy-tls\") pod \"machine-config-controller-84d6567774-wpxp8\" (UID: \"883f2381-011d-410d-84cd-7ed27e099ebf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpxp8" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.628356 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.648660 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.660340 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/38d2716d-f5de-4242-a170-624490092b98-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5p678\" (UID: \"38d2716d-f5de-4242-a170-624490092b98\") " pod="openshift-marketplace/marketplace-operator-79b997595-5p678" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.668561 4878 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.677367 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cc0cb11b-bb32-413e-bbc3-179943e19f88-srv-cert\") pod \"catalog-operator-68c6474976-6c7h4\" (UID: \"cc0cb11b-bb32-413e-bbc3-179943e19f88\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6c7h4" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.696317 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.706887 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38d2716d-f5de-4242-a170-624490092b98-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5p678\" (UID: \"38d2716d-f5de-4242-a170-624490092b98\") " pod="openshift-marketplace/marketplace-operator-79b997595-5p678" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.707381 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:13 crc kubenswrapper[4878]: E1202 18:17:13.707801 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.207763419 +0000 UTC m=+143.897382330 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.708931 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.708969 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: E1202 18:17:13.709732 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.209669894 +0000 UTC m=+143.899288815 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.728141 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.748165 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.768336 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.788219 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.808788 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.810222 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:13 crc kubenswrapper[4878]: E1202 18:17:13.810438 4878 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.310409564 +0000 UTC m=+144.000028455 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.810911 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: E1202 18:17:13.811289 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.311275333 +0000 UTC m=+144.000894214 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.812813 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fe86a337-8c96-4b07-b3b6-a97315fa1029-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9vv2f\" (UID: \"fe86a337-8c96-4b07-b3b6-a97315fa1029\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9vv2f" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.828143 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.834911 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14702b8a-c457-4730-919c-7340aea9738e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvddh\" (UID: \"14702b8a-c457-4730-919c-7340aea9738e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvddh" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.847528 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.868459 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 02 18:17:13 crc 
kubenswrapper[4878]: I1202 18:17:13.888594 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.908157 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.912825 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:13 crc kubenswrapper[4878]: E1202 18:17:13.913077 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.413045307 +0000 UTC m=+144.102664218 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.914823 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:13 crc kubenswrapper[4878]: E1202 18:17:13.915432 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.415413658 +0000 UTC m=+144.105032569 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.919663 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14702b8a-c457-4730-919c-7340aea9738e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvddh\" (UID: \"14702b8a-c457-4730-919c-7340aea9738e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvddh" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.927517 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.937264 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.939529 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6eb41b93-54fd-4ebe-b7ee-c8a1f1a5f6ab-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-smztw\" (UID: \"6eb41b93-54fd-4ebe-b7ee-c8a1f1a5f6ab\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-smztw" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.946206 4878 request.go:700] Waited for 1.016373599s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpackageserver-service-cert&limit=500&resourceVersion=0 Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.949107 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.954165 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbe20149-0391-43f7-957b-ad3b18f54736-webhook-cert\") pod \"packageserver-d55dfcdfc-qprtc\" (UID: \"cbe20149-0391-43f7-957b-ad3b18f54736\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.956742 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbe20149-0391-43f7-957b-ad3b18f54736-apiservice-cert\") pod \"packageserver-d55dfcdfc-qprtc\" (UID: \"cbe20149-0391-43f7-957b-ad3b18f54736\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.968282 4878 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.976270 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/298c0078-6d02-4963-a1fb-0f6713e6d369-config-volume\") pod \"collect-profiles-29411655-62mmf\" (UID: \"298c0078-6d02-4963-a1fb-0f6713e6d369\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf" Dec 02 18:17:13 crc kubenswrapper[4878]: I1202 18:17:13.989276 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.007493 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.015745 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c0a71e8-b6e5-44c8-a813-9177684ab97e-service-ca-bundle\") pod \"router-default-5444994796-28qqz\" (UID: \"4c0a71e8-b6e5-44c8-a813-9177684ab97e\") " pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.016068 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.017182 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 18:17:14.516209909 +0000 UTC m=+144.205828810 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.017635 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.020444 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.520400943 +0000 UTC m=+144.210019844 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.027503 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.048797 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.053777 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4c0a71e8-b6e5-44c8-a813-9177684ab97e-default-certificate\") pod \"router-default-5444994796-28qqz\" (UID: \"4c0a71e8-b6e5-44c8-a813-9177684ab97e\") " pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.068738 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.075591 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/54ef3152-6020-4822-ab68-b4c3ba44dc1c-images\") pod \"machine-config-operator-74547568cd-d2f8f\" (UID: \"54ef3152-6020-4822-ab68-b4c3ba44dc1c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.088914 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 
02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.094810 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4c0a71e8-b6e5-44c8-a813-9177684ab97e-stats-auth\") pod \"router-default-5444994796-28qqz\" (UID: \"4c0a71e8-b6e5-44c8-a813-9177684ab97e\") " pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.109007 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.119468 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c0a71e8-b6e5-44c8-a813-9177684ab97e-metrics-certs\") pod \"router-default-5444994796-28qqz\" (UID: \"4c0a71e8-b6e5-44c8-a813-9177684ab97e\") " pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.119637 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.119849 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.619812216 +0000 UTC m=+144.309431097 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.120537 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.120773 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.620760809 +0000 UTC m=+144.310379680 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.128067 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.148420 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.168039 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.178157 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/54ef3152-6020-4822-ab68-b4c3ba44dc1c-proxy-tls\") pod \"machine-config-operator-74547568cd-d2f8f\" (UID: \"54ef3152-6020-4822-ab68-b4c3ba44dc1c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.188014 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.208251 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.221432 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.221858 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.72183508 +0000 UTC m=+144.411453971 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.222532 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.223087 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.723078483 +0000 UTC m=+144.412697354 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.227865 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.247569 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.251097 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bfed419b-bf76-4045-9056-50ca33e1686b-signing-cabundle\") pod \"service-ca-9c57cc56f-th582\" (UID: \"bfed419b-bf76-4045-9056-50ca33e1686b\") " pod="openshift-service-ca/service-ca-9c57cc56f-th582" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.269850 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.280632 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bfed419b-bf76-4045-9056-50ca33e1686b-signing-key\") pod \"service-ca-9c57cc56f-th582\" (UID: \"bfed419b-bf76-4045-9056-50ca33e1686b\") " pod="openshift-service-ca/service-ca-9c57cc56f-th582" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.289303 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.308888 
4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.323796 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.324197 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.824179024 +0000 UTC m=+144.513797905 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.324631 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.324974 4878 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.824957021 +0000 UTC m=+144.514575902 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.327670 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.347903 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.371265 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.388867 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.398049 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f047f11c-e08f-47b0-965b-0cc2e8d2bd38-serving-cert\") pod \"service-ca-operator-777779d784-bv27h\" (UID: \"f047f11c-e08f-47b0-965b-0cc2e8d2bd38\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bv27h" Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.398957 4878 secret.go:188] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.399081 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5945b22-4fa1-4d4e-aa84-990d28f1e423-cert podName:e5945b22-4fa1-4d4e-aa84-990d28f1e423 nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.899055681 +0000 UTC m=+144.588674632 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5945b22-4fa1-4d4e-aa84-990d28f1e423-cert") pod "ingress-canary-7hvht" (UID: "e5945b22-4fa1-4d4e-aa84-990d28f1e423") : failed to sync secret cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.400076 4878 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.400118 4878 secret.go:188] Couldn't get secret openshift-oauth-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.400166 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f34c5862-2f1c-4e77-b3e2-b30852607129-node-bootstrap-token podName:f34c5862-2f1c-4e77-b3e2-b30852607129 nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.900143858 +0000 UTC m=+144.589762749 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/f34c5862-2f1c-4e77-b3e2-b30852607129-node-bootstrap-token") pod "machine-config-server-9zqsp" (UID: "f34c5862-2f1c-4e77-b3e2-b30852607129") : failed to sync secret cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.400191 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f45d88bc-0c44-4669-8394-081c7e4a8035-etcd-client podName:f45d88bc-0c44-4669-8394-081c7e4a8035 nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.900180889 +0000 UTC m=+144.589799790 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/f45d88bc-0c44-4669-8394-081c7e4a8035-etcd-client") pod "apiserver-7bbb656c7d-fvb6t" (UID: "f45d88bc-0c44-4669-8394-081c7e4a8035") : failed to sync secret cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.402924 4878 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.403050 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f34c5862-2f1c-4e77-b3e2-b30852607129-certs podName:f34c5862-2f1c-4e77-b3e2-b30852607129 nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.903030776 +0000 UTC m=+144.592649657 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/f34c5862-2f1c-4e77-b3e2-b30852607129-certs") pod "machine-config-server-9zqsp" (UID: "f34c5862-2f1c-4e77-b3e2-b30852607129") : failed to sync secret cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.403349 4878 secret.go:188] Couldn't get secret openshift-oauth-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.403562 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f45d88bc-0c44-4669-8394-081c7e4a8035-encryption-config podName:f45d88bc-0c44-4669-8394-081c7e4a8035 nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.903535223 +0000 UTC m=+144.593154164 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/f45d88bc-0c44-4669-8394-081c7e4a8035-encryption-config") pod "apiserver-7bbb656c7d-fvb6t" (UID: "f45d88bc-0c44-4669-8394-081c7e4a8035") : failed to sync secret cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.404111 4878 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.404167 4878 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.404187 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f45d88bc-0c44-4669-8394-081c7e4a8035-etcd-serving-ca podName:f45d88bc-0c44-4669-8394-081c7e4a8035 nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.904178226 +0000 UTC m=+144.593797187 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/f45d88bc-0c44-4669-8394-081c7e4a8035-etcd-serving-ca") pod "apiserver-7bbb656c7d-fvb6t" (UID: "f45d88bc-0c44-4669-8394-081c7e4a8035") : failed to sync configmap cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.404281 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f45d88bc-0c44-4669-8394-081c7e4a8035-trusted-ca-bundle podName:f45d88bc-0c44-4669-8394-081c7e4a8035 nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.904216797 +0000 UTC m=+144.593835728 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/f45d88bc-0c44-4669-8394-081c7e4a8035-trusted-ca-bundle") pod "apiserver-7bbb656c7d-fvb6t" (UID: "f45d88bc-0c44-4669-8394-081c7e4a8035") : failed to sync configmap cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.404325 4878 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.404371 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8025c62a-fb3f-4566-8fd6-571ddb400da3-metrics-tls podName:8025c62a-fb3f-4566-8fd6-571ddb400da3 nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.904355692 +0000 UTC m=+144.593974673 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8025c62a-fb3f-4566-8fd6-571ddb400da3-metrics-tls") pod "dns-default-dmpvh" (UID: "8025c62a-fb3f-4566-8fd6-571ddb400da3") : failed to sync secret cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.404418 4878 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.404454 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f45d88bc-0c44-4669-8394-081c7e4a8035-audit-policies podName:f45d88bc-0c44-4669-8394-081c7e4a8035 nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.904441975 +0000 UTC m=+144.594060976 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/f45d88bc-0c44-4669-8394-081c7e4a8035-audit-policies") pod "apiserver-7bbb656c7d-fvb6t" (UID: "f45d88bc-0c44-4669-8394-081c7e4a8035") : failed to sync configmap cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.404462 4878 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.404631 4878 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.404742 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f047f11c-e08f-47b0-965b-0cc2e8d2bd38-config podName:f047f11c-e08f-47b0-965b-0cc2e8d2bd38 nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.904717124 +0000 UTC m=+144.594336045 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f047f11c-e08f-47b0-965b-0cc2e8d2bd38-config") pod "service-ca-operator-777779d784-bv27h" (UID: "f047f11c-e08f-47b0-965b-0cc2e8d2bd38") : failed to sync configmap cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.404855 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8025c62a-fb3f-4566-8fd6-571ddb400da3-config-volume podName:8025c62a-fb3f-4566-8fd6-571ddb400da3 nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.904804437 +0000 UTC m=+144.594423438 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/8025c62a-fb3f-4566-8fd6-571ddb400da3-config-volume") pod "dns-default-dmpvh" (UID: "8025c62a-fb3f-4566-8fd6-571ddb400da3") : failed to sync configmap cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.405296 4878 secret.go:188] Couldn't get secret openshift-oauth-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.405444 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f45d88bc-0c44-4669-8394-081c7e4a8035-serving-cert podName:f45d88bc-0c44-4669-8394-081c7e4a8035 nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.905378426 +0000 UTC m=+144.594997297 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f45d88bc-0c44-4669-8394-081c7e4a8035-serving-cert") pod "apiserver-7bbb656c7d-fvb6t" (UID: "f45d88bc-0c44-4669-8394-081c7e4a8035") : failed to sync secret cache: timed out waiting for the condition Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.408421 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.425783 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.425980 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.925951619 +0000 UTC m=+144.615570500 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.426838 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.427401 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:14.927313056 +0000 UTC m=+144.616931937 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.427506 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.448104 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.467522 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.489115 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.507497 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.527826 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.528966 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:14 crc 
kubenswrapper[4878]: E1202 18:17:14.529156 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:15.029114631 +0000 UTC m=+144.718733522 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.529279 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.529729 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:15.029708811 +0000 UTC m=+144.719327692 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.549034 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.568972 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.588475 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.608569 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.628424 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.630844 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.631169 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:15.131128184 +0000 UTC m=+144.820747095 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.631597 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.632009 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:15.131995784 +0000 UTC m=+144.821614665 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.648032 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.668379 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.688608 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.708552 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.728585 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.732731 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.732948 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:15.232917879 +0000 UTC m=+144.922536780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.733879 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.734386 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:15.234369979 +0000 UTC m=+144.923988860 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.750625 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.769057 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.788880 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.808115 4878 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.829136 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.834921 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.835067 4878 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:15.335041576 +0000 UTC m=+145.024660457 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.836082 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.836438 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:15.336430373 +0000 UTC m=+145.026049254 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.848119 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.886354 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp7tg\" (UniqueName: \"kubernetes.io/projected/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-kube-api-access-fp7tg\") pod \"oauth-openshift-558db77b4-wvrhn\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.916669 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mzj5\" (UniqueName: \"kubernetes.io/projected/2249a89f-5c41-4f17-a308-fe106e91ece9-kube-api-access-8mzj5\") pod \"cluster-image-registry-operator-dc59b4c8b-bw4k8\" (UID: \"2249a89f-5c41-4f17-a308-fe106e91ece9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bw4k8" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.924931 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-897k9\" (UniqueName: \"kubernetes.io/projected/7c1480df-aed9-4266-802c-d217699bd9ad-kube-api-access-897k9\") pod \"openshift-controller-manager-operator-756b6f6bc6-g4km7\" (UID: \"7c1480df-aed9-4266-802c-d217699bd9ad\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g4km7" Dec 02 18:17:14 crc 
kubenswrapper[4878]: I1202 18:17:14.937768 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.938171 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:15.438123455 +0000 UTC m=+145.127742416 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.938408 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f34c5862-2f1c-4e77-b3e2-b30852607129-certs\") pod \"machine-config-server-9zqsp\" (UID: \"f34c5862-2f1c-4e77-b3e2-b30852607129\") " pod="openshift-machine-config-operator/machine-config-server-9zqsp" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.938746 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f45d88bc-0c44-4669-8394-081c7e4a8035-audit-policies\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.938818 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.938959 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f45d88bc-0c44-4669-8394-081c7e4a8035-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.939020 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f45d88bc-0c44-4669-8394-081c7e4a8035-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.939118 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f047f11c-e08f-47b0-965b-0cc2e8d2bd38-config\") pod \"service-ca-operator-777779d784-bv27h\" (UID: \"f047f11c-e08f-47b0-965b-0cc2e8d2bd38\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bv27h" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.939190 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8025c62a-fb3f-4566-8fd6-571ddb400da3-metrics-tls\") pod 
\"dns-default-dmpvh\" (UID: \"8025c62a-fb3f-4566-8fd6-571ddb400da3\") " pod="openshift-dns/dns-default-dmpvh" Dec 02 18:17:14 crc kubenswrapper[4878]: E1202 18:17:14.939334 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:15.439313986 +0000 UTC m=+145.128932867 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.939395 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f45d88bc-0c44-4669-8394-081c7e4a8035-encryption-config\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.939491 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8025c62a-fb3f-4566-8fd6-571ddb400da3-config-volume\") pod \"dns-default-dmpvh\" (UID: \"8025c62a-fb3f-4566-8fd6-571ddb400da3\") " pod="openshift-dns/dns-default-dmpvh" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.939511 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f45d88bc-0c44-4669-8394-081c7e4a8035-audit-policies\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: 
\"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.939555 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f45d88bc-0c44-4669-8394-081c7e4a8035-serving-cert\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.939617 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5945b22-4fa1-4d4e-aa84-990d28f1e423-cert\") pod \"ingress-canary-7hvht\" (UID: \"e5945b22-4fa1-4d4e-aa84-990d28f1e423\") " pod="openshift-ingress-canary/ingress-canary-7hvht" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.939679 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f34c5862-2f1c-4e77-b3e2-b30852607129-node-bootstrap-token\") pod \"machine-config-server-9zqsp\" (UID: \"f34c5862-2f1c-4e77-b3e2-b30852607129\") " pod="openshift-machine-config-operator/machine-config-server-9zqsp" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.939699 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f45d88bc-0c44-4669-8394-081c7e4a8035-etcd-client\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.940434 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f45d88bc-0c44-4669-8394-081c7e4a8035-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: 
\"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.940437 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f45d88bc-0c44-4669-8394-081c7e4a8035-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.940994 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f047f11c-e08f-47b0-965b-0cc2e8d2bd38-config\") pod \"service-ca-operator-777779d784-bv27h\" (UID: \"f047f11c-e08f-47b0-965b-0cc2e8d2bd38\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bv27h" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.943899 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5945b22-4fa1-4d4e-aa84-990d28f1e423-cert\") pod \"ingress-canary-7hvht\" (UID: \"e5945b22-4fa1-4d4e-aa84-990d28f1e423\") " pod="openshift-ingress-canary/ingress-canary-7hvht" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.944663 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f45d88bc-0c44-4669-8394-081c7e4a8035-serving-cert\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.944213 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f45d88bc-0c44-4669-8394-081c7e4a8035-encryption-config\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.944960 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f34c5862-2f1c-4e77-b3e2-b30852607129-certs\") pod \"machine-config-server-9zqsp\" (UID: \"f34c5862-2f1c-4e77-b3e2-b30852607129\") " pod="openshift-machine-config-operator/machine-config-server-9zqsp" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.947317 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f34c5862-2f1c-4e77-b3e2-b30852607129-node-bootstrap-token\") pod \"machine-config-server-9zqsp\" (UID: \"f34c5862-2f1c-4e77-b3e2-b30852607129\") " pod="openshift-machine-config-operator/machine-config-server-9zqsp" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.947614 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j9m9\" (UniqueName: \"kubernetes.io/projected/3a9d4a2c-13b3-458c-92e8-058e9f1206dd-kube-api-access-9j9m9\") pod \"cluster-samples-operator-665b6dd947-t7x7s\" (UID: \"3a9d4a2c-13b3-458c-92e8-058e9f1206dd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t7x7s" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.947940 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f45d88bc-0c44-4669-8394-081c7e4a8035-etcd-client\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: \"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.965915 4878 request.go:700] Waited for 1.980850118s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/serviceaccounts/machine-api-operator/token Dec 02 18:17:14 crc 
kubenswrapper[4878]: I1202 18:17:14.966679 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-447rt\" (UniqueName: \"kubernetes.io/projected/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-kube-api-access-447rt\") pod \"console-f9d7485db-w6mf2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:14 crc kubenswrapper[4878]: I1202 18:17:14.996451 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4zlk\" (UniqueName: \"kubernetes.io/projected/ea820fcf-7d34-4381-bafa-cbc53d3f7c86-kube-api-access-n4zlk\") pod \"machine-api-operator-5694c8668f-ngj62\" (UID: \"ea820fcf-7d34-4381-bafa-cbc53d3f7c86\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ngj62" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.009998 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkzsx\" (UniqueName: \"kubernetes.io/projected/0e8bdc92-b1ec-4300-8895-fc8e804455da-kube-api-access-zkzsx\") pod \"openshift-config-operator-7777fb866f-zqt5l\" (UID: \"0e8bdc92-b1ec-4300-8895-fc8e804455da\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zqt5l" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.015113 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t7x7s" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.029501 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g4km7" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.037551 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2249a89f-5c41-4f17-a308-fe106e91ece9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bw4k8\" (UID: \"2249a89f-5c41-4f17-a308-fe106e91ece9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bw4k8" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.040894 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:15 crc kubenswrapper[4878]: E1202 18:17:15.041094 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:15.541061549 +0000 UTC m=+145.230680440 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.041659 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:15 crc kubenswrapper[4878]: E1202 18:17:15.042118 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:15.542098035 +0000 UTC m=+145.231716916 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.051271 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsph4\" (UniqueName: \"kubernetes.io/projected/6951ee33-1345-4ae2-906d-fdc7cea4dc64-kube-api-access-fsph4\") pod \"apiserver-76f77b778f-ffktl\" (UID: \"6951ee33-1345-4ae2-906d-fdc7cea4dc64\") " pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.052642 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ngj62" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.065478 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bw4k8" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.070900 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bszbt\" (UniqueName: \"kubernetes.io/projected/44e99cf7-e103-451a-91cc-0d610e4190a9-kube-api-access-bszbt\") pod \"downloads-7954f5f757-zbpll\" (UID: \"44e99cf7-e103-451a-91cc-0d610e4190a9\") " pod="openshift-console/downloads-7954f5f757-zbpll" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.073726 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.084635 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-zbpll" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.086498 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg4fw\" (UniqueName: \"kubernetes.io/projected/bdf471eb-c41c-40c0-b038-5ba883d2154a-kube-api-access-sg4fw\") pod \"route-controller-manager-6576b87f9c-l8z4g\" (UID: \"bdf471eb-c41c-40c0-b038-5ba883d2154a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.105145 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwnv9\" (UniqueName: \"kubernetes.io/projected/08b0e33c-5604-46a5-b0fe-139feb379df8-kube-api-access-nwnv9\") pod \"console-operator-58897d9998-wdjfj\" (UID: \"08b0e33c-5604-46a5-b0fe-139feb379df8\") " pod="openshift-console-operator/console-operator-58897d9998-wdjfj" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.128484 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54w2d\" (UniqueName: \"kubernetes.io/projected/c699a1f9-6c07-4c4d-8605-3988308d6914-kube-api-access-54w2d\") pod \"machine-approver-56656f9798-w52jf\" (UID: \"c699a1f9-6c07-4c4d-8605-3988308d6914\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w52jf" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.129059 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.135835 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8025c62a-fb3f-4566-8fd6-571ddb400da3-metrics-tls\") pod \"dns-default-dmpvh\" (UID: \"8025c62a-fb3f-4566-8fd6-571ddb400da3\") " pod="openshift-dns/dns-default-dmpvh" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.142443 4878 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:15 crc kubenswrapper[4878]: E1202 18:17:15.142597 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:15.642573955 +0000 UTC m=+145.332192836 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.143049 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:15 crc kubenswrapper[4878]: E1202 18:17:15.143654 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:15.643620341 +0000 UTC m=+145.333239222 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.149844 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.168902 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.171520 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8025c62a-fb3f-4566-8fd6-571ddb400da3-config-volume\") pod \"dns-default-dmpvh\" (UID: \"8025c62a-fb3f-4566-8fd6-571ddb400da3\") " pod="openshift-dns/dns-default-dmpvh" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.175053 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wdjfj" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.194272 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.206495 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smfh9\" (UniqueName: \"kubernetes.io/projected/4bcb33a3-bcba-4acb-872e-e1676fdaa584-kube-api-access-smfh9\") pod \"authentication-operator-69f744f599-wkwj9\" (UID: \"4bcb33a3-bcba-4acb-872e-e1676fdaa584\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wkwj9" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.211769 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zqt5l" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.224108 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwwfx\" (UniqueName: \"kubernetes.io/projected/553264d0-196d-4174-816e-ba4803d6a893-kube-api-access-zwwfx\") pod \"openshift-apiserver-operator-796bbdcf4f-rmqxk\" (UID: \"553264d0-196d-4174-816e-ba4803d6a893\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rmqxk" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.247791 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:15 crc kubenswrapper[4878]: E1202 18:17:15.248777 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:15.74875024 +0000 UTC m=+145.438369121 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.249028 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:15 crc kubenswrapper[4878]: E1202 18:17:15.249452 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:15.749434504 +0000 UTC m=+145.439053385 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.261049 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12392f4b-d1c0-4c2d-875a-cbebc20ae6d7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r5vnm\" (UID: \"12392f4b-d1c0-4c2d-875a-cbebc20ae6d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5vnm" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.262630 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.272461 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4gg7\" (UniqueName: \"kubernetes.io/projected/435af086-d5fb-4f55-9c52-bfab176ee753-kube-api-access-j4gg7\") pod \"controller-manager-879f6c89f-fvq9v\" (UID: \"435af086-d5fb-4f55-9c52-bfab176ee753\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.283212 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgcqx\" (UniqueName: \"kubernetes.io/projected/763bb008-97f2-4e90-965c-5a7537ff0a57-kube-api-access-jgcqx\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.310969 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.312172 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2x2m\" (UniqueName: \"kubernetes.io/projected/f324597d-0fdb-49dd-aaa2-9f71b2759bcb-kube-api-access-p2x2m\") pod \"dns-operator-744455d44c-s5t58\" (UID: \"f324597d-0fdb-49dd-aaa2-9f71b2759bcb\") " pod="openshift-dns-operator/dns-operator-744455d44c-s5t58" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.332272 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/763bb008-97f2-4e90-965c-5a7537ff0a57-bound-sa-token\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.342247 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w52jf" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.350389 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.350963 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g4km7"] Dec 02 18:17:15 crc kubenswrapper[4878]: E1202 18:17:15.351017 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:15.850999902 +0000 UTC m=+145.540618783 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.351428 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kswcs\" (UniqueName: \"kubernetes.io/projected/883f2381-011d-410d-84cd-7ed27e099ebf-kube-api-access-kswcs\") pod \"machine-config-controller-84d6567774-wpxp8\" (UID: \"883f2381-011d-410d-84cd-7ed27e099ebf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpxp8" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.371359 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vg5k\" (UniqueName: \"kubernetes.io/projected/4c0a71e8-b6e5-44c8-a813-9177684ab97e-kube-api-access-4vg5k\") pod \"router-default-5444994796-28qqz\" (UID: \"4c0a71e8-b6e5-44c8-a813-9177684ab97e\") " pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.395573 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5ptm\" (UniqueName: \"kubernetes.io/projected/cbe20149-0391-43f7-957b-ad3b18f54736-kube-api-access-h5ptm\") pod \"packageserver-d55dfcdfc-qprtc\" (UID: \"cbe20149-0391-43f7-957b-ad3b18f54736\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.424641 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb9jb\" (UniqueName: \"kubernetes.io/projected/d2265b6f-d33f-4bea-9934-1ece55c51d35-kube-api-access-zb9jb\") pod 
\"olm-operator-6b444d44fb-m6f2h\" (UID: \"d2265b6f-d33f-4bea-9934-1ece55c51d35\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m6f2h" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.426440 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhjdc\" (UniqueName: \"kubernetes.io/projected/1ba68bd9-d567-48e5-b296-0a95fdf406b5-kube-api-access-vhjdc\") pod \"ingress-operator-5b745b69d9-mtlxg\" (UID: \"1ba68bd9-d567-48e5-b296-0a95fdf406b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.449388 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmdjj\" (UniqueName: \"kubernetes.io/projected/14702b8a-c457-4730-919c-7340aea9738e-kube-api-access-nmdjj\") pod \"kube-storage-version-migrator-operator-b67b599dd-zvddh\" (UID: \"14702b8a-c457-4730-919c-7340aea9738e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvddh" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.450045 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rmqxk" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.452261 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:15 crc kubenswrapper[4878]: E1202 18:17:15.452770 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-02 18:17:15.952753485 +0000 UTC m=+145.642372596 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.463572 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wkwj9" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.466219 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzx7r\" (UniqueName: \"kubernetes.io/projected/6eb41b93-54fd-4ebe-b7ee-c8a1f1a5f6ab-kube-api-access-wzx7r\") pod \"package-server-manager-789f6589d5-smztw\" (UID: \"6eb41b93-54fd-4ebe-b7ee-c8a1f1a5f6ab\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-smztw" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.469043 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.484967 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25f1a12b-b1f6-49b1-ade9-018684cdd6f3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-tjwcf\" (UID: \"25f1a12b-b1f6-49b1-ade9-018684cdd6f3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tjwcf" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.501993 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-s5t58" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.510417 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkmws\" (UniqueName: \"kubernetes.io/projected/38d2716d-f5de-4242-a170-624490092b98-kube-api-access-fkmws\") pod \"marketplace-operator-79b997595-5p678\" (UID: \"38d2716d-f5de-4242-a170-624490092b98\") " pod="openshift-marketplace/marketplace-operator-79b997595-5p678" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.514070 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5vnm" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.520481 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m6f2h" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.526023 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ghd9\" (UniqueName: \"kubernetes.io/projected/fe86a337-8c96-4b07-b3b6-a97315fa1029-kube-api-access-8ghd9\") pod \"multus-admission-controller-857f4d67dd-9vv2f\" (UID: \"fe86a337-8c96-4b07-b3b6-a97315fa1029\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9vv2f" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.536577 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tjwcf" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.541782 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t7x7s"] Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.546742 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03b22451-6028-4e14-bd65-30d12b0242b3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zwtvf\" (UID: \"03b22451-6028-4e14-bd65-30d12b0242b3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwtvf" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.553348 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:15 crc kubenswrapper[4878]: E1202 18:17:15.553479 4878 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:16.053441582 +0000 UTC m=+145.743060473 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.553840 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:15 crc kubenswrapper[4878]: E1202 18:17:15.554327 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:16.054302682 +0000 UTC m=+145.743921573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.559667 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5p678" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.564090 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hsnm\" (UniqueName: \"kubernetes.io/projected/b32f0172-888e-4c74-9e70-becd273d49d8-kube-api-access-2hsnm\") pod \"etcd-operator-b45778765-fbkfc\" (UID: \"b32f0172-888e-4c74-9e70-becd273d49d8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.567189 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpxp8" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.583265 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5tgz\" (UniqueName: \"kubernetes.io/projected/f34c5862-2f1c-4e77-b3e2-b30852607129-kube-api-access-n5tgz\") pod \"machine-config-server-9zqsp\" (UID: \"f34c5862-2f1c-4e77-b3e2-b30852607129\") " pod="openshift-machine-config-operator/machine-config-server-9zqsp" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.593858 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9vv2f" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.601182 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvddh" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.606257 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9878\" (UniqueName: \"kubernetes.io/projected/e5945b22-4fa1-4d4e-aa84-990d28f1e423-kube-api-access-f9878\") pod \"ingress-canary-7hvht\" (UID: \"e5945b22-4fa1-4d4e-aa84-990d28f1e423\") " pod="openshift-ingress-canary/ingress-canary-7hvht" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.610822 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-smztw" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.623330 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.629096 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4ht8\" (UniqueName: \"kubernetes.io/projected/298c0078-6d02-4963-a1fb-0f6713e6d369-kube-api-access-h4ht8\") pod \"collect-profiles-29411655-62mmf\" (UID: \"298c0078-6d02-4963-a1fb-0f6713e6d369\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.629685 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bw4k8"] Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.641461 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.650379 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggg7z\" (UniqueName: \"kubernetes.io/projected/8025c62a-fb3f-4566-8fd6-571ddb400da3-kube-api-access-ggg7z\") pod \"dns-default-dmpvh\" (UID: \"8025c62a-fb3f-4566-8fd6-571ddb400da3\") " pod="openshift-dns/dns-default-dmpvh" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.655091 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:15 crc kubenswrapper[4878]: E1202 18:17:15.655264 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:16.155213827 +0000 UTC m=+145.844832708 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.655587 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:15 crc kubenswrapper[4878]: E1202 18:17:15.655971 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:16.155959663 +0000 UTC m=+145.845578544 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.670211 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn952\" (UniqueName: \"kubernetes.io/projected/cc0cb11b-bb32-413e-bbc3-179943e19f88-kube-api-access-dn952\") pod \"catalog-operator-68c6474976-6c7h4\" (UID: \"cc0cb11b-bb32-413e-bbc3-179943e19f88\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6c7h4" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.690575 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wvrhn"] Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.691752 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ngj62"] Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.694624 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1ba68bd9-d567-48e5-b296-0a95fdf406b5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mtlxg\" (UID: \"1ba68bd9-d567-48e5-b296-0a95fdf406b5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.706273 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zbpll"] Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.712224 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7hvht" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.712785 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npjc8\" (UniqueName: \"kubernetes.io/projected/a8e39ea4-f7e1-4df7-997b-2c3923f9ad6b-kube-api-access-npjc8\") pod \"control-plane-machine-set-operator-78cbb6b69f-b6fbw\" (UID: \"a8e39ea4-f7e1-4df7-997b-2c3923f9ad6b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6fbw" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.723935 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn48b\" (UniqueName: \"kubernetes.io/projected/3b6800a2-0b99-4c3b-9e5e-833a245bd7be-kube-api-access-kn48b\") pod \"migrator-59844c95c7-dtnr7\" (UID: \"3b6800a2-0b99-4c3b-9e5e-833a245bd7be\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dtnr7" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.728487 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ffktl"] Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.742726 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9zqsp" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.744725 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-dmpvh" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.746798 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw8z6\" (UniqueName: \"kubernetes.io/projected/bfed419b-bf76-4045-9056-50ca33e1686b-kube-api-access-xw8z6\") pod \"service-ca-9c57cc56f-th582\" (UID: \"bfed419b-bf76-4045-9056-50ca33e1686b\") " pod="openshift-service-ca/service-ca-9c57cc56f-th582" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.757260 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:15 crc kubenswrapper[4878]: E1202 18:17:15.757651 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:16.257635814 +0000 UTC m=+145.947254695 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.767740 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blfmq\" (UniqueName: \"kubernetes.io/projected/37ed9429-a67d-4168-b5e9-211eddd1abb1-kube-api-access-blfmq\") pod \"csi-hostpathplugin-f59ml\" (UID: \"37ed9429-a67d-4168-b5e9-211eddd1abb1\") " pod="hostpath-provisioner/csi-hostpathplugin-f59ml" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.782609 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wdjfj"] Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.788218 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6gjt\" (UniqueName: \"kubernetes.io/projected/54ef3152-6020-4822-ab68-b4c3ba44dc1c-kube-api-access-d6gjt\") pod \"machine-config-operator-74547568cd-d2f8f\" (UID: \"54ef3152-6020-4822-ab68-b4c3ba44dc1c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.796211 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zqt5l"] Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.805586 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fst5\" (UniqueName: \"kubernetes.io/projected/f45d88bc-0c44-4669-8394-081c7e4a8035-kube-api-access-6fst5\") pod \"apiserver-7bbb656c7d-fvb6t\" (UID: 
\"f45d88bc-0c44-4669-8394-081c7e4a8035\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.806437 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.822566 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sw9s\" (UniqueName: \"kubernetes.io/projected/f047f11c-e08f-47b0-965b-0cc2e8d2bd38-kube-api-access-4sw9s\") pod \"service-ca-operator-777779d784-bv27h\" (UID: \"f047f11c-e08f-47b0-965b-0cc2e8d2bd38\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bv27h" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.833572 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.845622 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwtvf" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.852087 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.852520 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6fbw" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.858497 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:15 crc kubenswrapper[4878]: E1202 18:17:15.858823 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:16.358809078 +0000 UTC m=+146.048427959 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.870829 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.876954 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6c7h4" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.885370 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dtnr7" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.890006 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wkwj9"] Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.890054 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fvq9v"] Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.907124 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-w6mf2"] Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.922020 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g"] Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.930685 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.943227 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.949897 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-th582" Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.959114 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:15 crc kubenswrapper[4878]: E1202 18:17:15.959270 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:16.459224786 +0000 UTC m=+146.148843667 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.959389 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:15 crc kubenswrapper[4878]: E1202 18:17:15.959714 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:16.459703643 +0000 UTC m=+146.149322534 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:15 crc kubenswrapper[4878]: I1202 18:17:15.983775 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bv27h" Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.009106 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.039749 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-f59ml" Dec 02 18:17:16 crc kubenswrapper[4878]: W1202 18:17:16.042301 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2249a89f_5c41_4f17_a308_fe106e91ece9.slice/crio-0c65c7ef9db50029cad94bc0f84b406396b5df3e3b5f261e65d19a6635ff102e WatchSource:0}: Error finding container 0c65c7ef9db50029cad94bc0f84b406396b5df3e3b5f261e65d19a6635ff102e: Status 404 returned error can't find the container with id 0c65c7ef9db50029cad94bc0f84b406396b5df3e3b5f261e65d19a6635ff102e Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.045763 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w52jf" event={"ID":"c699a1f9-6c07-4c4d-8605-3988308d6914","Type":"ContainerStarted","Data":"47ba6418b03286141b518c0bb77488664bb0127d5c6b4f43e179f6b1e637a09f"} Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.046743 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g4km7" event={"ID":"7c1480df-aed9-4266-802c-d217699bd9ad","Type":"ContainerStarted","Data":"e7f8845cf9184b2c25c0af4bba8a629a20e2ecfdc2acd291a42e695e93ba79a0"} Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.061476 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:16 crc kubenswrapper[4878]: E1202 18:17:16.061827 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-12-02 18:17:16.561780648 +0000 UTC m=+146.251399529 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:16 crc kubenswrapper[4878]: W1202 18:17:16.072100 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6951ee33_1345_4ae2_906d_fdc7cea4dc64.slice/crio-25c723e3883edacb1046338c5a1fb9b89a4f51e95a08bbceece612a5d473e3c4 WatchSource:0}: Error finding container 25c723e3883edacb1046338c5a1fb9b89a4f51e95a08bbceece612a5d473e3c4: Status 404 returned error can't find the container with id 25c723e3883edacb1046338c5a1fb9b89a4f51e95a08bbceece612a5d473e3c4 Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.130640 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5vnm"] Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.146951 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rmqxk"] Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.159320 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m6f2h"] Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.163269 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:16 crc kubenswrapper[4878]: E1202 18:17:16.163949 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:16.663926895 +0000 UTC m=+146.353545966 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.187400 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-s5t58"] Dec 02 18:17:16 crc kubenswrapper[4878]: W1202 18:17:16.217572 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12392f4b_d1c0_4c2d_875a_cbebc20ae6d7.slice/crio-0dfe0211deb8d07d8b33ae1f2bf9922b7218ea5337d4050ca10ef70879794322 WatchSource:0}: Error finding container 0dfe0211deb8d07d8b33ae1f2bf9922b7218ea5337d4050ca10ef70879794322: Status 404 returned error can't find the container with id 0dfe0211deb8d07d8b33ae1f2bf9922b7218ea5337d4050ca10ef70879794322 Dec 02 18:17:16 crc kubenswrapper[4878]: W1202 18:17:16.222819 4878 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c0a71e8_b6e5_44c8_a813_9177684ab97e.slice/crio-38eca1de8089d80ec0b01a667338d33b3f6648023d54c8dee379456b1bc58f47 WatchSource:0}: Error finding container 38eca1de8089d80ec0b01a667338d33b3f6648023d54c8dee379456b1bc58f47: Status 404 returned error can't find the container with id 38eca1de8089d80ec0b01a667338d33b3f6648023d54c8dee379456b1bc58f47 Dec 02 18:17:16 crc kubenswrapper[4878]: W1202 18:17:16.260454 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod553264d0_196d_4174_816e_ba4803d6a893.slice/crio-e4cd9a0a7b87c3eac93423a5feda8c6e3e573f9fd4d0eb6fcf8cabe93e46affc WatchSource:0}: Error finding container e4cd9a0a7b87c3eac93423a5feda8c6e3e573f9fd4d0eb6fcf8cabe93e46affc: Status 404 returned error can't find the container with id e4cd9a0a7b87c3eac93423a5feda8c6e3e573f9fd4d0eb6fcf8cabe93e46affc Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.263969 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:16 crc kubenswrapper[4878]: E1202 18:17:16.264488 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:16.764471728 +0000 UTC m=+146.454090609 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:16 crc kubenswrapper[4878]: W1202 18:17:16.265067 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf324597d_0fdb_49dd_aaa2_9f71b2759bcb.slice/crio-aad769f6909524f785daf18ba9b0b323716c6e01fa53f494f5e7513fc78484b0 WatchSource:0}: Error finding container aad769f6909524f785daf18ba9b0b323716c6e01fa53f494f5e7513fc78484b0: Status 404 returned error can't find the container with id aad769f6909524f785daf18ba9b0b323716c6e01fa53f494f5e7513fc78484b0 Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.349740 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dmpvh"] Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.367055 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:16 crc kubenswrapper[4878]: E1202 18:17:16.367599 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:16.867587379 +0000 UTC m=+146.557206260 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.455671 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tjwcf"] Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.471290 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:16 crc kubenswrapper[4878]: E1202 18:17:16.471792 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:16.971758675 +0000 UTC m=+146.661377596 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.471982 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:16 crc kubenswrapper[4878]: E1202 18:17:16.472527 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:16.972508541 +0000 UTC m=+146.662127582 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.574271 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:16 crc kubenswrapper[4878]: E1202 18:17:16.575262 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:17.075220387 +0000 UTC m=+146.764839268 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.677954 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:16 crc kubenswrapper[4878]: E1202 18:17:16.678329 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:17.178314268 +0000 UTC m=+146.867933149 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.778873 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:16 crc kubenswrapper[4878]: E1202 18:17:16.779327 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:17.279303945 +0000 UTC m=+146.968922826 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.880343 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:16 crc kubenswrapper[4878]: E1202 18:17:16.880722 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:17.380709647 +0000 UTC m=+147.070328528 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.913518 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6fbw"] Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.917864 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7hvht"] Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.919577 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-smztw"] Dec 02 18:17:16 crc kubenswrapper[4878]: I1202 18:17:16.981095 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:16 crc kubenswrapper[4878]: E1202 18:17:16.981661 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:17.481640443 +0000 UTC m=+147.171259324 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.061350 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t7x7s" event={"ID":"3a9d4a2c-13b3-458c-92e8-058e9f1206dd","Type":"ContainerStarted","Data":"baf3597340dc26b2bb0f62e0c939aa19b27e7d224f7772080538bf0f4292b217"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.062215 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rmqxk" event={"ID":"553264d0-196d-4174-816e-ba4803d6a893","Type":"ContainerStarted","Data":"e4cd9a0a7b87c3eac93423a5feda8c6e3e573f9fd4d0eb6fcf8cabe93e46affc"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.063042 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wdjfj" event={"ID":"08b0e33c-5604-46a5-b0fe-139feb379df8","Type":"ContainerStarted","Data":"a1727c37b0fab3e7b357ed11efa272eb47ae2a6664a1af3a58af8159e530c14e"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.063934 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zqt5l" event={"ID":"0e8bdc92-b1ec-4300-8895-fc8e804455da","Type":"ContainerStarted","Data":"07b5750bc99f7a35fb005a74d5ffd7f264d11b74b9fe9e9ffdd287fe5cfc1840"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.064762 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bw4k8" event={"ID":"2249a89f-5c41-4f17-a308-fe106e91ece9","Type":"ContainerStarted","Data":"0c65c7ef9db50029cad94bc0f84b406396b5df3e3b5f261e65d19a6635ff102e"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.065574 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" event={"ID":"435af086-d5fb-4f55-9c52-bfab176ee753","Type":"ContainerStarted","Data":"0f90b3743bbf5489db2b37a30cae5a59611feb48fe7412fba22fca819cd9957b"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.066196 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m6f2h" event={"ID":"d2265b6f-d33f-4bea-9934-1ece55c51d35","Type":"ContainerStarted","Data":"beeadfc4258e2f98106e99d95a572d4b29a9dd1d2bc805b06687df4145dd3b50"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.067008 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" event={"ID":"bdf471eb-c41c-40c0-b038-5ba883d2154a","Type":"ContainerStarted","Data":"9847c6b9b3b4dd4ebf1efacb725ef7a7796166f8f32ca9e155986dab0e4287e7"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.072032 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-28qqz" event={"ID":"4c0a71e8-b6e5-44c8-a813-9177684ab97e","Type":"ContainerStarted","Data":"38eca1de8089d80ec0b01a667338d33b3f6648023d54c8dee379456b1bc58f47"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.073463 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dmpvh" event={"ID":"8025c62a-fb3f-4566-8fd6-571ddb400da3","Type":"ContainerStarted","Data":"5a432fb99a6223e84545809b86399e1aa03df9ec1abe4995abb73c4e99475005"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.074278 4878 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ffktl" event={"ID":"6951ee33-1345-4ae2-906d-fdc7cea4dc64","Type":"ContainerStarted","Data":"25c723e3883edacb1046338c5a1fb9b89a4f51e95a08bbceece612a5d473e3c4"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.077030 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g4km7" event={"ID":"7c1480df-aed9-4266-802c-d217699bd9ad","Type":"ContainerStarted","Data":"c06e3109d5291e982932ebdf32aa2e17ddde49dfe6812cc697b93417f84fed96"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.079303 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dtnr7"] Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.080760 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zbpll" event={"ID":"44e99cf7-e103-451a-91cc-0d610e4190a9","Type":"ContainerStarted","Data":"31ef2c65cfebe96d88409e9e2338405f4945dde48d49044b3291d89dcb4c37e2"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.083832 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:17 crc kubenswrapper[4878]: E1202 18:17:17.084324 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:17.584308299 +0000 UTC m=+147.273927180 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.086354 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" event={"ID":"ef66151f-c39c-4f4d-bbc4-8c86555e41ea","Type":"ContainerStarted","Data":"f69bc6b23a513da467436ad21857c408d3f92279f6e92f9d2a369f02d4c2445a"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.087869 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc"] Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.088827 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5vnm" event={"ID":"12392f4b-d1c0-4c2d-875a-cbebc20ae6d7","Type":"ContainerStarted","Data":"0dfe0211deb8d07d8b33ae1f2bf9922b7218ea5337d4050ca10ef70879794322"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.092982 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9vv2f"] Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.093182 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ngj62" event={"ID":"ea820fcf-7d34-4381-bafa-cbc53d3f7c86","Type":"ContainerStarted","Data":"1c13245182598e212e416d132e11dbedefc3ef5d2f30d47577b68ca859fa5e7b"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.098550 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-5p678"] Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.100151 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wkwj9" event={"ID":"4bcb33a3-bcba-4acb-872e-e1676fdaa584","Type":"ContainerStarted","Data":"989113d64dc93071e68a1bb149f871db6310056ee46ba801fffca59da8e7ca7a"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.103119 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w52jf" event={"ID":"c699a1f9-6c07-4c4d-8605-3988308d6914","Type":"ContainerStarted","Data":"829e8e0d8984c2555bb578a53da96a3712c52c55038225b3c312ba3f37a3acb9"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.106454 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wpxp8"] Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.109477 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-s5t58" event={"ID":"f324597d-0fdb-49dd-aaa2-9f71b2759bcb","Type":"ContainerStarted","Data":"aad769f6909524f785daf18ba9b0b323716c6e01fa53f494f5e7513fc78484b0"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.111258 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-w6mf2" event={"ID":"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2","Type":"ContainerStarted","Data":"634a727576962265b23fd0c54a5939f89f3444158a1a3b6b8cf693546076f0bb"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.112199 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvddh"] Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.112617 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-server-9zqsp" event={"ID":"f34c5862-2f1c-4e77-b3e2-b30852607129","Type":"ContainerStarted","Data":"f45b2bcff1edb78e76cacecdc5278c50fc6738eb22f03f441f8d2831613d77ae"} Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.189222 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:17 crc kubenswrapper[4878]: E1202 18:17:17.191157 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:17.691110564 +0000 UTC m=+147.380729455 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.292439 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.299022 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf"] Dec 02 18:17:17 crc kubenswrapper[4878]: E1202 18:17:17.301852 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:17.801832405 +0000 UTC m=+147.491451286 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:17 crc kubenswrapper[4878]: W1202 18:17:17.303909 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8e39ea4_f7e1_4df7_997b_2c3923f9ad6b.slice/crio-7f06a2f30c80a682f2abe99d8ef054ccbbb9dbb07027b7f57120c1a3d45b14da WatchSource:0}: Error finding container 7f06a2f30c80a682f2abe99d8ef054ccbbb9dbb07027b7f57120c1a3d45b14da: Status 404 returned error can't find the container with id 7f06a2f30c80a682f2abe99d8ef054ccbbb9dbb07027b7f57120c1a3d45b14da Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.314166 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6c7h4"] Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.319097 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f"] Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.330107 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwtvf"] Dec 02 18:17:17 crc kubenswrapper[4878]: W1202 18:17:17.348055 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbe20149_0391_43f7_957b_ad3b18f54736.slice/crio-dd449ea9f976b45067d8213fcb97f8c637278dc9331fb96749cba05d0a77b7a0 WatchSource:0}: Error finding container 
dd449ea9f976b45067d8213fcb97f8c637278dc9331fb96749cba05d0a77b7a0: Status 404 returned error can't find the container with id dd449ea9f976b45067d8213fcb97f8c637278dc9331fb96749cba05d0a77b7a0 Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.354293 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-f59ml"] Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.359315 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg"] Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.366990 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-th582"] Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.371158 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bv27h"] Dec 02 18:17:17 crc kubenswrapper[4878]: W1202 18:17:17.379264 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6800a2_0b99_4c3b_9e5e_833a245bd7be.slice/crio-7450bb870ce2588dea993ee0b6cc6ea88f764237219e5e9505f0b7b088d5dbce WatchSource:0}: Error finding container 7450bb870ce2588dea993ee0b6cc6ea88f764237219e5e9505f0b7b088d5dbce: Status 404 returned error can't find the container with id 7450bb870ce2588dea993ee0b6cc6ea88f764237219e5e9505f0b7b088d5dbce Dec 02 18:17:17 crc kubenswrapper[4878]: W1202 18:17:17.382215 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe86a337_8c96_4b07_b3b6_a97315fa1029.slice/crio-db10d9e9a19c788fcbf231894b51d732bfc1a329d67caf2f3d8753b8a544a42e WatchSource:0}: Error finding container db10d9e9a19c788fcbf231894b51d732bfc1a329d67caf2f3d8753b8a544a42e: Status 404 returned error can't find the container with id db10d9e9a19c788fcbf231894b51d732bfc1a329d67caf2f3d8753b8a544a42e Dec 02 
18:17:17 crc kubenswrapper[4878]: W1202 18:17:17.387407 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38d2716d_f5de_4242_a170_624490092b98.slice/crio-7a41ee2a2168b5f3f054c8cefb238051568904083b38278b2ee90dae542cf3c8 WatchSource:0}: Error finding container 7a41ee2a2168b5f3f054c8cefb238051568904083b38278b2ee90dae542cf3c8: Status 404 returned error can't find the container with id 7a41ee2a2168b5f3f054c8cefb238051568904083b38278b2ee90dae542cf3c8 Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.393735 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:17 crc kubenswrapper[4878]: E1202 18:17:17.395608 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:17.895578615 +0000 UTC m=+147.585197496 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.399770 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t"] Dec 02 18:17:17 crc kubenswrapper[4878]: W1202 18:17:17.411870 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14702b8a_c457_4730_919c_7340aea9738e.slice/crio-bccdb8388811e7fb2fa2e9a12e90cdd932ef0efef982de4c27e8fe3d955ecb9d WatchSource:0}: Error finding container bccdb8388811e7fb2fa2e9a12e90cdd932ef0efef982de4c27e8fe3d955ecb9d: Status 404 returned error can't find the container with id bccdb8388811e7fb2fa2e9a12e90cdd932ef0efef982de4c27e8fe3d955ecb9d Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.417463 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fbkfc"] Dec 02 18:17:17 crc kubenswrapper[4878]: W1202 18:17:17.470005 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod298c0078_6d02_4963_a1fb_0f6713e6d369.slice/crio-49974d191188131dd988f2557e822cab1bd73b46372259d9a025ac0ac999b39b WatchSource:0}: Error finding container 49974d191188131dd988f2557e822cab1bd73b46372259d9a025ac0ac999b39b: Status 404 returned error can't find the container with id 49974d191188131dd988f2557e822cab1bd73b46372259d9a025ac0ac999b39b Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.496869 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:17 crc kubenswrapper[4878]: E1202 18:17:17.497614 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:17.997596859 +0000 UTC m=+147.687215740 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:17 crc kubenswrapper[4878]: W1202 18:17:17.503387 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54ef3152_6020_4822_ab68_b4c3ba44dc1c.slice/crio-7807282fd64c5733c3d05801189142bb4929565a980e5e505c875c2b39862b8e WatchSource:0}: Error finding container 7807282fd64c5733c3d05801189142bb4929565a980e5e505c875c2b39862b8e: Status 404 returned error can't find the container with id 7807282fd64c5733c3d05801189142bb4929565a980e5e505c875c2b39862b8e Dec 02 18:17:17 crc kubenswrapper[4878]: W1202 18:17:17.539093 4878 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfed419b_bf76_4045_9056_50ca33e1686b.slice/crio-c3e5698ce97a6d9661aeffaa7b6ad7a45b6283ef72cc177dbc571718e81fe219 WatchSource:0}: Error finding container c3e5698ce97a6d9661aeffaa7b6ad7a45b6283ef72cc177dbc571718e81fe219: Status 404 returned error can't find the container with id c3e5698ce97a6d9661aeffaa7b6ad7a45b6283ef72cc177dbc571718e81fe219 Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.598027 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:17 crc kubenswrapper[4878]: E1202 18:17:17.598502 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:18.098480463 +0000 UTC m=+147.788099344 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.700345 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:17 crc kubenswrapper[4878]: E1202 18:17:17.700706 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:18.200692662 +0000 UTC m=+147.890311543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.802758 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:17 crc kubenswrapper[4878]: E1202 18:17:17.802946 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:18.302917493 +0000 UTC m=+147.992536374 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.803468 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:17 crc kubenswrapper[4878]: E1202 18:17:17.803891 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:18.303876626 +0000 UTC m=+147.993495507 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.905184 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:17 crc kubenswrapper[4878]: E1202 18:17:17.905371 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:18.405343599 +0000 UTC m=+148.094962480 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:17 crc kubenswrapper[4878]: I1202 18:17:17.906345 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:17 crc kubenswrapper[4878]: E1202 18:17:17.906673 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:18.406661895 +0000 UTC m=+148.096280776 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.007612 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:18 crc kubenswrapper[4878]: E1202 18:17:18.007950 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:18.507910272 +0000 UTC m=+148.197529153 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.008636 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:18 crc kubenswrapper[4878]: E1202 18:17:18.009091 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:18.509073941 +0000 UTC m=+148.198692822 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.109616 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:18 crc kubenswrapper[4878]: E1202 18:17:18.110084 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:18.610068139 +0000 UTC m=+148.299687020 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.138681 4878 generic.go:334] "Generic (PLEG): container finished" podID="0e8bdc92-b1ec-4300-8895-fc8e804455da" containerID="09acade865eb2363ac0e80173433dfd512e651eaade52f2f9c211eda415af404" exitCode=0 Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.138757 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zqt5l" event={"ID":"0e8bdc92-b1ec-4300-8895-fc8e804455da","Type":"ContainerDied","Data":"09acade865eb2363ac0e80173433dfd512e651eaade52f2f9c211eda415af404"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.158408 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-g4km7" podStartSLOduration=124.158378799 podStartE2EDuration="2m4.158378799s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:17.486100436 +0000 UTC m=+147.175719327" watchObservedRunningTime="2025-12-02 18:17:18.158378799 +0000 UTC m=+147.847997680" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.159713 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" 
event={"ID":"bdf471eb-c41c-40c0-b038-5ba883d2154a","Type":"ContainerStarted","Data":"94d633c3e08379b4b4e044cd306b879c4d738670f8387634034aacd956b38c1e"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.159917 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.170693 4878 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-l8z4g container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.170750 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" podUID="bdf471eb-c41c-40c0-b038-5ba883d2154a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.177428 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wkwj9" event={"ID":"4bcb33a3-bcba-4acb-872e-e1676fdaa584","Type":"ContainerStarted","Data":"b0ddbf4ca021d9508dc1e3aa0be78df93de055a896bb579726e82ba2e71e3ae8"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.192863 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bw4k8" event={"ID":"2249a89f-5c41-4f17-a308-fe106e91ece9","Type":"ContainerStarted","Data":"d536bf7479b57231e7b6ea94074d67ce561a27002ade8fcff5044fbd0e17ef08"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.195189 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvddh" event={"ID":"14702b8a-c457-4730-919c-7340aea9738e","Type":"ContainerStarted","Data":"bccdb8388811e7fb2fa2e9a12e90cdd932ef0efef982de4c27e8fe3d955ecb9d"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.204112 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-28qqz" event={"ID":"4c0a71e8-b6e5-44c8-a813-9177684ab97e","Type":"ContainerStarted","Data":"9b74b4847c82087625cc8ff629ee61f73ace40474702c6fffd82a7b9c1310f4c"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.207588 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9zqsp" event={"ID":"f34c5862-2f1c-4e77-b3e2-b30852607129","Type":"ContainerStarted","Data":"9b59710d5bf5533398b566de1da2182eb80ea311e04dd9e13e3d6a6576bf65ce"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.210805 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:18 crc kubenswrapper[4878]: E1202 18:17:18.213096 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:18.713078766 +0000 UTC m=+148.402697647 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.213962 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" podStartSLOduration=123.213948086 podStartE2EDuration="2m3.213948086s" podCreationTimestamp="2025-12-02 18:15:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:18.188782867 +0000 UTC m=+147.878401768" watchObservedRunningTime="2025-12-02 18:17:18.213948086 +0000 UTC m=+147.903566967" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.215108 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wdjfj" event={"ID":"08b0e33c-5604-46a5-b0fe-139feb379df8","Type":"ContainerStarted","Data":"666bedaf6566fe6802b1fac172487b91e0a0be097e781e4853af4b08f7787bde"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.215621 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-wdjfj" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.225682 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-wkwj9" podStartSLOduration=124.225657586 podStartE2EDuration="2m4.225657586s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:18.213648606 +0000 UTC m=+147.903267487" watchObservedRunningTime="2025-12-02 18:17:18.225657586 +0000 UTC m=+147.915276467" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.226410 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5vnm" event={"ID":"12392f4b-d1c0-4c2d-875a-cbebc20ae6d7","Type":"ContainerStarted","Data":"08ac6c2fb2127b99443838a0fa79be376ac012fe00388c6e98ba5be27f9e22a6"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.234770 4878 patch_prober.go:28] interesting pod/console-operator-58897d9998-wdjfj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/readyz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.234842 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wdjfj" podUID="08b0e33c-5604-46a5-b0fe-139feb379df8" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/readyz\": dial tcp 10.217.0.5:8443: connect: connection refused" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.243191 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bw4k8" podStartSLOduration=124.243176484 podStartE2EDuration="2m4.243176484s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:18.242391267 +0000 UTC m=+147.932010148" watchObservedRunningTime="2025-12-02 18:17:18.243176484 +0000 UTC m=+147.932795365" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.260946 4878 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-th582" event={"ID":"bfed419b-bf76-4045-9056-50ca33e1686b","Type":"ContainerStarted","Data":"c3e5698ce97a6d9661aeffaa7b6ad7a45b6283ef72cc177dbc571718e81fe219"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.308379 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-9zqsp" podStartSLOduration=6.308355539 podStartE2EDuration="6.308355539s" podCreationTimestamp="2025-12-02 18:17:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:18.288948216 +0000 UTC m=+147.978567097" watchObservedRunningTime="2025-12-02 18:17:18.308355539 +0000 UTC m=+147.997974420" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.312118 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:18 crc kubenswrapper[4878]: E1202 18:17:18.312594 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:18.812535552 +0000 UTC m=+148.502154423 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.317632 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:18 crc kubenswrapper[4878]: E1202 18:17:18.318542 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:18.818514226 +0000 UTC m=+148.508133107 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.332373 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-28qqz" podStartSLOduration=124.332341578 podStartE2EDuration="2m4.332341578s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:18.331045053 +0000 UTC m=+148.020663934" watchObservedRunningTime="2025-12-02 18:17:18.332341578 +0000 UTC m=+148.021960459" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.384864 4878 generic.go:334] "Generic (PLEG): container finished" podID="6951ee33-1345-4ae2-906d-fdc7cea4dc64" containerID="26dfaac495bcbf001cb66a8796fd0e79ddc3f747d6912e69249df2dea72b2e4d" exitCode=0 Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.385133 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ffktl" event={"ID":"6951ee33-1345-4ae2-906d-fdc7cea4dc64","Type":"ContainerDied","Data":"26dfaac495bcbf001cb66a8796fd0e79ddc3f747d6912e69249df2dea72b2e4d"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.389613 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-wdjfj" podStartSLOduration=124.389581842 podStartE2EDuration="2m4.389581842s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:18.389015953 +0000 UTC m=+148.078634834" watchObservedRunningTime="2025-12-02 18:17:18.389581842 +0000 UTC m=+148.079200723" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.421862 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:18 crc kubenswrapper[4878]: E1202 18:17:18.422854 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:18.922829297 +0000 UTC m=+148.612448178 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.426107 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg" event={"ID":"1ba68bd9-d567-48e5-b296-0a95fdf406b5","Type":"ContainerStarted","Data":"0f1c1cf3beb066ac2b30075f364c2348328e010119fe346f11cc5c6f635920eb"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.434556 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5vnm" podStartSLOduration=124.434536087 podStartE2EDuration="2m4.434536087s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:18.434090771 +0000 UTC m=+148.123709662" watchObservedRunningTime="2025-12-02 18:17:18.434536087 +0000 UTC m=+148.124154968" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.487380 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w52jf" event={"ID":"c699a1f9-6c07-4c4d-8605-3988308d6914","Type":"ContainerStarted","Data":"df99a2dddbdd4bf5b06c4b027439da049cb8d012d39eba7a6a8ec081df335111"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.524512 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:18 crc kubenswrapper[4878]: E1202 18:17:18.526300 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:19.026280519 +0000 UTC m=+148.715899400 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.526368 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t7x7s" event={"ID":"3a9d4a2c-13b3-458c-92e8-058e9f1206dd","Type":"ContainerStarted","Data":"6b5d14b00f6a9036efab89fbd98ddb9b2f24c940ea08b2e0cf138f5bdbd2e7ac"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.547641 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpxp8" event={"ID":"883f2381-011d-410d-84cd-7ed27e099ebf","Type":"ContainerStarted","Data":"c16af3572532fe25b1f63aa380d421bc5fe9eac070e09fac7261f63c22723083"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.557085 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rmqxk" 
event={"ID":"553264d0-196d-4174-816e-ba4803d6a893","Type":"ContainerStarted","Data":"48d61ca5f36fa83252522851b99953dc31ae81b36cfb5cdd7ed1e381d9fb3b25"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.584506 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" event={"ID":"cbe20149-0391-43f7-957b-ad3b18f54736","Type":"ContainerStarted","Data":"dd449ea9f976b45067d8213fcb97f8c637278dc9331fb96749cba05d0a77b7a0"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.587388 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9vv2f" event={"ID":"fe86a337-8c96-4b07-b3b6-a97315fa1029","Type":"ContainerStarted","Data":"db10d9e9a19c788fcbf231894b51d732bfc1a329d67caf2f3d8753b8a544a42e"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.596618 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w52jf" podStartSLOduration=124.59659794 podStartE2EDuration="2m4.59659794s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:18.524021842 +0000 UTC m=+148.213640733" watchObservedRunningTime="2025-12-02 18:17:18.59659794 +0000 UTC m=+148.286216831" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.596888 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rmqxk" podStartSLOduration=124.596883379 podStartE2EDuration="2m4.596883379s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:18.59482751 +0000 UTC m=+148.284446391" watchObservedRunningTime="2025-12-02 
18:17:18.596883379 +0000 UTC m=+148.286502260" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.616344 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" event={"ID":"435af086-d5fb-4f55-9c52-bfab176ee753","Type":"ContainerStarted","Data":"8578fdd7425efacd7333511189c889c6a75f887c1ec1c3caab4d8d04c7c40abd"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.616853 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.624895 4878 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-fvq9v container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.624973 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" podUID="435af086-d5fb-4f55-9c52-bfab176ee753" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.625872 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:18 crc kubenswrapper[4878]: E1202 18:17:18.626207 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:19.126189931 +0000 UTC m=+148.815808812 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.626347 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:18 crc kubenswrapper[4878]: E1202 18:17:18.627481 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:19.127462293 +0000 UTC m=+148.817081174 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.635534 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5p678" event={"ID":"38d2716d-f5de-4242-a170-624490092b98","Type":"ContainerStarted","Data":"65416adb2286d96d2e3dde6449c968fd6c8744a0af2ff339a7a5f9ac81c70a61"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.635577 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5p678" event={"ID":"38d2716d-f5de-4242-a170-624490092b98","Type":"ContainerStarted","Data":"7a41ee2a2168b5f3f054c8cefb238051568904083b38278b2ee90dae542cf3c8"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.636348 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-5p678" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.641783 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.642457 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bv27h" event={"ID":"f047f11c-e08f-47b0-965b-0cc2e8d2bd38","Type":"ContainerStarted","Data":"673046a467d636e21f4a1143f3e20b960b6ca25b33c87060f104e44030d4dedc"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.643709 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz 
container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.643771 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.648486 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" podStartSLOduration=124.648469121 podStartE2EDuration="2m4.648469121s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:18.647895591 +0000 UTC m=+148.337514472" watchObservedRunningTime="2025-12-02 18:17:18.648469121 +0000 UTC m=+148.338087992" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.649868 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7hvht" event={"ID":"e5945b22-4fa1-4d4e-aa84-990d28f1e423","Type":"ContainerStarted","Data":"bbc4da9789fe59ef65c3ea7f96bc03e49d5832b38991d8888570a58ef16d2beb"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.653983 4878 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-5p678 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.654039 4878 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-5p678" podUID="38d2716d-f5de-4242-a170-624490092b98" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.675783 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf" event={"ID":"298c0078-6d02-4963-a1fb-0f6713e6d369","Type":"ContainerStarted","Data":"49974d191188131dd988f2557e822cab1bd73b46372259d9a025ac0ac999b39b"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.696341 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f" event={"ID":"54ef3152-6020-4822-ab68-b4c3ba44dc1c","Type":"ContainerStarted","Data":"7807282fd64c5733c3d05801189142bb4929565a980e5e505c875c2b39862b8e"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.714732 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tjwcf" event={"ID":"25f1a12b-b1f6-49b1-ade9-018684cdd6f3","Type":"ContainerStarted","Data":"78a01cb4d7aea47a3c1afad18e14b0aaafbb85ad3451fae1a95040e97ec2f5e7"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.739718 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-5p678" podStartSLOduration=124.739693815 podStartE2EDuration="2m4.739693815s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:18.690495886 +0000 UTC m=+148.380114767" watchObservedRunningTime="2025-12-02 18:17:18.739693815 +0000 UTC m=+148.429312696" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.741323 4878 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:18 crc kubenswrapper[4878]: E1202 18:17:18.750622 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:19.250566826 +0000 UTC m=+148.940185707 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.800810 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dtnr7" event={"ID":"3b6800a2-0b99-4c3b-9e5e-833a245bd7be","Type":"ContainerStarted","Data":"376b1e1d73e258a0e1c2bda9afcdd3fe16b35835c83da76658cb2fd954af9e6f"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.801280 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dtnr7" event={"ID":"3b6800a2-0b99-4c3b-9e5e-833a245bd7be","Type":"ContainerStarted","Data":"7450bb870ce2588dea993ee0b6cc6ea88f764237219e5e9505f0b7b088d5dbce"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.817551 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" event={"ID":"ef66151f-c39c-4f4d-bbc4-8c86555e41ea","Type":"ContainerStarted","Data":"c819d0674e2029b7249d1720e0e41b23a4bf04ca21c5c5a8cc6def2258dfe2a0"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.817922 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.842654 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" event={"ID":"b32f0172-888e-4c74-9e70-becd273d49d8","Type":"ContainerStarted","Data":"46c53f55928baded7c4c68a5cb83f46037f552be55a8394a8817c3562aff536f"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.856529 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:18 crc kubenswrapper[4878]: E1202 18:17:18.859843 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:19.359813226 +0000 UTC m=+149.049432287 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.875892 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tjwcf" podStartSLOduration=124.875863345 podStartE2EDuration="2m4.875863345s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:18.740484033 +0000 UTC m=+148.430102914" watchObservedRunningTime="2025-12-02 18:17:18.875863345 +0000 UTC m=+148.565482226" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.878384 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" podStartSLOduration=124.87837369 podStartE2EDuration="2m4.87837369s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:18.874096874 +0000 UTC m=+148.563715755" watchObservedRunningTime="2025-12-02 18:17:18.87837369 +0000 UTC m=+148.567992591" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.879559 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6c7h4" 
event={"ID":"cc0cb11b-bb32-413e-bbc3-179943e19f88","Type":"ContainerStarted","Data":"05a75a8ac62854d43c84bbe8f9434aa67cd308279506edca5dc56398177940b9"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.882316 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6c7h4" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.890801 4878 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-6c7h4 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.890844 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6c7h4" podUID="cc0cb11b-bb32-413e-bbc3-179943e19f88" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.895838 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zbpll" event={"ID":"44e99cf7-e103-451a-91cc-0d610e4190a9","Type":"ContainerStarted","Data":"569c10d200a9be30f4dcb9ae7d62b8e85e04939af6012c554bf2723c736a4ee3"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.897451 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-zbpll" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.901310 4878 patch_prober.go:28] interesting pod/downloads-7954f5f757-zbpll container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Dec 02 18:17:18 crc 
kubenswrapper[4878]: I1202 18:17:18.901382 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zbpll" podUID="44e99cf7-e103-451a-91cc-0d610e4190a9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.903609 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-f59ml" event={"ID":"37ed9429-a67d-4168-b5e9-211eddd1abb1","Type":"ContainerStarted","Data":"36cb546546ac8c146f4222af3d4087284f676680f42290d4fe14202a91affdcd"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.929705 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6fbw" event={"ID":"a8e39ea4-f7e1-4df7-997b-2c3923f9ad6b","Type":"ContainerStarted","Data":"7f06a2f30c80a682f2abe99d8ef054ccbbb9dbb07027b7f57120c1a3d45b14da"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.946838 4878 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-m6f2h container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.947422 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m6f2h" podUID="d2265b6f-d33f-4bea-9934-1ece55c51d35" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.957417 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:18 crc kubenswrapper[4878]: E1202 18:17:18.962183 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:19.462150741 +0000 UTC m=+149.151769742 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.962650 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6c7h4" podStartSLOduration=124.962628877 podStartE2EDuration="2m4.962628877s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:18.924487114 +0000 UTC m=+148.614106005" watchObservedRunningTime="2025-12-02 18:17:18.962628877 +0000 UTC m=+148.652247758" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.964087 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m6f2h" Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.964122 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m6f2h" event={"ID":"d2265b6f-d33f-4bea-9934-1ece55c51d35","Type":"ContainerStarted","Data":"6b40bb900f3701d5bb4e580ee1390302c9383ea581f39f0e522dc496b42da7b1"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.964144 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-w6mf2" event={"ID":"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2","Type":"ContainerStarted","Data":"811666e579d2973eb8cb95e421cc6da55db33ad015216805f6d512c49a73efd7"} Dec 02 18:17:18 crc kubenswrapper[4878]: I1202 18:17:18.970143 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ngj62" event={"ID":"ea820fcf-7d34-4381-bafa-cbc53d3f7c86","Type":"ContainerStarted","Data":"3a04df764396aa0aba8901cb4c7f4734db5b2f1099a9844c14f81694f4d28629"} Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.004851 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-s5t58" event={"ID":"f324597d-0fdb-49dd-aaa2-9f71b2759bcb","Type":"ContainerStarted","Data":"94c95270be1edae0f782c3d3e9f4bba004440c6f777a669728865015c899680a"} Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.009269 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6fbw" podStartSLOduration=125.009224598 podStartE2EDuration="2m5.009224598s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:18.996355979 +0000 UTC m=+148.685974880" watchObservedRunningTime="2025-12-02 18:17:19.009224598 +0000 UTC m=+148.698843479" Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.011191 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/downloads-7954f5f757-zbpll" podStartSLOduration=125.011174334 podStartE2EDuration="2m5.011174334s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:18.962393479 +0000 UTC m=+148.652012380" watchObservedRunningTime="2025-12-02 18:17:19.011174334 +0000 UTC m=+148.700793215" Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.047601 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwtvf" event={"ID":"03b22451-6028-4e14-bd65-30d12b0242b3","Type":"ContainerStarted","Data":"3d63418c81697a9b321e53d1616580786e8a5c16efdbb880687bf70f7c8c7395"} Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.061838 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:19 crc kubenswrapper[4878]: E1202 18:17:19.084775 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:19.584742956 +0000 UTC m=+149.274361837 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.096477 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" event={"ID":"f45d88bc-0c44-4669-8394-081c7e4a8035","Type":"ContainerStarted","Data":"85523e6d389b64871a83627d1b808024c94425d3e8863efce0818fecac762ebe"} Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.102057 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-w6mf2" podStartSLOduration=125.102031206 podStartE2EDuration="2m5.102031206s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:19.035306148 +0000 UTC m=+148.724925039" watchObservedRunningTime="2025-12-02 18:17:19.102031206 +0000 UTC m=+148.791650087" Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.113396 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m6f2h" podStartSLOduration=125.113357673 podStartE2EDuration="2m5.113357673s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:19.101171017 +0000 UTC m=+148.790789918" watchObservedRunningTime="2025-12-02 18:17:19.113357673 +0000 UTC m=+148.802976554" Dec 02 18:17:19 crc 
kubenswrapper[4878]: I1202 18:17:19.139507 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-ngj62" podStartSLOduration=125.139477134 podStartE2EDuration="2m5.139477134s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:19.136034657 +0000 UTC m=+148.825653558" watchObservedRunningTime="2025-12-02 18:17:19.139477134 +0000 UTC m=+148.829096015" Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.153151 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-smztw" event={"ID":"6eb41b93-54fd-4ebe-b7ee-c8a1f1a5f6ab","Type":"ContainerStarted","Data":"c5e1696e871e2660ad79ce2494be621c21e1b4eb217512f2a81582d6972883b5"} Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.166927 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:19 crc kubenswrapper[4878]: E1202 18:17:19.167835 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:19.667626855 +0000 UTC m=+149.357245736 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.273158 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:19 crc kubenswrapper[4878]: E1202 18:17:19.273603 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:19.773586204 +0000 UTC m=+149.463205085 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.307485 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.375825 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:19 crc kubenswrapper[4878]: E1202 18:17:19.376301 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:19.876272139 +0000 UTC m=+149.565891020 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.479536 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:19 crc kubenswrapper[4878]: E1202 18:17:19.480166 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:19.980152666 +0000 UTC m=+149.669771547 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.585942 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:19 crc kubenswrapper[4878]: E1202 18:17:19.586147 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:20.086115403 +0000 UTC m=+149.775734284 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.586763 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:19 crc kubenswrapper[4878]: E1202 18:17:19.587156 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:20.087120438 +0000 UTC m=+149.776739319 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.652588 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:19 crc kubenswrapper[4878]: [-]has-synced failed: reason withheld Dec 02 18:17:19 crc kubenswrapper[4878]: [+]process-running ok Dec 02 18:17:19 crc kubenswrapper[4878]: healthz check failed Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.652676 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.689903 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:19 crc kubenswrapper[4878]: E1202 18:17:19.690368 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 18:17:20.190346372 +0000 UTC m=+149.879965253 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.791790 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:19 crc kubenswrapper[4878]: E1202 18:17:19.792483 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:20.292458978 +0000 UTC m=+149.982077859 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.894380 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:19 crc kubenswrapper[4878]: E1202 18:17:19.894954 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:20.394918187 +0000 UTC m=+150.084537078 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:19 crc kubenswrapper[4878]: I1202 18:17:19.996716 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:19 crc kubenswrapper[4878]: E1202 18:17:19.997349 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:20.497329153 +0000 UTC m=+150.186948034 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.097977 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:20 crc kubenswrapper[4878]: E1202 18:17:20.098177 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:20.598132064 +0000 UTC m=+150.287750945 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.098779 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:20 crc kubenswrapper[4878]: E1202 18:17:20.099262 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:20.599250213 +0000 UTC m=+150.288869104 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.164127 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ffktl" event={"ID":"6951ee33-1345-4ae2-906d-fdc7cea4dc64","Type":"ContainerStarted","Data":"010025a3ef377ddc398c888a1ec42b93bf3fff121ede7522611af14c686e32f5"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.164254 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ffktl" event={"ID":"6951ee33-1345-4ae2-906d-fdc7cea4dc64","Type":"ContainerStarted","Data":"4ba102f0d9e229becb1640b110a483012ac7baa1439aec5169746d1468eb7658"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.171388 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwtvf" event={"ID":"03b22451-6028-4e14-bd65-30d12b0242b3","Type":"ContainerStarted","Data":"55f1f340eccc7137ed887a35b710814abb5744b54d122b70c8e582fc2c60b7af"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.174595 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6c7h4" event={"ID":"cc0cb11b-bb32-413e-bbc3-179943e19f88","Type":"ContainerStarted","Data":"9bc1c7951c3b569c81c847cd4361025601c05b4d61302766db65c51f8c4de1bb"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.184058 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvddh" event={"ID":"14702b8a-c457-4730-919c-7340aea9738e","Type":"ContainerStarted","Data":"3654cb33358bfe64cf5de60163eabadc8758be50ae82643f4c11bbe37a1aae4f"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.191087 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6c7h4" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.192278 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-th582" event={"ID":"bfed419b-bf76-4045-9056-50ca33e1686b","Type":"ContainerStarted","Data":"88351dcb73f265529934067d64d2d417b9cd9e3b69b36800eb2255162b557315"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.200175 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:20 crc kubenswrapper[4878]: E1202 18:17:20.200397 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:20.700366944 +0000 UTC m=+150.389985825 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.200473 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:20 crc kubenswrapper[4878]: E1202 18:17:20.200860 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:20.700850151 +0000 UTC m=+150.390469102 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.203161 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dtnr7" event={"ID":"3b6800a2-0b99-4c3b-9e5e-833a245bd7be","Type":"ContainerStarted","Data":"73c4d4817c1057bfc88f3b996b0499e985e8a98d2c8b6ddd3dc1994d64a146ef"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.211308 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7hvht" event={"ID":"e5945b22-4fa1-4d4e-aa84-990d28f1e423","Type":"ContainerStarted","Data":"9ea130c3fc6ee188dd96701949924181cbe914d7a6f3b64090ca121cdc4007d9"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.219654 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf" event={"ID":"298c0078-6d02-4963-a1fb-0f6713e6d369","Type":"ContainerStarted","Data":"3225532b534ec187daab9b2536a58912dece1a41bb317aa0b7d22fc707ce2393"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.235896 4878 generic.go:334] "Generic (PLEG): container finished" podID="f45d88bc-0c44-4669-8394-081c7e4a8035" containerID="fd44bc5588f31528875364ac58770138b3d2d5fa92cf796245b11d307b51b720" exitCode=0 Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.235847 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" 
event={"ID":"f45d88bc-0c44-4669-8394-081c7e4a8035","Type":"ContainerDied","Data":"fd44bc5588f31528875364ac58770138b3d2d5fa92cf796245b11d307b51b720"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.272039 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-ffktl" podStartSLOduration=126.27199901 podStartE2EDuration="2m6.27199901s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:20.264516015 +0000 UTC m=+149.954134896" watchObservedRunningTime="2025-12-02 18:17:20.27199901 +0000 UTC m=+149.961617891" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.287225 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dmpvh" event={"ID":"8025c62a-fb3f-4566-8fd6-571ddb400da3","Type":"ContainerStarted","Data":"c1f20ca2644c8ddd250cc46487865d5496c60b8cc8c9d24414dba717d49494fd"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.287292 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dmpvh" event={"ID":"8025c62a-fb3f-4566-8fd6-571ddb400da3","Type":"ContainerStarted","Data":"655dd5deb108339abf1bfd92b78e45b0b2b2fa075563b7ab35a8f51068d64bcc"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.288049 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-dmpvh" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.304705 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:20 crc kubenswrapper[4878]: E1202 18:17:20.306123 4878 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:20.806101765 +0000 UTC m=+150.495720636 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.308177 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t7x7s" event={"ID":"3a9d4a2c-13b3-458c-92e8-058e9f1206dd","Type":"ContainerStarted","Data":"05105ac7d622c8ebb3de5ced17ecf591931e45d7c1ee4a25fc871dedad763be8"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.315167 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.315297 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.314320 4878 patch_prober.go:28] interesting pod/apiserver-76f77b778f-ffktl container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.315452 4878 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-apiserver/apiserver-76f77b778f-ffktl" podUID="6951ee33-1345-4ae2-906d-fdc7cea4dc64" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.316330 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zqt5l" event={"ID":"0e8bdc92-b1ec-4300-8895-fc8e804455da","Type":"ContainerStarted","Data":"9b709cfdab248bbc25e4aa021afccf84e91b0fd7d2f76d97ef06e861ffcc1e78"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.317020 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zqt5l" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.326002 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg" event={"ID":"1ba68bd9-d567-48e5-b296-0a95fdf406b5","Type":"ContainerStarted","Data":"13e5aa80d384a13b4436c81dc364fa8f8cd0d1ff605a44f64c43e478fb938b83"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.326057 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg" event={"ID":"1ba68bd9-d567-48e5-b296-0a95fdf406b5","Type":"ContainerStarted","Data":"af8911e2d7da7c81e956a7a97b739da8e1e3aa4e7e21d2a8aa48c93905cfde5b"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.363693 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-tjwcf" event={"ID":"25f1a12b-b1f6-49b1-ade9-018684cdd6f3","Type":"ContainerStarted","Data":"c329a69a94c1431bc1833543b9169fd87a0c151adb81b7e4c38091545010e660"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.391508 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-smztw" event={"ID":"6eb41b93-54fd-4ebe-b7ee-c8a1f1a5f6ab","Type":"ContainerStarted","Data":"6f10e0d48858194a027a671a4cc87af130e7eba8dd35304e90865b8b9c5b1b5d"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.391562 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-smztw" event={"ID":"6eb41b93-54fd-4ebe-b7ee-c8a1f1a5f6ab","Type":"ContainerStarted","Data":"c019aabfd1ee9c7bcdc095eafb0937c140fe6661bd411699b024b156b78c7c01"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.392342 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-smztw" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.402573 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zwtvf" podStartSLOduration=126.402546897 podStartE2EDuration="2m6.402546897s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:20.306361224 +0000 UTC m=+149.995980105" watchObservedRunningTime="2025-12-02 18:17:20.402546897 +0000 UTC m=+150.092165768" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.407616 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9vv2f" event={"ID":"fe86a337-8c96-4b07-b3b6-a97315fa1029","Type":"ContainerStarted","Data":"ff64f15056da38f14951cc1652aef81e150a1815f648631a4858ad474fb94b1e"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.408996 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:20 crc kubenswrapper[4878]: E1202 18:17:20.409277 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:20.909261807 +0000 UTC m=+150.598880688 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.414183 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9vv2f" event={"ID":"fe86a337-8c96-4b07-b3b6-a97315fa1029","Type":"ContainerStarted","Data":"6c8ccd35976540b15654eb2a685b078a6020d51349fc4190756180ef60bff35f"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.462570 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ngj62" event={"ID":"ea820fcf-7d34-4381-bafa-cbc53d3f7c86","Type":"ContainerStarted","Data":"051e8bc3d31e58c0b9f079b729423153171a0b56b7bb8b461e8e2b34f7af91d0"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.485377 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpxp8" 
event={"ID":"883f2381-011d-410d-84cd-7ed27e099ebf","Type":"ContainerStarted","Data":"be7353c48f08833ab05576bcfff7dee49316a0be9a1cbab77b4215f38a3e83cc"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.485442 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpxp8" event={"ID":"883f2381-011d-410d-84cd-7ed27e099ebf","Type":"ContainerStarted","Data":"8fea27c9630cfaaf7ac2b7d1f6c5e0c0ba3fb3a42a28b2886f3a29f391275d40"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.501505 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf" podStartSLOduration=126.501486215 podStartE2EDuration="2m6.501486215s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:20.500735549 +0000 UTC m=+150.190354430" watchObservedRunningTime="2025-12-02 18:17:20.501486215 +0000 UTC m=+150.191105096" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.517215 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:20 crc kubenswrapper[4878]: E1202 18:17:20.518976 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:21.018953632 +0000 UTC m=+150.708572513 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.544260 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f" event={"ID":"54ef3152-6020-4822-ab68-b4c3ba44dc1c","Type":"ContainerStarted","Data":"aea116d746bd453f32df62e32bd2529d49989f794223f9f1a7b45227ee679ad2"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.544314 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f" event={"ID":"54ef3152-6020-4822-ab68-b4c3ba44dc1c","Type":"ContainerStarted","Data":"dc583b2fce1cf62b576ec81e581c20d0366ce4ae3d63756bb1aeab829753df0a"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.585686 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7hvht" podStartSLOduration=8.585662889 podStartE2EDuration="8.585662889s" podCreationTimestamp="2025-12-02 18:17:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:20.584066714 +0000 UTC m=+150.273685615" watchObservedRunningTime="2025-12-02 18:17:20.585662889 +0000 UTC m=+150.275281770" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.593534 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bv27h" 
event={"ID":"f047f11c-e08f-47b0-965b-0cc2e8d2bd38","Type":"ContainerStarted","Data":"46b72cca1a8de9527dc535ed485d58c9f83f980706162169b9915bb959a4732a"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.618715 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:20 crc kubenswrapper[4878]: E1202 18:17:20.619357 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:21.119342849 +0000 UTC m=+150.808961720 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.621448 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" event={"ID":"cbe20149-0391-43f7-957b-ad3b18f54736","Type":"ContainerStarted","Data":"6411c7a373cbd112bad7fce17a79155865c26b2ee045adf59323668b0c9dc5f7"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.621485 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" Dec 02 18:17:20 crc 
kubenswrapper[4878]: I1202 18:17:20.656438 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b6fbw" event={"ID":"a8e39ea4-f7e1-4df7-997b-2c3923f9ad6b","Type":"ContainerStarted","Data":"d4789c87f946881e65bf132f877d0a549651e6710cfe1a62964efcd7e91b1034"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.668686 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:20 crc kubenswrapper[4878]: [-]has-synced failed: reason withheld Dec 02 18:17:20 crc kubenswrapper[4878]: [+]process-running ok Dec 02 18:17:20 crc kubenswrapper[4878]: healthz check failed Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.668812 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.704575 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dtnr7" podStartSLOduration=126.704527977 podStartE2EDuration="2m6.704527977s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:20.699850158 +0000 UTC m=+150.389469039" watchObservedRunningTime="2025-12-02 18:17:20.704527977 +0000 UTC m=+150.394146858" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.711753 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-s5t58" 
event={"ID":"f324597d-0fdb-49dd-aaa2-9f71b2759bcb","Type":"ContainerStarted","Data":"a59d6453e262199aaada1cbcbd157da651bb12d9267ec582a827981fe694f094"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.723974 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:20 crc kubenswrapper[4878]: E1202 18:17:20.725216 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:21.225194243 +0000 UTC m=+150.914813124 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.731100 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-f59ml" event={"ID":"37ed9429-a67d-4168-b5e9-211eddd1abb1","Type":"ContainerStarted","Data":"6ca0a7b0231cfb76edee0cbd7ddbe8804d5f339e1946fe0ca5f47da7daf9f328"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.784983 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" 
event={"ID":"b32f0172-888e-4c74-9e70-becd273d49d8","Type":"ContainerStarted","Data":"e0c17958279bc263233295306e02f1597037bbfd801a0292a460fa43666db8e6"} Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.794388 4878 patch_prober.go:28] interesting pod/downloads-7954f5f757-zbpll container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.794518 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zbpll" podUID="44e99cf7-e103-451a-91cc-0d610e4190a9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.801739 4878 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-5p678 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.801817 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-5p678" podUID="38d2716d-f5de-4242-a170-624490092b98" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.803613 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.812368 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console-operator/console-operator-58897d9998-wdjfj" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.815536 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-th582" podStartSLOduration=125.815498206 podStartE2EDuration="2m5.815498206s" podCreationTimestamp="2025-12-02 18:15:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:20.799069576 +0000 UTC m=+150.488688457" watchObservedRunningTime="2025-12-02 18:17:20.815498206 +0000 UTC m=+150.505117087" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.824093 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.826086 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:20 crc kubenswrapper[4878]: E1202 18:17:20.830552 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:21.330513098 +0000 UTC m=+151.020131979 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.843748 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m6f2h" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.883740 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zvddh" podStartSLOduration=126.883716075 podStartE2EDuration="2m6.883716075s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:20.882703721 +0000 UTC m=+150.572322602" watchObservedRunningTime="2025-12-02 18:17:20.883716075 +0000 UTC m=+150.573334956" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.928930 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:20 crc kubenswrapper[4878]: E1202 18:17:20.931560 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 18:17:21.431537607 +0000 UTC m=+151.121156488 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.993278 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-t7x7s" podStartSLOduration=126.993259225 podStartE2EDuration="2m6.993259225s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:20.992879262 +0000 UTC m=+150.682498143" watchObservedRunningTime="2025-12-02 18:17:20.993259225 +0000 UTC m=+150.682878116" Dec 02 18:17:20 crc kubenswrapper[4878]: I1202 18:17:20.993871 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wpxp8" podStartSLOduration=126.993866136 podStartE2EDuration="2m6.993866136s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:20.920842362 +0000 UTC m=+150.610461243" watchObservedRunningTime="2025-12-02 18:17:20.993866136 +0000 UTC m=+150.683485017" Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.033081 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:21 crc kubenswrapper[4878]: E1202 18:17:21.033568 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:21.533549571 +0000 UTC m=+151.223168452 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.136257 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:21 crc kubenswrapper[4878]: E1202 18:17:21.136638 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:21.636615429 +0000 UTC m=+151.326234310 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.155820 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mtlxg" podStartSLOduration=127.155800054 podStartE2EDuration="2m7.155800054s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:21.152795952 +0000 UTC m=+150.842414833" watchObservedRunningTime="2025-12-02 18:17:21.155800054 +0000 UTC m=+150.845418935" Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.241103 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:21 crc kubenswrapper[4878]: E1202 18:17:21.241568 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:21.741551922 +0000 UTC m=+151.431170803 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.300355 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bv27h" podStartSLOduration=126.300335929 podStartE2EDuration="2m6.300335929s" podCreationTimestamp="2025-12-02 18:15:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:21.298726994 +0000 UTC m=+150.988345875" watchObservedRunningTime="2025-12-02 18:17:21.300335929 +0000 UTC m=+150.989954810" Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.342829 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:21 crc kubenswrapper[4878]: E1202 18:17:21.343218 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:21.843201922 +0000 UTC m=+151.532820803 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.370947 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-fbkfc" podStartSLOduration=127.370932079 podStartE2EDuration="2m7.370932079s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:21.369948296 +0000 UTC m=+151.059567177" watchObservedRunningTime="2025-12-02 18:17:21.370932079 +0000 UTC m=+151.060550960" Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.444133 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:21 crc kubenswrapper[4878]: E1202 18:17:21.444682 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:21.944657426 +0000 UTC m=+151.634276307 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.545387 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:21 crc kubenswrapper[4878]: E1202 18:17:21.545599 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:22.045556581 +0000 UTC m=+151.735175462 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.545668 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:21 crc kubenswrapper[4878]: E1202 18:17:21.546071 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:22.046063289 +0000 UTC m=+151.735682170 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.591253 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d2f8f" podStartSLOduration=127.59121238 podStartE2EDuration="2m7.59121238s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:21.590432953 +0000 UTC m=+151.280051834" watchObservedRunningTime="2025-12-02 18:17:21.59121238 +0000 UTC m=+151.280831261" Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.593883 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zqt5l" podStartSLOduration=127.593875281 podStartE2EDuration="2m7.593875281s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:21.439338505 +0000 UTC m=+151.128957386" watchObservedRunningTime="2025-12-02 18:17:21.593875281 +0000 UTC m=+151.283494162" Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.619876 4878 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-qprtc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:5443/healthz\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.619959 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" podUID="cbe20149-0391-43f7-957b-ad3b18f54736" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.42:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.647200 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:21 crc kubenswrapper[4878]: E1202 18:17:21.647565 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:22.147549814 +0000 UTC m=+151.837168695 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.650695 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:21 crc kubenswrapper[4878]: [-]has-synced failed: reason withheld Dec 02 18:17:21 crc kubenswrapper[4878]: [+]process-running ok Dec 02 18:17:21 crc kubenswrapper[4878]: healthz check failed Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.650739 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.706343 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-9vv2f" podStartSLOduration=127.706327901 podStartE2EDuration="2m7.706327901s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:21.704665683 +0000 UTC m=+151.394284564" watchObservedRunningTime="2025-12-02 18:17:21.706327901 +0000 UTC m=+151.395946782" Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.748930 4878 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:21 crc kubenswrapper[4878]: E1202 18:17:21.749294 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:22.249278596 +0000 UTC m=+151.938897477 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.782335 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" event={"ID":"f45d88bc-0c44-4669-8394-081c7e4a8035","Type":"ContainerStarted","Data":"3b075cf2e13ebc908425a98010c927ae1c4d6ac94f72af3703cd3af61e6e2334"} Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.790464 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-f59ml" event={"ID":"37ed9429-a67d-4168-b5e9-211eddd1abb1","Type":"ContainerStarted","Data":"65cc03e928b855798c833640cdd5125b9bc802ad94b56196809302447f053488"} Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.790502 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-f59ml" 
event={"ID":"37ed9429-a67d-4168-b5e9-211eddd1abb1","Type":"ContainerStarted","Data":"8d474df18250ffafff14d273541bf58018907ce3c21110c5eb36b18bded38f95"} Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.793521 4878 patch_prober.go:28] interesting pod/downloads-7954f5f757-zbpll container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.793561 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zbpll" podUID="44e99cf7-e103-451a-91cc-0d610e4190a9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.809539 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-5p678" Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.852913 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.853217 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:17:21 crc kubenswrapper[4878]: E1202 18:17:21.871553 4878 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:22.371527981 +0000 UTC m=+152.061146862 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.911685 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.965287 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.965348 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.965383 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.965409 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:17:21 crc kubenswrapper[4878]: E1202 18:17:21.969720 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:22.469697912 +0000 UTC m=+152.159316792 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.971829 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.972874 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" podStartSLOduration=127.972849689 podStartE2EDuration="2m7.972849689s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:21.794965596 +0000 UTC m=+151.484584477" watchObservedRunningTime="2025-12-02 18:17:21.972849689 +0000 UTC m=+151.662468570" Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.988795 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:17:21 crc kubenswrapper[4878]: I1202 18:17:21.992892 4878 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.041293 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sb7zn"] Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.052170 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sb7zn" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.061855 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.068189 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:22 crc kubenswrapper[4878]: E1202 18:17:22.068357 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:22.568328539 +0000 UTC m=+152.257947420 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.068443 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:22 crc kubenswrapper[4878]: E1202 18:17:22.068900 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:22.568879658 +0000 UTC m=+152.258498539 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.072505 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dmpvh" podStartSLOduration=10.072483361 podStartE2EDuration="10.072483361s" podCreationTimestamp="2025-12-02 18:17:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:22.054326981 +0000 UTC m=+151.743945852" watchObservedRunningTime="2025-12-02 18:17:22.072483361 +0000 UTC m=+151.762102242" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.077495 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sb7zn"] Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.169641 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.169838 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d2e014-a578-4394-969c-109c2a260296-utilities\") pod \"community-operators-sb7zn\" (UID: \"75d2e014-a578-4394-969c-109c2a260296\") " pod="openshift-marketplace/community-operators-sb7zn" Dec 02 
18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.169861 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q44qr\" (UniqueName: \"kubernetes.io/projected/75d2e014-a578-4394-969c-109c2a260296-kube-api-access-q44qr\") pod \"community-operators-sb7zn\" (UID: \"75d2e014-a578-4394-969c-109c2a260296\") " pod="openshift-marketplace/community-operators-sb7zn" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.169894 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d2e014-a578-4394-969c-109c2a260296-catalog-content\") pod \"community-operators-sb7zn\" (UID: \"75d2e014-a578-4394-969c-109c2a260296\") " pod="openshift-marketplace/community-operators-sb7zn" Dec 02 18:17:22 crc kubenswrapper[4878]: E1202 18:17:22.170000 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:22.6699855 +0000 UTC m=+152.359604381 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.223383 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-s5t58" podStartSLOduration=128.223352722 podStartE2EDuration="2m8.223352722s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:22.16910576 +0000 UTC m=+151.858724651" watchObservedRunningTime="2025-12-02 18:17:22.223352722 +0000 UTC m=+151.912971603" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.224352 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kwf9n"] Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.225500 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kwf9n" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.232373 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.251588 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-smztw" podStartSLOduration=128.251564035 podStartE2EDuration="2m8.251564035s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:22.231859953 +0000 UTC m=+151.921478824" watchObservedRunningTime="2025-12-02 18:17:22.251564035 +0000 UTC m=+151.941182916" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.252800 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kwf9n"] Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.261458 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.270727 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.271806 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsxxr\" (UniqueName: \"kubernetes.io/projected/e80c5f56-57a3-4778-8382-473cd7678252-kube-api-access-bsxxr\") pod \"certified-operators-kwf9n\" (UID: \"e80c5f56-57a3-4778-8382-473cd7678252\") " pod="openshift-marketplace/certified-operators-kwf9n" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.271928 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d2e014-a578-4394-969c-109c2a260296-utilities\") pod \"community-operators-sb7zn\" (UID: \"75d2e014-a578-4394-969c-109c2a260296\") " pod="openshift-marketplace/community-operators-sb7zn" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.272007 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q44qr\" (UniqueName: \"kubernetes.io/projected/75d2e014-a578-4394-969c-109c2a260296-kube-api-access-q44qr\") pod \"community-operators-sb7zn\" (UID: \"75d2e014-a578-4394-969c-109c2a260296\") " pod="openshift-marketplace/community-operators-sb7zn" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.272098 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d2e014-a578-4394-969c-109c2a260296-catalog-content\") pod \"community-operators-sb7zn\" (UID: \"75d2e014-a578-4394-969c-109c2a260296\") " pod="openshift-marketplace/community-operators-sb7zn" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.272222 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e80c5f56-57a3-4778-8382-473cd7678252-utilities\") pod 
\"certified-operators-kwf9n\" (UID: \"e80c5f56-57a3-4778-8382-473cd7678252\") " pod="openshift-marketplace/certified-operators-kwf9n" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.272323 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e80c5f56-57a3-4778-8382-473cd7678252-catalog-content\") pod \"certified-operators-kwf9n\" (UID: \"e80c5f56-57a3-4778-8382-473cd7678252\") " pod="openshift-marketplace/certified-operators-kwf9n" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.272417 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:22 crc kubenswrapper[4878]: E1202 18:17:22.272751 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:22.772738098 +0000 UTC m=+152.462356979 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.273350 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d2e014-a578-4394-969c-109c2a260296-utilities\") pod \"community-operators-sb7zn\" (UID: \"75d2e014-a578-4394-969c-109c2a260296\") " pod="openshift-marketplace/community-operators-sb7zn" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.273919 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d2e014-a578-4394-969c-109c2a260296-catalog-content\") pod \"community-operators-sb7zn\" (UID: \"75d2e014-a578-4394-969c-109c2a260296\") " pod="openshift-marketplace/community-operators-sb7zn" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.275885 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.342053 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q44qr\" (UniqueName: \"kubernetes.io/projected/75d2e014-a578-4394-969c-109c2a260296-kube-api-access-q44qr\") pod \"community-operators-sb7zn\" (UID: \"75d2e014-a578-4394-969c-109c2a260296\") " pod="openshift-marketplace/community-operators-sb7zn" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.377788 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.377832 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sb7zn" Dec 02 18:17:22 crc kubenswrapper[4878]: E1202 18:17:22.377996 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:22.877966211 +0000 UTC m=+152.567585092 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.378479 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.378573 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsxxr\" (UniqueName: \"kubernetes.io/projected/e80c5f56-57a3-4778-8382-473cd7678252-kube-api-access-bsxxr\") pod \"certified-operators-kwf9n\" (UID: \"e80c5f56-57a3-4778-8382-473cd7678252\") " pod="openshift-marketplace/certified-operators-kwf9n" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.378687 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e80c5f56-57a3-4778-8382-473cd7678252-utilities\") pod \"certified-operators-kwf9n\" (UID: \"e80c5f56-57a3-4778-8382-473cd7678252\") " pod="openshift-marketplace/certified-operators-kwf9n" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.378757 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e80c5f56-57a3-4778-8382-473cd7678252-catalog-content\") pod \"certified-operators-kwf9n\" (UID: 
\"e80c5f56-57a3-4778-8382-473cd7678252\") " pod="openshift-marketplace/certified-operators-kwf9n" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.379280 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e80c5f56-57a3-4778-8382-473cd7678252-catalog-content\") pod \"certified-operators-kwf9n\" (UID: \"e80c5f56-57a3-4778-8382-473cd7678252\") " pod="openshift-marketplace/certified-operators-kwf9n" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.379599 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e80c5f56-57a3-4778-8382-473cd7678252-utilities\") pod \"certified-operators-kwf9n\" (UID: \"e80c5f56-57a3-4778-8382-473cd7678252\") " pod="openshift-marketplace/certified-operators-kwf9n" Dec 02 18:17:22 crc kubenswrapper[4878]: E1202 18:17:22.379749 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:22.879735491 +0000 UTC m=+152.569354462 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.427612 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cdp4z"] Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.429020 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cdp4z" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.440420 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsxxr\" (UniqueName: \"kubernetes.io/projected/e80c5f56-57a3-4778-8382-473cd7678252-kube-api-access-bsxxr\") pod \"certified-operators-kwf9n\" (UID: \"e80c5f56-57a3-4778-8382-473cd7678252\") " pod="openshift-marketplace/certified-operators-kwf9n" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.440787 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" podStartSLOduration=127.440764155 podStartE2EDuration="2m7.440764155s" podCreationTimestamp="2025-12-02 18:15:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:22.420898956 +0000 UTC m=+152.110517837" watchObservedRunningTime="2025-12-02 18:17:22.440764155 +0000 UTC m=+152.130383036" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.480011 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.480191 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q77sh\" (UniqueName: \"kubernetes.io/projected/bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd-kube-api-access-q77sh\") pod \"community-operators-cdp4z\" (UID: \"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd\") " pod="openshift-marketplace/community-operators-cdp4z" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.480268 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd-utilities\") pod \"community-operators-cdp4z\" (UID: \"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd\") " pod="openshift-marketplace/community-operators-cdp4z" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.480308 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd-catalog-content\") pod \"community-operators-cdp4z\" (UID: \"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd\") " pod="openshift-marketplace/community-operators-cdp4z" Dec 02 18:17:22 crc kubenswrapper[4878]: E1202 18:17:22.480450 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:22.980432589 +0000 UTC m=+152.670051470 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.483366 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cdp4z"] Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.574623 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kwf9n" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.587999 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd-catalog-content\") pod \"community-operators-cdp4z\" (UID: \"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd\") " pod="openshift-marketplace/community-operators-cdp4z" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.588088 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.588110 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q77sh\" (UniqueName: \"kubernetes.io/projected/bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd-kube-api-access-q77sh\") pod \"community-operators-cdp4z\" (UID: \"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd\") " pod="openshift-marketplace/community-operators-cdp4z" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.588135 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd-utilities\") pod \"community-operators-cdp4z\" (UID: \"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd\") " pod="openshift-marketplace/community-operators-cdp4z" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.588601 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd-utilities\") pod \"community-operators-cdp4z\" 
(UID: \"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd\") " pod="openshift-marketplace/community-operators-cdp4z" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.588817 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd-catalog-content\") pod \"community-operators-cdp4z\" (UID: \"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd\") " pod="openshift-marketplace/community-operators-cdp4z" Dec 02 18:17:22 crc kubenswrapper[4878]: E1202 18:17:22.589083 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:23.089071668 +0000 UTC m=+152.778690549 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.620076 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-55w98"] Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.621112 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-55w98" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.635066 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q77sh\" (UniqueName: \"kubernetes.io/projected/bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd-kube-api-access-q77sh\") pod \"community-operators-cdp4z\" (UID: \"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd\") " pod="openshift-marketplace/community-operators-cdp4z" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.653202 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-55w98"] Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.654540 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:22 crc kubenswrapper[4878]: [-]has-synced failed: reason withheld Dec 02 18:17:22 crc kubenswrapper[4878]: [+]process-running ok Dec 02 18:17:22 crc kubenswrapper[4878]: healthz check failed Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.654618 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.689729 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.689984 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa-catalog-content\") pod \"certified-operators-55w98\" (UID: \"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa\") " pod="openshift-marketplace/certified-operators-55w98" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.690086 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa-utilities\") pod \"certified-operators-55w98\" (UID: \"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa\") " pod="openshift-marketplace/certified-operators-55w98" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.690138 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmj7r\" (UniqueName: \"kubernetes.io/projected/85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa-kube-api-access-nmj7r\") pod \"certified-operators-55w98\" (UID: \"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa\") " pod="openshift-marketplace/certified-operators-55w98" Dec 02 18:17:22 crc kubenswrapper[4878]: E1202 18:17:22.690446 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:23.190415148 +0000 UTC m=+152.880034029 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.764208 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cdp4z" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.792914 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.793493 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa-utilities\") pod \"certified-operators-55w98\" (UID: \"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa\") " pod="openshift-marketplace/certified-operators-55w98" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.793538 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmj7r\" (UniqueName: \"kubernetes.io/projected/85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa-kube-api-access-nmj7r\") pod \"certified-operators-55w98\" (UID: \"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa\") " pod="openshift-marketplace/certified-operators-55w98" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.793587 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa-catalog-content\") pod \"certified-operators-55w98\" (UID: \"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa\") " pod="openshift-marketplace/certified-operators-55w98" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.794104 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa-catalog-content\") pod \"certified-operators-55w98\" (UID: \"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa\") " pod="openshift-marketplace/certified-operators-55w98" Dec 02 18:17:22 crc kubenswrapper[4878]: E1202 18:17:22.794647 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:23.294608206 +0000 UTC m=+152.984227087 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.795443 4878 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-qprtc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.797330 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" podUID="cbe20149-0391-43f7-957b-ad3b18f54736" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.42:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.794711 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa-utilities\") pod \"certified-operators-55w98\" (UID: \"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa\") " pod="openshift-marketplace/certified-operators-55w98" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.888663 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zqt5l" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.893609 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nmj7r\" (UniqueName: \"kubernetes.io/projected/85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa-kube-api-access-nmj7r\") pod \"certified-operators-55w98\" (UID: \"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa\") " pod="openshift-marketplace/certified-operators-55w98" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.894192 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:22 crc kubenswrapper[4878]: E1202 18:17:22.894386 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:23.394351631 +0000 UTC m=+153.083970512 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.894532 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:22 crc kubenswrapper[4878]: E1202 18:17:22.896295 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:23.396270607 +0000 UTC m=+153.085889488 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.959851 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-55w98" Dec 02 18:17:22 crc kubenswrapper[4878]: I1202 18:17:22.996188 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:22 crc kubenswrapper[4878]: E1202 18:17:22.998155 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:23.498134134 +0000 UTC m=+153.187753015 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.041798 4878 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 02 18:17:23 crc kubenswrapper[4878]: E1202 18:17:23.100593 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:23.600576522 +0000 UTC m=+153.290195413 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.100775 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.201939 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:23 crc kubenswrapper[4878]: E1202 18:17:23.202414 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:23.702392808 +0000 UTC m=+153.392011689 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.305072 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:23 crc kubenswrapper[4878]: E1202 18:17:23.305410 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:23.805396384 +0000 UTC m=+153.495015265 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.406334 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:23 crc kubenswrapper[4878]: E1202 18:17:23.406654 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:23.90662267 +0000 UTC m=+153.596241551 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.406761 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:23 crc kubenswrapper[4878]: E1202 18:17:23.407174 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:23.907159019 +0000 UTC m=+153.596777900 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.507419 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:23 crc kubenswrapper[4878]: E1202 18:17:23.507634 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:24.007595468 +0000 UTC m=+153.697214349 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.507728 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:23 crc kubenswrapper[4878]: E1202 18:17:23.508163 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:24.008154147 +0000 UTC m=+153.697773028 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.557520 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kwf9n"] Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.609122 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:23 crc kubenswrapper[4878]: E1202 18:17:23.609354 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:24.109325671 +0000 UTC m=+153.798944552 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.609880 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:23 crc kubenswrapper[4878]: E1202 18:17:23.610543 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:24.110510801 +0000 UTC m=+153.800129682 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.658819 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.668524 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.668520 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:23 crc kubenswrapper[4878]: [-]has-synced failed: reason withheld Dec 02 18:17:23 crc kubenswrapper[4878]: [+]process-running ok Dec 02 18:17:23 crc kubenswrapper[4878]: healthz check failed Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.669180 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.672056 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.672410 4878 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.694334 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.717663 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:23 crc kubenswrapper[4878]: E1202 18:17:23.743204 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:24.243171261 +0000 UTC m=+153.932790142 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.743518 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.743733 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e46e6c94-cea3-4688-ae6c-05520a9ac2aa-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e46e6c94-cea3-4688-ae6c-05520a9ac2aa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.743822 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e46e6c94-cea3-4688-ae6c-05520a9ac2aa-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e46e6c94-cea3-4688-ae6c-05520a9ac2aa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.743853 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.743905 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:17:23 crc kubenswrapper[4878]: E1202 18:17:23.744311 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 18:17:24.24430247 +0000 UTC m=+153.933921351 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6hxvm" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.844813 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.845329 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e46e6c94-cea3-4688-ae6c-05520a9ac2aa-kube-api-access\") pod \"revision-pruner-9-crc\" 
(UID: \"e46e6c94-cea3-4688-ae6c-05520a9ac2aa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.845367 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e46e6c94-cea3-4688-ae6c-05520a9ac2aa-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e46e6c94-cea3-4688-ae6c-05520a9ac2aa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.845466 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e46e6c94-cea3-4688-ae6c-05520a9ac2aa-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e46e6c94-cea3-4688-ae6c-05520a9ac2aa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 18:17:23 crc kubenswrapper[4878]: E1202 18:17:23.845532 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 18:17:24.345519044 +0000 UTC m=+154.035137925 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.892524 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"eca1ef5b4091e3f700fa17999ac4fe88b7b11796740d835dc1711d7f61e47437"} Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.902350 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-f59ml" event={"ID":"37ed9429-a67d-4168-b5e9-211eddd1abb1","Type":"ContainerStarted","Data":"fc74c02f045b43f3eeecd36602f2875b3c4bbb214e5cb8e01a47fe8a11af7323"} Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.906036 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e46e6c94-cea3-4688-ae6c-05520a9ac2aa-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e46e6c94-cea3-4688-ae6c-05520a9ac2aa\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.912771 4878 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-02T18:17:23.041852597Z","Handler":null,"Name":""} Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.920691 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwf9n" 
event={"ID":"e80c5f56-57a3-4778-8382-473cd7678252","Type":"ContainerStarted","Data":"1640ce8adf5c9b2566eb4114ad2f33a2e9a9acdeb1e91a22ecbf83b5215b7952"} Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.936165 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sb7zn"] Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.945796 4878 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.945851 4878 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.947723 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.956169 4878 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 02 18:17:23 crc kubenswrapper[4878]: I1202 18:17:23.956251 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.025054 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-f59ml" podStartSLOduration=12.025007762 podStartE2EDuration="12.025007762s" podCreationTimestamp="2025-12-02 18:17:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:23.982260524 +0000 UTC m=+153.671879405" watchObservedRunningTime="2025-12-02 18:17:24.025007762 +0000 UTC m=+153.714626643" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.033779 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7m7gk"] Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.035261 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7m7gk" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.041800 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.073361 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.073409 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7m7gk"] Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.100069 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6hxvm\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.111130 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-55w98"] Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.147305 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cdp4z"] Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.176890 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.177272 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3d67205-fea8-475d-b3da-bd4fc55a58c4-utilities\") pod \"redhat-marketplace-7m7gk\" (UID: \"c3d67205-fea8-475d-b3da-bd4fc55a58c4\") " pod="openshift-marketplace/redhat-marketplace-7m7gk" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.177347 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3d67205-fea8-475d-b3da-bd4fc55a58c4-catalog-content\") pod \"redhat-marketplace-7m7gk\" (UID: \"c3d67205-fea8-475d-b3da-bd4fc55a58c4\") " pod="openshift-marketplace/redhat-marketplace-7m7gk" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.177421 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw6dd\" (UniqueName: \"kubernetes.io/projected/c3d67205-fea8-475d-b3da-bd4fc55a58c4-kube-api-access-rw6dd\") pod \"redhat-marketplace-7m7gk\" (UID: \"c3d67205-fea8-475d-b3da-bd4fc55a58c4\") " pod="openshift-marketplace/redhat-marketplace-7m7gk" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.180146 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.279279 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3d67205-fea8-475d-b3da-bd4fc55a58c4-utilities\") pod \"redhat-marketplace-7m7gk\" (UID: \"c3d67205-fea8-475d-b3da-bd4fc55a58c4\") " pod="openshift-marketplace/redhat-marketplace-7m7gk" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.279341 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3d67205-fea8-475d-b3da-bd4fc55a58c4-catalog-content\") pod \"redhat-marketplace-7m7gk\" (UID: \"c3d67205-fea8-475d-b3da-bd4fc55a58c4\") " pod="openshift-marketplace/redhat-marketplace-7m7gk" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.279380 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw6dd\" (UniqueName: \"kubernetes.io/projected/c3d67205-fea8-475d-b3da-bd4fc55a58c4-kube-api-access-rw6dd\") pod \"redhat-marketplace-7m7gk\" (UID: 
\"c3d67205-fea8-475d-b3da-bd4fc55a58c4\") " pod="openshift-marketplace/redhat-marketplace-7m7gk" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.280111 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3d67205-fea8-475d-b3da-bd4fc55a58c4-utilities\") pod \"redhat-marketplace-7m7gk\" (UID: \"c3d67205-fea8-475d-b3da-bd4fc55a58c4\") " pod="openshift-marketplace/redhat-marketplace-7m7gk" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.285125 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3d67205-fea8-475d-b3da-bd4fc55a58c4-catalog-content\") pod \"redhat-marketplace-7m7gk\" (UID: \"c3d67205-fea8-475d-b3da-bd4fc55a58c4\") " pod="openshift-marketplace/redhat-marketplace-7m7gk" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.332280 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw6dd\" (UniqueName: \"kubernetes.io/projected/c3d67205-fea8-475d-b3da-bd4fc55a58c4-kube-api-access-rw6dd\") pod \"redhat-marketplace-7m7gk\" (UID: \"c3d67205-fea8-475d-b3da-bd4fc55a58c4\") " pod="openshift-marketplace/redhat-marketplace-7m7gk" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.378941 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.410558 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5hp2k"] Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.411926 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5hp2k" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.431749 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hp2k"] Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.453685 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7m7gk" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.486298 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wk5k\" (UniqueName: \"kubernetes.io/projected/c44c93ec-f58d-410e-8d64-1888c470cffe-kube-api-access-9wk5k\") pod \"redhat-marketplace-5hp2k\" (UID: \"c44c93ec-f58d-410e-8d64-1888c470cffe\") " pod="openshift-marketplace/redhat-marketplace-5hp2k" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.486349 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c44c93ec-f58d-410e-8d64-1888c470cffe-catalog-content\") pod \"redhat-marketplace-5hp2k\" (UID: \"c44c93ec-f58d-410e-8d64-1888c470cffe\") " pod="openshift-marketplace/redhat-marketplace-5hp2k" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.486384 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c44c93ec-f58d-410e-8d64-1888c470cffe-utilities\") pod \"redhat-marketplace-5hp2k\" (UID: \"c44c93ec-f58d-410e-8d64-1888c470cffe\") " 
pod="openshift-marketplace/redhat-marketplace-5hp2k" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.596124 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wk5k\" (UniqueName: \"kubernetes.io/projected/c44c93ec-f58d-410e-8d64-1888c470cffe-kube-api-access-9wk5k\") pod \"redhat-marketplace-5hp2k\" (UID: \"c44c93ec-f58d-410e-8d64-1888c470cffe\") " pod="openshift-marketplace/redhat-marketplace-5hp2k" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.596205 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c44c93ec-f58d-410e-8d64-1888c470cffe-catalog-content\") pod \"redhat-marketplace-5hp2k\" (UID: \"c44c93ec-f58d-410e-8d64-1888c470cffe\") " pod="openshift-marketplace/redhat-marketplace-5hp2k" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.596260 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c44c93ec-f58d-410e-8d64-1888c470cffe-utilities\") pod \"redhat-marketplace-5hp2k\" (UID: \"c44c93ec-f58d-410e-8d64-1888c470cffe\") " pod="openshift-marketplace/redhat-marketplace-5hp2k" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.596844 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c44c93ec-f58d-410e-8d64-1888c470cffe-utilities\") pod \"redhat-marketplace-5hp2k\" (UID: \"c44c93ec-f58d-410e-8d64-1888c470cffe\") " pod="openshift-marketplace/redhat-marketplace-5hp2k" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.596924 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c44c93ec-f58d-410e-8d64-1888c470cffe-catalog-content\") pod \"redhat-marketplace-5hp2k\" (UID: \"c44c93ec-f58d-410e-8d64-1888c470cffe\") " pod="openshift-marketplace/redhat-marketplace-5hp2k" 
Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.644612 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wk5k\" (UniqueName: \"kubernetes.io/projected/c44c93ec-f58d-410e-8d64-1888c470cffe-kube-api-access-9wk5k\") pod \"redhat-marketplace-5hp2k\" (UID: \"c44c93ec-f58d-410e-8d64-1888c470cffe\") " pod="openshift-marketplace/redhat-marketplace-5hp2k" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.656406 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:24 crc kubenswrapper[4878]: [-]has-synced failed: reason withheld Dec 02 18:17:24 crc kubenswrapper[4878]: [+]process-running ok Dec 02 18:17:24 crc kubenswrapper[4878]: healthz check failed Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.656940 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.762783 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5hp2k" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.856250 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.961623 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.962570 4878 generic.go:334] "Generic (PLEG): container finished" podID="85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa" containerID="e2356328ff7de8c147ca517657ee980b6f970ba9f6c370539c717bddc162a7de" exitCode=0 Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.962792 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6hxvm"] Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.962816 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7m7gk"] Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.962829 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"bf3e99fcea4cfa3a14a2b586618479fa2ec3935acb4f9966d47baa5241ef9699"} Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.962846 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8fcce7ea8629525349921a53a827f0ec383a1550f979f57a25c3176dac5e19e6"} Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.962860 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"e46e6c94-cea3-4688-ae6c-05520a9ac2aa","Type":"ContainerStarted","Data":"965230fd32da76bc92079829a886b1ba835ee886bc26f243c60a401dd1cba2a4"} Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.962881 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55w98" event={"ID":"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa","Type":"ContainerDied","Data":"e2356328ff7de8c147ca517657ee980b6f970ba9f6c370539c717bddc162a7de"} Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.962896 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55w98" event={"ID":"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa","Type":"ContainerStarted","Data":"385c48a7c5342455716b6224881583255dad67f2d82585683dd3d1292aba61c4"} Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.988989 4878 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.989916 4878 generic.go:334] "Generic (PLEG): container finished" podID="e80c5f56-57a3-4778-8382-473cd7678252" containerID="e0b035add7241ae6198680289ec4704519f33d9125d60f3c131feb0dab6f0127" exitCode=0 Dec 02 18:17:24 crc kubenswrapper[4878]: I1202 18:17:24.990011 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwf9n" event={"ID":"e80c5f56-57a3-4778-8382-473cd7678252","Type":"ContainerDied","Data":"e0b035add7241ae6198680289ec4704519f33d9125d60f3c131feb0dab6f0127"} Dec 02 18:17:25 crc kubenswrapper[4878]: W1202 18:17:25.014074 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod763bb008_97f2_4e90_965c_5a7537ff0a57.slice/crio-a85ac8421623cf1d1a1db5e7703967e79e571b64778d23258fe46e657d6e5913 WatchSource:0}: Error finding container a85ac8421623cf1d1a1db5e7703967e79e571b64778d23258fe46e657d6e5913: Status 404 returned error can't find the 
container with id a85ac8421623cf1d1a1db5e7703967e79e571b64778d23258fe46e657d6e5913 Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.057688 4878 generic.go:334] "Generic (PLEG): container finished" podID="bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd" containerID="ce82bc6c05be0819e6dc9a4e46a74988b85b735ddc8c7ceaf1c6cd0416b876cd" exitCode=0 Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.057817 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdp4z" event={"ID":"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd","Type":"ContainerDied","Data":"ce82bc6c05be0819e6dc9a4e46a74988b85b735ddc8c7ceaf1c6cd0416b876cd"} Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.057849 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdp4z" event={"ID":"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd","Type":"ContainerStarted","Data":"2b79cd772e2a4756ac0cfc5cf91f82f4d3c8d4e0155d2538c51346b02752ab1a"} Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.087917 4878 patch_prober.go:28] interesting pod/downloads-7954f5f757-zbpll container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.087990 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zbpll" podUID="44e99cf7-e103-451a-91cc-0d610e4190a9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.088058 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8e9355731ffce0d4f724c32d8d8ff0bd6d1ff539de5949f48fdd1325797d5b70"} Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.088118 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"84ca77cdf60ac616aa4c784c2b2a4a7ba847d2d938c8d027baf79466ba565cf8"} Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.088135 4878 patch_prober.go:28] interesting pod/downloads-7954f5f757-zbpll container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.088187 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zbpll" podUID="44e99cf7-e103-451a-91cc-0d610e4190a9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.088360 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.100622 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ba365f975b6f739a582de3f6915b5f4469f62a3ebea21ad9fceecd4020041e07"} Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.118907 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hp2k"] Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.124959 4878 generic.go:334] "Generic (PLEG): 
container finished" podID="75d2e014-a578-4394-969c-109c2a260296" containerID="ea5183a479858660c55e92b6da435fab977b3b0d13347558f4f87daa740bbeb2" exitCode=0 Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.125818 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb7zn" event={"ID":"75d2e014-a578-4394-969c-109c2a260296","Type":"ContainerDied","Data":"ea5183a479858660c55e92b6da435fab977b3b0d13347558f4f87daa740bbeb2"} Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.125842 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb7zn" event={"ID":"75d2e014-a578-4394-969c-109c2a260296","Type":"ContainerStarted","Data":"eeb30a1d45928d1454cf44ddbe0e664f63a38a8827e69fc5f2d37e1e94277cd3"} Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.195831 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.195870 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.201682 4878 patch_prober.go:28] interesting pod/console-f9d7485db-w6mf2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.201736 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-w6mf2" podUID="94d6b1b1-ad1b-45c3-9947-92345aa1e5a2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.331226 4878 patch_prober.go:28] interesting pod/apiserver-76f77b778f-ffktl 
container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 02 18:17:25 crc kubenswrapper[4878]: [+]log ok Dec 02 18:17:25 crc kubenswrapper[4878]: [+]etcd ok Dec 02 18:17:25 crc kubenswrapper[4878]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 02 18:17:25 crc kubenswrapper[4878]: [+]poststarthook/generic-apiserver-start-informers ok Dec 02 18:17:25 crc kubenswrapper[4878]: [+]poststarthook/max-in-flight-filter ok Dec 02 18:17:25 crc kubenswrapper[4878]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 02 18:17:25 crc kubenswrapper[4878]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 02 18:17:25 crc kubenswrapper[4878]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 02 18:17:25 crc kubenswrapper[4878]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 02 18:17:25 crc kubenswrapper[4878]: [+]poststarthook/project.openshift.io-projectcache ok Dec 02 18:17:25 crc kubenswrapper[4878]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 02 18:17:25 crc kubenswrapper[4878]: [+]poststarthook/openshift.io-startinformers ok Dec 02 18:17:25 crc kubenswrapper[4878]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 02 18:17:25 crc kubenswrapper[4878]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 02 18:17:25 crc kubenswrapper[4878]: livez check failed Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.331638 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-ffktl" podUID="6951ee33-1345-4ae2-906d-fdc7cea4dc64" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.404266 4878 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-v2c8h"] Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.405754 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v2c8h" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.408774 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.423011 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v2c8h"] Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.522794 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqbgq\" (UniqueName: \"kubernetes.io/projected/e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8-kube-api-access-sqbgq\") pod \"redhat-operators-v2c8h\" (UID: \"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8\") " pod="openshift-marketplace/redhat-operators-v2c8h" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.522984 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8-utilities\") pod \"redhat-operators-v2c8h\" (UID: \"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8\") " pod="openshift-marketplace/redhat-operators-v2c8h" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.523025 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8-catalog-content\") pod \"redhat-operators-v2c8h\" (UID: \"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8\") " pod="openshift-marketplace/redhat-operators-v2c8h" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.624417 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8-utilities\") pod \"redhat-operators-v2c8h\" (UID: \"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8\") " pod="openshift-marketplace/redhat-operators-v2c8h" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.624467 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8-catalog-content\") pod \"redhat-operators-v2c8h\" (UID: \"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8\") " pod="openshift-marketplace/redhat-operators-v2c8h" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.624527 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqbgq\" (UniqueName: \"kubernetes.io/projected/e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8-kube-api-access-sqbgq\") pod \"redhat-operators-v2c8h\" (UID: \"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8\") " pod="openshift-marketplace/redhat-operators-v2c8h" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.625080 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8-utilities\") pod \"redhat-operators-v2c8h\" (UID: \"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8\") " pod="openshift-marketplace/redhat-operators-v2c8h" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.625117 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8-catalog-content\") pod \"redhat-operators-v2c8h\" (UID: \"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8\") " pod="openshift-marketplace/redhat-operators-v2c8h" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.628810 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qprtc" Dec 02 18:17:25 crc 
kubenswrapper[4878]: I1202 18:17:25.641846 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.648323 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:25 crc kubenswrapper[4878]: [-]has-synced failed: reason withheld Dec 02 18:17:25 crc kubenswrapper[4878]: [+]process-running ok Dec 02 18:17:25 crc kubenswrapper[4878]: healthz check failed Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.648381 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.654554 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqbgq\" (UniqueName: \"kubernetes.io/projected/e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8-kube-api-access-sqbgq\") pod \"redhat-operators-v2c8h\" (UID: \"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8\") " pod="openshift-marketplace/redhat-operators-v2c8h" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.776396 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v2c8h" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.808121 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n8qts"] Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.809453 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n8qts" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.824964 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8qts"] Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.930751 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df758941-afd5-4770-b93b-001f267dfcbf-utilities\") pod \"redhat-operators-n8qts\" (UID: \"df758941-afd5-4770-b93b-001f267dfcbf\") " pod="openshift-marketplace/redhat-operators-n8qts" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.930808 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2fqc\" (UniqueName: \"kubernetes.io/projected/df758941-afd5-4770-b93b-001f267dfcbf-kube-api-access-t2fqc\") pod \"redhat-operators-n8qts\" (UID: \"df758941-afd5-4770-b93b-001f267dfcbf\") " pod="openshift-marketplace/redhat-operators-n8qts" Dec 02 18:17:25 crc kubenswrapper[4878]: I1202 18:17:25.930840 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df758941-afd5-4770-b93b-001f267dfcbf-catalog-content\") pod \"redhat-operators-n8qts\" (UID: \"df758941-afd5-4770-b93b-001f267dfcbf\") " pod="openshift-marketplace/redhat-operators-n8qts" Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.010165 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.010541 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.023995 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.032554 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df758941-afd5-4770-b93b-001f267dfcbf-utilities\") pod \"redhat-operators-n8qts\" (UID: \"df758941-afd5-4770-b93b-001f267dfcbf\") " pod="openshift-marketplace/redhat-operators-n8qts" Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.032629 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2fqc\" (UniqueName: \"kubernetes.io/projected/df758941-afd5-4770-b93b-001f267dfcbf-kube-api-access-t2fqc\") pod \"redhat-operators-n8qts\" (UID: \"df758941-afd5-4770-b93b-001f267dfcbf\") " pod="openshift-marketplace/redhat-operators-n8qts" Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.032824 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df758941-afd5-4770-b93b-001f267dfcbf-catalog-content\") pod \"redhat-operators-n8qts\" (UID: \"df758941-afd5-4770-b93b-001f267dfcbf\") " pod="openshift-marketplace/redhat-operators-n8qts" Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.035100 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df758941-afd5-4770-b93b-001f267dfcbf-catalog-content\") pod \"redhat-operators-n8qts\" (UID: \"df758941-afd5-4770-b93b-001f267dfcbf\") " pod="openshift-marketplace/redhat-operators-n8qts" Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.035525 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df758941-afd5-4770-b93b-001f267dfcbf-utilities\") pod \"redhat-operators-n8qts\" (UID: \"df758941-afd5-4770-b93b-001f267dfcbf\") " 
pod="openshift-marketplace/redhat-operators-n8qts" Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.093972 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2fqc\" (UniqueName: \"kubernetes.io/projected/df758941-afd5-4770-b93b-001f267dfcbf-kube-api-access-t2fqc\") pod \"redhat-operators-n8qts\" (UID: \"df758941-afd5-4770-b93b-001f267dfcbf\") " pod="openshift-marketplace/redhat-operators-n8qts" Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.133215 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" event={"ID":"763bb008-97f2-4e90-965c-5a7537ff0a57","Type":"ContainerStarted","Data":"0603c161c7fddd3c29dd6a0837fcba0d259686aec3939faf5f457423b61fcd84"} Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.133314 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" event={"ID":"763bb008-97f2-4e90-965c-5a7537ff0a57","Type":"ContainerStarted","Data":"a85ac8421623cf1d1a1db5e7703967e79e571b64778d23258fe46e657d6e5913"} Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.134776 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.139340 4878 generic.go:334] "Generic (PLEG): container finished" podID="e46e6c94-cea3-4688-ae6c-05520a9ac2aa" containerID="dd23c8db1e91265fef13a3d9eab38fc735c26b57945eda7346a9631823b93782" exitCode=0 Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.139622 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e46e6c94-cea3-4688-ae6c-05520a9ac2aa","Type":"ContainerDied","Data":"dd23c8db1e91265fef13a3d9eab38fc735c26b57945eda7346a9631823b93782"} Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.144595 4878 generic.go:334] "Generic (PLEG): 
container finished" podID="c3d67205-fea8-475d-b3da-bd4fc55a58c4" containerID="11ad334364d0957b94c3922ca1eb918aabbc6c1c2a869bfa05d4a7516b1ce4c3" exitCode=0 Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.144897 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7m7gk" event={"ID":"c3d67205-fea8-475d-b3da-bd4fc55a58c4","Type":"ContainerDied","Data":"11ad334364d0957b94c3922ca1eb918aabbc6c1c2a869bfa05d4a7516b1ce4c3"} Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.144946 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7m7gk" event={"ID":"c3d67205-fea8-475d-b3da-bd4fc55a58c4","Type":"ContainerStarted","Data":"0c6147f198d32f8567d7e4397b90137ee1c7d4ebc6a7ef03c22b1b76f33659b2"} Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.148721 4878 generic.go:334] "Generic (PLEG): container finished" podID="c44c93ec-f58d-410e-8d64-1888c470cffe" containerID="51d6562bb9b60008a914dab28a9f77739ce9cc88c8027bcf2104a26e3628d92c" exitCode=0 Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.149275 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hp2k" event={"ID":"c44c93ec-f58d-410e-8d64-1888c470cffe","Type":"ContainerDied","Data":"51d6562bb9b60008a914dab28a9f77739ce9cc88c8027bcf2104a26e3628d92c"} Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.149313 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hp2k" event={"ID":"c44c93ec-f58d-410e-8d64-1888c470cffe","Type":"ContainerStarted","Data":"a44fe74f386affb8d095ecfaf04695f4b43dcf5e6cc12ad6205b72d90aaeac94"} Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.158429 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n8qts" Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.166026 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fvb6t" Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.169755 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" podStartSLOduration=132.169734116 podStartE2EDuration="2m12.169734116s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:17:26.164820208 +0000 UTC m=+155.854439089" watchObservedRunningTime="2025-12-02 18:17:26.169734116 +0000 UTC m=+155.859352997" Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.453147 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v2c8h"] Dec 02 18:17:26 crc kubenswrapper[4878]: W1202 18:17:26.500656 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2f6bebd_ceea_4c7a_8b33_4b8c7aa6b1a8.slice/crio-fcf0059d7f7087a35f5e3588f1d976ff4863cf53bb43045ad4d24ccb32a2b3e7 WatchSource:0}: Error finding container fcf0059d7f7087a35f5e3588f1d976ff4863cf53bb43045ad4d24ccb32a2b3e7: Status 404 returned error can't find the container with id fcf0059d7f7087a35f5e3588f1d976ff4863cf53bb43045ad4d24ccb32a2b3e7 Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.656888 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:26 crc kubenswrapper[4878]: [-]has-synced failed: reason withheld Dec 02 18:17:26 crc 
kubenswrapper[4878]: [+]process-running ok Dec 02 18:17:26 crc kubenswrapper[4878]: healthz check failed Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.657353 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:26 crc kubenswrapper[4878]: I1202 18:17:26.824045 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8qts"] Dec 02 18:17:27 crc kubenswrapper[4878]: I1202 18:17:27.183008 4878 generic.go:334] "Generic (PLEG): container finished" podID="e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8" containerID="27e28c5e8a69fe690b429046479f38c7c2cc65a05c91689380e4961c4582c061" exitCode=0 Dec 02 18:17:27 crc kubenswrapper[4878]: I1202 18:17:27.183220 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2c8h" event={"ID":"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8","Type":"ContainerDied","Data":"27e28c5e8a69fe690b429046479f38c7c2cc65a05c91689380e4961c4582c061"} Dec 02 18:17:27 crc kubenswrapper[4878]: I1202 18:17:27.183478 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2c8h" event={"ID":"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8","Type":"ContainerStarted","Data":"fcf0059d7f7087a35f5e3588f1d976ff4863cf53bb43045ad4d24ccb32a2b3e7"} Dec 02 18:17:27 crc kubenswrapper[4878]: I1202 18:17:27.207649 4878 generic.go:334] "Generic (PLEG): container finished" podID="df758941-afd5-4770-b93b-001f267dfcbf" containerID="bee773e0670e518819e6933926ac0b4e8750db8f65f29e208f0f69010055d7d7" exitCode=0 Dec 02 18:17:27 crc kubenswrapper[4878]: I1202 18:17:27.207772 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8qts" 
event={"ID":"df758941-afd5-4770-b93b-001f267dfcbf","Type":"ContainerDied","Data":"bee773e0670e518819e6933926ac0b4e8750db8f65f29e208f0f69010055d7d7"} Dec 02 18:17:27 crc kubenswrapper[4878]: I1202 18:17:27.207823 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8qts" event={"ID":"df758941-afd5-4770-b93b-001f267dfcbf","Type":"ContainerStarted","Data":"a4c02bdd65ccbbf52073074f7ae9d9a27e1750aacf108d8c065569008564009b"} Dec 02 18:17:27 crc kubenswrapper[4878]: I1202 18:17:27.229666 4878 generic.go:334] "Generic (PLEG): container finished" podID="298c0078-6d02-4963-a1fb-0f6713e6d369" containerID="3225532b534ec187daab9b2536a58912dece1a41bb317aa0b7d22fc707ce2393" exitCode=0 Dec 02 18:17:27 crc kubenswrapper[4878]: I1202 18:17:27.229878 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf" event={"ID":"298c0078-6d02-4963-a1fb-0f6713e6d369","Type":"ContainerDied","Data":"3225532b534ec187daab9b2536a58912dece1a41bb317aa0b7d22fc707ce2393"} Dec 02 18:17:27 crc kubenswrapper[4878]: I1202 18:17:27.607765 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 18:17:27 crc kubenswrapper[4878]: I1202 18:17:27.646349 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:27 crc kubenswrapper[4878]: [-]has-synced failed: reason withheld Dec 02 18:17:27 crc kubenswrapper[4878]: [+]process-running ok Dec 02 18:17:27 crc kubenswrapper[4878]: healthz check failed Dec 02 18:17:27 crc kubenswrapper[4878]: I1202 18:17:27.646476 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:27 crc kubenswrapper[4878]: I1202 18:17:27.678359 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e46e6c94-cea3-4688-ae6c-05520a9ac2aa-kube-api-access\") pod \"e46e6c94-cea3-4688-ae6c-05520a9ac2aa\" (UID: \"e46e6c94-cea3-4688-ae6c-05520a9ac2aa\") " Dec 02 18:17:27 crc kubenswrapper[4878]: I1202 18:17:27.678510 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e46e6c94-cea3-4688-ae6c-05520a9ac2aa-kubelet-dir\") pod \"e46e6c94-cea3-4688-ae6c-05520a9ac2aa\" (UID: \"e46e6c94-cea3-4688-ae6c-05520a9ac2aa\") " Dec 02 18:17:27 crc kubenswrapper[4878]: I1202 18:17:27.680076 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e46e6c94-cea3-4688-ae6c-05520a9ac2aa-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e46e6c94-cea3-4688-ae6c-05520a9ac2aa" (UID: "e46e6c94-cea3-4688-ae6c-05520a9ac2aa"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:17:27 crc kubenswrapper[4878]: I1202 18:17:27.682134 4878 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e46e6c94-cea3-4688-ae6c-05520a9ac2aa-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 18:17:27 crc kubenswrapper[4878]: I1202 18:17:27.698502 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e46e6c94-cea3-4688-ae6c-05520a9ac2aa-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e46e6c94-cea3-4688-ae6c-05520a9ac2aa" (UID: "e46e6c94-cea3-4688-ae6c-05520a9ac2aa"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:17:27 crc kubenswrapper[4878]: I1202 18:17:27.784467 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e46e6c94-cea3-4688-ae6c-05520a9ac2aa-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.251622 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.251723 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e46e6c94-cea3-4688-ae6c-05520a9ac2aa","Type":"ContainerDied","Data":"965230fd32da76bc92079829a886b1ba835ee886bc26f243c60a401dd1cba2a4"} Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.251830 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="965230fd32da76bc92079829a886b1ba835ee886bc26f243c60a401dd1cba2a4" Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.646380 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:28 crc kubenswrapper[4878]: [-]has-synced failed: reason withheld Dec 02 18:17:28 crc kubenswrapper[4878]: [+]process-running ok Dec 02 18:17:28 crc kubenswrapper[4878]: healthz check failed Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.646883 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.728422 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf" Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.814712 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/298c0078-6d02-4963-a1fb-0f6713e6d369-config-volume\") pod \"298c0078-6d02-4963-a1fb-0f6713e6d369\" (UID: \"298c0078-6d02-4963-a1fb-0f6713e6d369\") " Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.814817 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4ht8\" (UniqueName: \"kubernetes.io/projected/298c0078-6d02-4963-a1fb-0f6713e6d369-kube-api-access-h4ht8\") pod \"298c0078-6d02-4963-a1fb-0f6713e6d369\" (UID: \"298c0078-6d02-4963-a1fb-0f6713e6d369\") " Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.814961 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/298c0078-6d02-4963-a1fb-0f6713e6d369-secret-volume\") pod \"298c0078-6d02-4963-a1fb-0f6713e6d369\" (UID: \"298c0078-6d02-4963-a1fb-0f6713e6d369\") " Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.815964 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/298c0078-6d02-4963-a1fb-0f6713e6d369-config-volume" (OuterVolumeSpecName: "config-volume") pod "298c0078-6d02-4963-a1fb-0f6713e6d369" (UID: "298c0078-6d02-4963-a1fb-0f6713e6d369"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.822058 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/298c0078-6d02-4963-a1fb-0f6713e6d369-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "298c0078-6d02-4963-a1fb-0f6713e6d369" (UID: "298c0078-6d02-4963-a1fb-0f6713e6d369"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.822891 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/298c0078-6d02-4963-a1fb-0f6713e6d369-kube-api-access-h4ht8" (OuterVolumeSpecName: "kube-api-access-h4ht8") pod "298c0078-6d02-4963-a1fb-0f6713e6d369" (UID: "298c0078-6d02-4963-a1fb-0f6713e6d369"). InnerVolumeSpecName "kube-api-access-h4ht8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.881896 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 18:17:28 crc kubenswrapper[4878]: E1202 18:17:28.882873 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e46e6c94-cea3-4688-ae6c-05520a9ac2aa" containerName="pruner" Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.883006 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="e46e6c94-cea3-4688-ae6c-05520a9ac2aa" containerName="pruner" Dec 02 18:17:28 crc kubenswrapper[4878]: E1202 18:17:28.883076 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="298c0078-6d02-4963-a1fb-0f6713e6d369" containerName="collect-profiles" Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.883163 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="298c0078-6d02-4963-a1fb-0f6713e6d369" containerName="collect-profiles" Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.883450 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="298c0078-6d02-4963-a1fb-0f6713e6d369" containerName="collect-profiles" Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.883570 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="e46e6c94-cea3-4688-ae6c-05520a9ac2aa" containerName="pruner" Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.884737 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.889937 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.890323 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.911850 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.916331 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4ht8\" (UniqueName: \"kubernetes.io/projected/298c0078-6d02-4963-a1fb-0f6713e6d369-kube-api-access-h4ht8\") on node \"crc\" DevicePath \"\"" Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.916356 4878 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/298c0078-6d02-4963-a1fb-0f6713e6d369-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 18:17:28 crc kubenswrapper[4878]: I1202 18:17:28.916366 4878 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/298c0078-6d02-4963-a1fb-0f6713e6d369-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 18:17:29 crc kubenswrapper[4878]: I1202 18:17:29.017661 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23837c67-a408-45bd-a0f2-141fd31ddc77-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"23837c67-a408-45bd-a0f2-141fd31ddc77\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 18:17:29 crc kubenswrapper[4878]: I1202 18:17:29.017737 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23837c67-a408-45bd-a0f2-141fd31ddc77-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"23837c67-a408-45bd-a0f2-141fd31ddc77\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 18:17:29 crc kubenswrapper[4878]: I1202 18:17:29.119320 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23837c67-a408-45bd-a0f2-141fd31ddc77-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"23837c67-a408-45bd-a0f2-141fd31ddc77\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 18:17:29 crc kubenswrapper[4878]: I1202 18:17:29.119396 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23837c67-a408-45bd-a0f2-141fd31ddc77-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"23837c67-a408-45bd-a0f2-141fd31ddc77\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 18:17:29 crc kubenswrapper[4878]: I1202 18:17:29.119465 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23837c67-a408-45bd-a0f2-141fd31ddc77-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"23837c67-a408-45bd-a0f2-141fd31ddc77\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 18:17:29 crc kubenswrapper[4878]: I1202 18:17:29.140471 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23837c67-a408-45bd-a0f2-141fd31ddc77-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"23837c67-a408-45bd-a0f2-141fd31ddc77\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 18:17:29 crc kubenswrapper[4878]: I1202 18:17:29.229948 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 18:17:29 crc kubenswrapper[4878]: I1202 18:17:29.327161 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf" event={"ID":"298c0078-6d02-4963-a1fb-0f6713e6d369","Type":"ContainerDied","Data":"49974d191188131dd988f2557e822cab1bd73b46372259d9a025ac0ac999b39b"} Dec 02 18:17:29 crc kubenswrapper[4878]: I1202 18:17:29.327214 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49974d191188131dd988f2557e822cab1bd73b46372259d9a025ac0ac999b39b" Dec 02 18:17:29 crc kubenswrapper[4878]: I1202 18:17:29.327261 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf" Dec 02 18:17:29 crc kubenswrapper[4878]: I1202 18:17:29.599596 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 18:17:29 crc kubenswrapper[4878]: W1202 18:17:29.625163 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod23837c67_a408_45bd_a0f2_141fd31ddc77.slice/crio-d020bbd7410d0408d4211a22f542004187e23c9f446cb60800e3acdd60ec70bf WatchSource:0}: Error finding container d020bbd7410d0408d4211a22f542004187e23c9f446cb60800e3acdd60ec70bf: Status 404 returned error can't find the container with id d020bbd7410d0408d4211a22f542004187e23c9f446cb60800e3acdd60ec70bf Dec 02 18:17:29 crc kubenswrapper[4878]: I1202 18:17:29.648046 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:29 crc kubenswrapper[4878]: [-]has-synced failed: reason withheld Dec 02 18:17:29 crc kubenswrapper[4878]: [+]process-running ok Dec 02 
18:17:29 crc kubenswrapper[4878]: healthz check failed Dec 02 18:17:29 crc kubenswrapper[4878]: I1202 18:17:29.648121 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:30 crc kubenswrapper[4878]: I1202 18:17:30.318487 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:30 crc kubenswrapper[4878]: I1202 18:17:30.324026 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-ffktl" Dec 02 18:17:30 crc kubenswrapper[4878]: I1202 18:17:30.337797 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"23837c67-a408-45bd-a0f2-141fd31ddc77","Type":"ContainerStarted","Data":"d020bbd7410d0408d4211a22f542004187e23c9f446cb60800e3acdd60ec70bf"} Dec 02 18:17:30 crc kubenswrapper[4878]: I1202 18:17:30.648671 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:30 crc kubenswrapper[4878]: [-]has-synced failed: reason withheld Dec 02 18:17:30 crc kubenswrapper[4878]: [+]process-running ok Dec 02 18:17:30 crc kubenswrapper[4878]: healthz check failed Dec 02 18:17:30 crc kubenswrapper[4878]: I1202 18:17:30.648737 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:30 crc kubenswrapper[4878]: I1202 18:17:30.749075 4878 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dmpvh" Dec 02 18:17:31 crc kubenswrapper[4878]: I1202 18:17:31.385543 4878 generic.go:334] "Generic (PLEG): container finished" podID="23837c67-a408-45bd-a0f2-141fd31ddc77" containerID="34abeb4c5d58e5cb1989e0f14fb110e0072dd14bbd0d1083130a0fb936e3c931" exitCode=0 Dec 02 18:17:31 crc kubenswrapper[4878]: I1202 18:17:31.385595 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"23837c67-a408-45bd-a0f2-141fd31ddc77","Type":"ContainerDied","Data":"34abeb4c5d58e5cb1989e0f14fb110e0072dd14bbd0d1083130a0fb936e3c931"} Dec 02 18:17:31 crc kubenswrapper[4878]: I1202 18:17:31.645906 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:31 crc kubenswrapper[4878]: [-]has-synced failed: reason withheld Dec 02 18:17:31 crc kubenswrapper[4878]: [+]process-running ok Dec 02 18:17:31 crc kubenswrapper[4878]: healthz check failed Dec 02 18:17:31 crc kubenswrapper[4878]: I1202 18:17:31.645994 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:32 crc kubenswrapper[4878]: I1202 18:17:32.645822 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:32 crc kubenswrapper[4878]: [-]has-synced failed: reason withheld Dec 02 18:17:32 crc kubenswrapper[4878]: [+]process-running ok Dec 02 18:17:32 crc kubenswrapper[4878]: healthz check 
failed Dec 02 18:17:32 crc kubenswrapper[4878]: I1202 18:17:32.646417 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:33 crc kubenswrapper[4878]: I1202 18:17:33.645197 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:33 crc kubenswrapper[4878]: [-]has-synced failed: reason withheld Dec 02 18:17:33 crc kubenswrapper[4878]: [+]process-running ok Dec 02 18:17:33 crc kubenswrapper[4878]: healthz check failed Dec 02 18:17:33 crc kubenswrapper[4878]: I1202 18:17:33.645295 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:34 crc kubenswrapper[4878]: I1202 18:17:34.644828 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:34 crc kubenswrapper[4878]: [-]has-synced failed: reason withheld Dec 02 18:17:34 crc kubenswrapper[4878]: [+]process-running ok Dec 02 18:17:34 crc kubenswrapper[4878]: healthz check failed Dec 02 18:17:34 crc kubenswrapper[4878]: I1202 18:17:34.644930 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:35 crc 
kubenswrapper[4878]: I1202 18:17:35.097461 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-zbpll" Dec 02 18:17:35 crc kubenswrapper[4878]: I1202 18:17:35.195602 4878 patch_prober.go:28] interesting pod/console-f9d7485db-w6mf2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Dec 02 18:17:35 crc kubenswrapper[4878]: I1202 18:17:35.195661 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-w6mf2" podUID="94d6b1b1-ad1b-45c3-9947-92345aa1e5a2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" Dec 02 18:17:35 crc kubenswrapper[4878]: I1202 18:17:35.647835 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:35 crc kubenswrapper[4878]: [-]has-synced failed: reason withheld Dec 02 18:17:35 crc kubenswrapper[4878]: [+]process-running ok Dec 02 18:17:35 crc kubenswrapper[4878]: healthz check failed Dec 02 18:17:35 crc kubenswrapper[4878]: I1202 18:17:35.648395 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:36 crc kubenswrapper[4878]: I1202 18:17:36.620451 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:17:36 crc kubenswrapper[4878]: I1202 18:17:36.644809 4878 patch_prober.go:28] interesting 
pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:36 crc kubenswrapper[4878]: [-]has-synced failed: reason withheld Dec 02 18:17:36 crc kubenswrapper[4878]: [+]process-running ok Dec 02 18:17:36 crc kubenswrapper[4878]: healthz check failed Dec 02 18:17:36 crc kubenswrapper[4878]: I1202 18:17:36.644883 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:37 crc kubenswrapper[4878]: I1202 18:17:37.472139 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs\") pod \"network-metrics-daemon-dlwt8\" (UID: \"09adc15b-14dd-4a05-b569-4168b9ced169\") " pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:17:37 crc kubenswrapper[4878]: I1202 18:17:37.493470 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09adc15b-14dd-4a05-b569-4168b9ced169-metrics-certs\") pod \"network-metrics-daemon-dlwt8\" (UID: \"09adc15b-14dd-4a05-b569-4168b9ced169\") " pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:17:37 crc kubenswrapper[4878]: I1202 18:17:37.645375 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:37 crc kubenswrapper[4878]: [-]has-synced failed: reason withheld Dec 02 18:17:37 crc kubenswrapper[4878]: [+]process-running ok Dec 02 18:17:37 crc kubenswrapper[4878]: 
healthz check failed Dec 02 18:17:37 crc kubenswrapper[4878]: I1202 18:17:37.645446 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:37 crc kubenswrapper[4878]: I1202 18:17:37.651832 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlwt8" Dec 02 18:17:38 crc kubenswrapper[4878]: I1202 18:17:38.694794 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:38 crc kubenswrapper[4878]: [-]has-synced failed: reason withheld Dec 02 18:17:38 crc kubenswrapper[4878]: [+]process-running ok Dec 02 18:17:38 crc kubenswrapper[4878]: healthz check failed Dec 02 18:17:38 crc kubenswrapper[4878]: I1202 18:17:38.695792 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:39 crc kubenswrapper[4878]: I1202 18:17:39.644479 4878 patch_prober.go:28] interesting pod/router-default-5444994796-28qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 18:17:39 crc kubenswrapper[4878]: [+]has-synced ok Dec 02 18:17:39 crc kubenswrapper[4878]: [+]process-running ok Dec 02 18:17:39 crc kubenswrapper[4878]: healthz check failed Dec 02 18:17:39 crc kubenswrapper[4878]: I1202 18:17:39.644553 4878 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-28qqz" podUID="4c0a71e8-b6e5-44c8-a813-9177684ab97e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 18:17:40 crc kubenswrapper[4878]: I1202 18:17:40.646169 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:40 crc kubenswrapper[4878]: I1202 18:17:40.649631 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-28qqz" Dec 02 18:17:40 crc kubenswrapper[4878]: I1202 18:17:40.921297 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 18:17:41 crc kubenswrapper[4878]: I1202 18:17:41.028475 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23837c67-a408-45bd-a0f2-141fd31ddc77-kubelet-dir\") pod \"23837c67-a408-45bd-a0f2-141fd31ddc77\" (UID: \"23837c67-a408-45bd-a0f2-141fd31ddc77\") " Dec 02 18:17:41 crc kubenswrapper[4878]: I1202 18:17:41.028548 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23837c67-a408-45bd-a0f2-141fd31ddc77-kube-api-access\") pod \"23837c67-a408-45bd-a0f2-141fd31ddc77\" (UID: \"23837c67-a408-45bd-a0f2-141fd31ddc77\") " Dec 02 18:17:41 crc kubenswrapper[4878]: I1202 18:17:41.028617 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23837c67-a408-45bd-a0f2-141fd31ddc77-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "23837c67-a408-45bd-a0f2-141fd31ddc77" (UID: "23837c67-a408-45bd-a0f2-141fd31ddc77"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:17:41 crc kubenswrapper[4878]: I1202 18:17:41.028844 4878 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23837c67-a408-45bd-a0f2-141fd31ddc77-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 18:17:41 crc kubenswrapper[4878]: I1202 18:17:41.047069 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23837c67-a408-45bd-a0f2-141fd31ddc77-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "23837c67-a408-45bd-a0f2-141fd31ddc77" (UID: "23837c67-a408-45bd-a0f2-141fd31ddc77"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:17:41 crc kubenswrapper[4878]: I1202 18:17:41.130559 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23837c67-a408-45bd-a0f2-141fd31ddc77-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 18:17:41 crc kubenswrapper[4878]: I1202 18:17:41.454332 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 18:17:41 crc kubenswrapper[4878]: I1202 18:17:41.454358 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"23837c67-a408-45bd-a0f2-141fd31ddc77","Type":"ContainerDied","Data":"d020bbd7410d0408d4211a22f542004187e23c9f446cb60800e3acdd60ec70bf"} Dec 02 18:17:41 crc kubenswrapper[4878]: I1202 18:17:41.454441 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d020bbd7410d0408d4211a22f542004187e23c9f446cb60800e3acdd60ec70bf" Dec 02 18:17:44 crc kubenswrapper[4878]: I1202 18:17:44.189622 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:17:45 crc kubenswrapper[4878]: I1202 18:17:45.200383 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:45 crc kubenswrapper[4878]: I1202 18:17:45.205124 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:17:53 crc kubenswrapper[4878]: I1202 18:17:53.742838 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:17:53 crc kubenswrapper[4878]: I1202 18:17:53.743729 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:17:55 crc kubenswrapper[4878]: I1202 18:17:55.616742 
4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-smztw" Dec 02 18:18:00 crc kubenswrapper[4878]: E1202 18:18:00.382457 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 02 18:18:00 crc kubenswrapper[4878]: E1202 18:18:00.382975 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rw6dd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fallback
ToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-7m7gk_openshift-marketplace(c3d67205-fea8-475d-b3da-bd4fc55a58c4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 18:18:00 crc kubenswrapper[4878]: E1202 18:18:00.384373 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-7m7gk" podUID="c3d67205-fea8-475d-b3da-bd4fc55a58c4" Dec 02 18:18:02 crc kubenswrapper[4878]: I1202 18:18:02.076687 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 18:18:02 crc kubenswrapper[4878]: E1202 18:18:02.077498 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23837c67-a408-45bd-a0f2-141fd31ddc77" containerName="pruner" Dec 02 18:18:02 crc kubenswrapper[4878]: I1202 18:18:02.077516 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="23837c67-a408-45bd-a0f2-141fd31ddc77" containerName="pruner" Dec 02 18:18:02 crc kubenswrapper[4878]: I1202 18:18:02.077657 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="23837c67-a408-45bd-a0f2-141fd31ddc77" containerName="pruner" Dec 02 18:18:02 crc kubenswrapper[4878]: I1202 18:18:02.078205 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 18:18:02 crc kubenswrapper[4878]: I1202 18:18:02.081276 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 02 18:18:02 crc kubenswrapper[4878]: I1202 18:18:02.084400 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 02 18:18:02 crc kubenswrapper[4878]: I1202 18:18:02.089630 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 18:18:02 crc kubenswrapper[4878]: I1202 18:18:02.173331 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c410f69d-3f96-47f4-bfa9-ca8728f17056-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c410f69d-3f96-47f4-bfa9-ca8728f17056\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 18:18:02 crc kubenswrapper[4878]: I1202 18:18:02.173377 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c410f69d-3f96-47f4-bfa9-ca8728f17056-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c410f69d-3f96-47f4-bfa9-ca8728f17056\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 18:18:02 crc kubenswrapper[4878]: I1202 18:18:02.274944 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c410f69d-3f96-47f4-bfa9-ca8728f17056-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c410f69d-3f96-47f4-bfa9-ca8728f17056\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 18:18:02 crc kubenswrapper[4878]: I1202 18:18:02.275009 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/c410f69d-3f96-47f4-bfa9-ca8728f17056-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c410f69d-3f96-47f4-bfa9-ca8728f17056\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 18:18:02 crc kubenswrapper[4878]: I1202 18:18:02.275185 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c410f69d-3f96-47f4-bfa9-ca8728f17056-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c410f69d-3f96-47f4-bfa9-ca8728f17056\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 18:18:02 crc kubenswrapper[4878]: I1202 18:18:02.281160 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 18:18:02 crc kubenswrapper[4878]: I1202 18:18:02.299067 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c410f69d-3f96-47f4-bfa9-ca8728f17056-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c410f69d-3f96-47f4-bfa9-ca8728f17056\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 18:18:02 crc kubenswrapper[4878]: I1202 18:18:02.406438 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 18:18:04 crc kubenswrapper[4878]: E1202 18:18:04.067755 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-7m7gk" podUID="c3d67205-fea8-475d-b3da-bd4fc55a58c4" Dec 02 18:18:04 crc kubenswrapper[4878]: E1202 18:18:04.146442 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 02 18:18:04 crc kubenswrapper[4878]: E1202 18:18:04.146725 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2fqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-n8qts_openshift-marketplace(df758941-afd5-4770-b93b-001f267dfcbf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 18:18:04 crc kubenswrapper[4878]: E1202 18:18:04.147967 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-n8qts" podUID="df758941-afd5-4770-b93b-001f267dfcbf" Dec 02 18:18:05 crc 
kubenswrapper[4878]: E1202 18:18:05.738811 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-n8qts" podUID="df758941-afd5-4770-b93b-001f267dfcbf" Dec 02 18:18:05 crc kubenswrapper[4878]: E1202 18:18:05.843654 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 02 18:18:05 crc kubenswrapper[4878]: E1202 18:18:05.844227 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9wk5k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-5hp2k_openshift-marketplace(c44c93ec-f58d-410e-8d64-1888c470cffe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 18:18:05 crc kubenswrapper[4878]: E1202 18:18:05.845467 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5hp2k" podUID="c44c93ec-f58d-410e-8d64-1888c470cffe" Dec 02 18:18:05 crc 
kubenswrapper[4878]: E1202 18:18:05.854866 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 02 18:18:05 crc kubenswrapper[4878]: E1202 18:18:05.855048 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q44qr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-sb7zn_openshift-marketplace(75d2e014-a578-4394-969c-109c2a260296): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 18:18:05 crc kubenswrapper[4878]: E1202 18:18:05.856940 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-sb7zn" podUID="75d2e014-a578-4394-969c-109c2a260296" Dec 02 18:18:05 crc kubenswrapper[4878]: E1202 18:18:05.879623 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 02 18:18:05 crc kubenswrapper[4878]: E1202 18:18:05.879791 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sqbgq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-v2c8h_openshift-marketplace(e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 18:18:05 crc kubenswrapper[4878]: E1202 18:18:05.881148 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-v2c8h" podUID="e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8" Dec 02 18:18:05 crc 
kubenswrapper[4878]: E1202 18:18:05.900624 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 02 18:18:05 crc kubenswrapper[4878]: E1202 18:18:05.900804 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q77sh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-cdp4z_openshift-marketplace(bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 18:18:05 crc kubenswrapper[4878]: E1202 18:18:05.902015 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-cdp4z" podUID="bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd" Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.175375 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 18:18:06 crc kubenswrapper[4878]: W1202 18:18:06.184175 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc410f69d_3f96_47f4_bfa9_ca8728f17056.slice/crio-ed1c9d325269ca665a843d0af18779c4650167f3e156ad020c45ba7a80e252d5 WatchSource:0}: Error finding container ed1c9d325269ca665a843d0af18779c4650167f3e156ad020c45ba7a80e252d5: Status 404 returned error can't find the container with id ed1c9d325269ca665a843d0af18779c4650167f3e156ad020c45ba7a80e252d5 Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.258746 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dlwt8"] Dec 02 18:18:06 crc kubenswrapper[4878]: W1202 18:18:06.272436 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09adc15b_14dd_4a05_b569_4168b9ced169.slice/crio-169dc5699c2361f106d02e93df548872f9a484fa017ea152614bd9d3c7a4b304 WatchSource:0}: Error finding container 169dc5699c2361f106d02e93df548872f9a484fa017ea152614bd9d3c7a4b304: Status 404 returned error can't find the container with id 
169dc5699c2361f106d02e93df548872f9a484fa017ea152614bd9d3c7a4b304 Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.620141 4878 generic.go:334] "Generic (PLEG): container finished" podID="e80c5f56-57a3-4778-8382-473cd7678252" containerID="4ae69f9012dbb1ccdfb85cca33a1eeae8f73a5bdfd500d72b35c33099cc420ef" exitCode=0 Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.620220 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwf9n" event={"ID":"e80c5f56-57a3-4778-8382-473cd7678252","Type":"ContainerDied","Data":"4ae69f9012dbb1ccdfb85cca33a1eeae8f73a5bdfd500d72b35c33099cc420ef"} Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.625949 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dlwt8" event={"ID":"09adc15b-14dd-4a05-b569-4168b9ced169","Type":"ContainerStarted","Data":"08dee41a964019bd37e80f1621533ed0ca7a18c58c7c9a38a0e2fb7391112813"} Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.625993 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dlwt8" event={"ID":"09adc15b-14dd-4a05-b569-4168b9ced169","Type":"ContainerStarted","Data":"169dc5699c2361f106d02e93df548872f9a484fa017ea152614bd9d3c7a4b304"} Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.629387 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c410f69d-3f96-47f4-bfa9-ca8728f17056","Type":"ContainerStarted","Data":"fe85f6903f0a3bc8e15adf3c87070fb30919cca4390fa0bf15d430b4270b38ba"} Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.629413 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c410f69d-3f96-47f4-bfa9-ca8728f17056","Type":"ContainerStarted","Data":"ed1c9d325269ca665a843d0af18779c4650167f3e156ad020c45ba7a80e252d5"} Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.632067 4878 
generic.go:334] "Generic (PLEG): container finished" podID="85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa" containerID="ca817fa8951363559fb03159314f1480b01d908b23e201c75b62c497088caa3f" exitCode=0 Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.632128 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55w98" event={"ID":"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa","Type":"ContainerDied","Data":"ca817fa8951363559fb03159314f1480b01d908b23e201c75b62c497088caa3f"} Dec 02 18:18:06 crc kubenswrapper[4878]: E1202 18:18:06.636539 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-v2c8h" podUID="e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8" Dec 02 18:18:06 crc kubenswrapper[4878]: E1202 18:18:06.636545 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-sb7zn" podUID="75d2e014-a578-4394-969c-109c2a260296" Dec 02 18:18:06 crc kubenswrapper[4878]: E1202 18:18:06.637038 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-cdp4z" podUID="bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd" Dec 02 18:18:06 crc kubenswrapper[4878]: E1202 18:18:06.644282 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-marketplace-5hp2k" podUID="c44c93ec-f58d-410e-8d64-1888c470cffe" Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.657400 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=4.657376292 podStartE2EDuration="4.657376292s" podCreationTimestamp="2025-12-02 18:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:18:06.655328013 +0000 UTC m=+196.344946904" watchObservedRunningTime="2025-12-02 18:18:06.657376292 +0000 UTC m=+196.346995173" Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.671842 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.672989 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.683052 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.740814 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cc4e7b9e-998b-4eae-aa91-c20228510717-var-lock\") pod \"installer-9-crc\" (UID: \"cc4e7b9e-998b-4eae-aa91-c20228510717\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.740866 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc4e7b9e-998b-4eae-aa91-c20228510717-kube-api-access\") pod \"installer-9-crc\" (UID: \"cc4e7b9e-998b-4eae-aa91-c20228510717\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 18:18:06 crc 
kubenswrapper[4878]: I1202 18:18:06.740915 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc4e7b9e-998b-4eae-aa91-c20228510717-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cc4e7b9e-998b-4eae-aa91-c20228510717\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.841467 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc4e7b9e-998b-4eae-aa91-c20228510717-kube-api-access\") pod \"installer-9-crc\" (UID: \"cc4e7b9e-998b-4eae-aa91-c20228510717\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.841539 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc4e7b9e-998b-4eae-aa91-c20228510717-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cc4e7b9e-998b-4eae-aa91-c20228510717\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.841621 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cc4e7b9e-998b-4eae-aa91-c20228510717-var-lock\") pod \"installer-9-crc\" (UID: \"cc4e7b9e-998b-4eae-aa91-c20228510717\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.841698 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cc4e7b9e-998b-4eae-aa91-c20228510717-var-lock\") pod \"installer-9-crc\" (UID: \"cc4e7b9e-998b-4eae-aa91-c20228510717\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.841738 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/cc4e7b9e-998b-4eae-aa91-c20228510717-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cc4e7b9e-998b-4eae-aa91-c20228510717\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.862907 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc4e7b9e-998b-4eae-aa91-c20228510717-kube-api-access\") pod \"installer-9-crc\" (UID: \"cc4e7b9e-998b-4eae-aa91-c20228510717\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 18:18:06 crc kubenswrapper[4878]: I1202 18:18:06.998401 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 18:18:07 crc kubenswrapper[4878]: I1202 18:18:07.450202 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 18:18:07 crc kubenswrapper[4878]: I1202 18:18:07.641705 4878 generic.go:334] "Generic (PLEG): container finished" podID="c410f69d-3f96-47f4-bfa9-ca8728f17056" containerID="fe85f6903f0a3bc8e15adf3c87070fb30919cca4390fa0bf15d430b4270b38ba" exitCode=0 Dec 02 18:18:07 crc kubenswrapper[4878]: I1202 18:18:07.641771 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c410f69d-3f96-47f4-bfa9-ca8728f17056","Type":"ContainerDied","Data":"fe85f6903f0a3bc8e15adf3c87070fb30919cca4390fa0bf15d430b4270b38ba"} Dec 02 18:18:07 crc kubenswrapper[4878]: I1202 18:18:07.643571 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55w98" event={"ID":"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa","Type":"ContainerStarted","Data":"ccdf2f94771e213ef9d66e94e14e9ae1feda3eb1bec4d1e1b6bb650d538e05ba"} Dec 02 18:18:07 crc kubenswrapper[4878]: I1202 18:18:07.645678 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-kwf9n" event={"ID":"e80c5f56-57a3-4778-8382-473cd7678252","Type":"ContainerStarted","Data":"ab9ef5d41dec1b3f0042d9140829b692e172af30bebcfc7dda5c9c8612d7cadf"} Dec 02 18:18:07 crc kubenswrapper[4878]: I1202 18:18:07.647291 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dlwt8" event={"ID":"09adc15b-14dd-4a05-b569-4168b9ced169","Type":"ContainerStarted","Data":"17a9055188053d0ce21152edfa847ffbb567a28bdf9135335013a7c1c48f0a1d"} Dec 02 18:18:07 crc kubenswrapper[4878]: I1202 18:18:07.647892 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cc4e7b9e-998b-4eae-aa91-c20228510717","Type":"ContainerStarted","Data":"dc77a7db2dfdf339994ba86b35e26c92eef90ef49ebf83460a016d72cdf7a1b0"} Dec 02 18:18:07 crc kubenswrapper[4878]: I1202 18:18:07.683319 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-55w98" podStartSLOduration=3.49621936 podStartE2EDuration="45.683297927s" podCreationTimestamp="2025-12-02 18:17:22 +0000 UTC" firstStartedPulling="2025-12-02 18:17:24.9885776 +0000 UTC m=+154.678196481" lastFinishedPulling="2025-12-02 18:18:07.175656167 +0000 UTC m=+196.865275048" observedRunningTime="2025-12-02 18:18:07.681814378 +0000 UTC m=+197.371433259" watchObservedRunningTime="2025-12-02 18:18:07.683297927 +0000 UTC m=+197.372916818" Dec 02 18:18:07 crc kubenswrapper[4878]: I1202 18:18:07.703210 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kwf9n" podStartSLOduration=3.618374858 podStartE2EDuration="45.703187956s" podCreationTimestamp="2025-12-02 18:17:22 +0000 UTC" firstStartedPulling="2025-12-02 18:17:24.997557697 +0000 UTC m=+154.687176578" lastFinishedPulling="2025-12-02 18:18:07.082370795 +0000 UTC m=+196.771989676" observedRunningTime="2025-12-02 18:18:07.698974936 +0000 
UTC m=+197.388593817" watchObservedRunningTime="2025-12-02 18:18:07.703187956 +0000 UTC m=+197.392806837" Dec 02 18:18:07 crc kubenswrapper[4878]: I1202 18:18:07.715198 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-dlwt8" podStartSLOduration=173.715179613 podStartE2EDuration="2m53.715179613s" podCreationTimestamp="2025-12-02 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:18:07.714124549 +0000 UTC m=+197.403743420" watchObservedRunningTime="2025-12-02 18:18:07.715179613 +0000 UTC m=+197.404798484" Dec 02 18:18:08 crc kubenswrapper[4878]: I1202 18:18:08.662177 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cc4e7b9e-998b-4eae-aa91-c20228510717","Type":"ContainerStarted","Data":"2220f4d4a23db0f367782190a1fa537a938e0e0e97762631e7620631c3fe45b4"} Dec 02 18:18:09 crc kubenswrapper[4878]: I1202 18:18:09.056550 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 18:18:09 crc kubenswrapper[4878]: I1202 18:18:09.069423 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.069401665 podStartE2EDuration="3.069401665s" podCreationTimestamp="2025-12-02 18:18:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:18:08.682335629 +0000 UTC m=+198.371954510" watchObservedRunningTime="2025-12-02 18:18:09.069401665 +0000 UTC m=+198.759020546" Dec 02 18:18:09 crc kubenswrapper[4878]: I1202 18:18:09.072412 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wvrhn"] Dec 02 18:18:09 crc kubenswrapper[4878]: I1202 18:18:09.076880 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c410f69d-3f96-47f4-bfa9-ca8728f17056-kube-api-access\") pod \"c410f69d-3f96-47f4-bfa9-ca8728f17056\" (UID: \"c410f69d-3f96-47f4-bfa9-ca8728f17056\") " Dec 02 18:18:09 crc kubenswrapper[4878]: I1202 18:18:09.077049 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c410f69d-3f96-47f4-bfa9-ca8728f17056-kubelet-dir\") pod \"c410f69d-3f96-47f4-bfa9-ca8728f17056\" (UID: \"c410f69d-3f96-47f4-bfa9-ca8728f17056\") " Dec 02 18:18:09 crc kubenswrapper[4878]: I1202 18:18:09.077170 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c410f69d-3f96-47f4-bfa9-ca8728f17056-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c410f69d-3f96-47f4-bfa9-ca8728f17056" (UID: "c410f69d-3f96-47f4-bfa9-ca8728f17056"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:18:09 crc kubenswrapper[4878]: I1202 18:18:09.077308 4878 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c410f69d-3f96-47f4-bfa9-ca8728f17056-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:09 crc kubenswrapper[4878]: I1202 18:18:09.083450 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c410f69d-3f96-47f4-bfa9-ca8728f17056-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c410f69d-3f96-47f4-bfa9-ca8728f17056" (UID: "c410f69d-3f96-47f4-bfa9-ca8728f17056"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:18:09 crc kubenswrapper[4878]: I1202 18:18:09.178920 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c410f69d-3f96-47f4-bfa9-ca8728f17056-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:09 crc kubenswrapper[4878]: I1202 18:18:09.669342 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c410f69d-3f96-47f4-bfa9-ca8728f17056","Type":"ContainerDied","Data":"ed1c9d325269ca665a843d0af18779c4650167f3e156ad020c45ba7a80e252d5"} Dec 02 18:18:09 crc kubenswrapper[4878]: I1202 18:18:09.669402 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed1c9d325269ca665a843d0af18779c4650167f3e156ad020c45ba7a80e252d5" Dec 02 18:18:09 crc kubenswrapper[4878]: I1202 18:18:09.669420 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 18:18:12 crc kubenswrapper[4878]: I1202 18:18:12.575353 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kwf9n" Dec 02 18:18:12 crc kubenswrapper[4878]: I1202 18:18:12.575892 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kwf9n" Dec 02 18:18:12 crc kubenswrapper[4878]: I1202 18:18:12.961166 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-55w98" Dec 02 18:18:12 crc kubenswrapper[4878]: I1202 18:18:12.961211 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-55w98" Dec 02 18:18:13 crc kubenswrapper[4878]: I1202 18:18:13.333887 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-55w98" Dec 02 18:18:13 crc kubenswrapper[4878]: I1202 18:18:13.334034 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kwf9n" Dec 02 18:18:13 crc kubenswrapper[4878]: I1202 18:18:13.390846 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kwf9n" Dec 02 18:18:13 crc kubenswrapper[4878]: I1202 18:18:13.732655 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-55w98" Dec 02 18:18:14 crc kubenswrapper[4878]: I1202 18:18:14.854409 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-55w98"] Dec 02 18:18:15 crc kubenswrapper[4878]: I1202 18:18:15.702707 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-55w98" 
podUID="85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa" containerName="registry-server" containerID="cri-o://ccdf2f94771e213ef9d66e94e14e9ae1feda3eb1bec4d1e1b6bb650d538e05ba" gracePeriod=2 Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.361466 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-55w98" Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.378886 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa-catalog-content\") pod \"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa\" (UID: \"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa\") " Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.378999 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmj7r\" (UniqueName: \"kubernetes.io/projected/85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa-kube-api-access-nmj7r\") pod \"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa\" (UID: \"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa\") " Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.379123 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa-utilities\") pod \"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa\" (UID: \"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa\") " Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.380308 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa-utilities" (OuterVolumeSpecName: "utilities") pod "85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa" (UID: "85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.385994 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa-kube-api-access-nmj7r" (OuterVolumeSpecName: "kube-api-access-nmj7r") pod "85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa" (UID: "85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa"). InnerVolumeSpecName "kube-api-access-nmj7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.441029 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa" (UID: "85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.481418 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.481486 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.481514 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmj7r\" (UniqueName: \"kubernetes.io/projected/85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa-kube-api-access-nmj7r\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.710044 4878 generic.go:334] "Generic (PLEG): container finished" podID="85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa" 
containerID="ccdf2f94771e213ef9d66e94e14e9ae1feda3eb1bec4d1e1b6bb650d538e05ba" exitCode=0 Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.710089 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55w98" event={"ID":"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa","Type":"ContainerDied","Data":"ccdf2f94771e213ef9d66e94e14e9ae1feda3eb1bec4d1e1b6bb650d538e05ba"} Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.710117 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-55w98" Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.710140 4878 scope.go:117] "RemoveContainer" containerID="ccdf2f94771e213ef9d66e94e14e9ae1feda3eb1bec4d1e1b6bb650d538e05ba" Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.710126 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55w98" event={"ID":"85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa","Type":"ContainerDied","Data":"385c48a7c5342455716b6224881583255dad67f2d82585683dd3d1292aba61c4"} Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.728881 4878 scope.go:117] "RemoveContainer" containerID="ca817fa8951363559fb03159314f1480b01d908b23e201c75b62c497088caa3f" Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.742287 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-55w98"] Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.747790 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-55w98"] Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.777848 4878 scope.go:117] "RemoveContainer" containerID="e2356328ff7de8c147ca517657ee980b6f970ba9f6c370539c717bddc162a7de" Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.794255 4878 scope.go:117] "RemoveContainer" containerID="ccdf2f94771e213ef9d66e94e14e9ae1feda3eb1bec4d1e1b6bb650d538e05ba" Dec 02 
18:18:16 crc kubenswrapper[4878]: E1202 18:18:16.795106 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccdf2f94771e213ef9d66e94e14e9ae1feda3eb1bec4d1e1b6bb650d538e05ba\": container with ID starting with ccdf2f94771e213ef9d66e94e14e9ae1feda3eb1bec4d1e1b6bb650d538e05ba not found: ID does not exist" containerID="ccdf2f94771e213ef9d66e94e14e9ae1feda3eb1bec4d1e1b6bb650d538e05ba" Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.795163 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccdf2f94771e213ef9d66e94e14e9ae1feda3eb1bec4d1e1b6bb650d538e05ba"} err="failed to get container status \"ccdf2f94771e213ef9d66e94e14e9ae1feda3eb1bec4d1e1b6bb650d538e05ba\": rpc error: code = NotFound desc = could not find container \"ccdf2f94771e213ef9d66e94e14e9ae1feda3eb1bec4d1e1b6bb650d538e05ba\": container with ID starting with ccdf2f94771e213ef9d66e94e14e9ae1feda3eb1bec4d1e1b6bb650d538e05ba not found: ID does not exist" Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.795261 4878 scope.go:117] "RemoveContainer" containerID="ca817fa8951363559fb03159314f1480b01d908b23e201c75b62c497088caa3f" Dec 02 18:18:16 crc kubenswrapper[4878]: E1202 18:18:16.795775 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca817fa8951363559fb03159314f1480b01d908b23e201c75b62c497088caa3f\": container with ID starting with ca817fa8951363559fb03159314f1480b01d908b23e201c75b62c497088caa3f not found: ID does not exist" containerID="ca817fa8951363559fb03159314f1480b01d908b23e201c75b62c497088caa3f" Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.795812 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca817fa8951363559fb03159314f1480b01d908b23e201c75b62c497088caa3f"} err="failed to get container status 
\"ca817fa8951363559fb03159314f1480b01d908b23e201c75b62c497088caa3f\": rpc error: code = NotFound desc = could not find container \"ca817fa8951363559fb03159314f1480b01d908b23e201c75b62c497088caa3f\": container with ID starting with ca817fa8951363559fb03159314f1480b01d908b23e201c75b62c497088caa3f not found: ID does not exist" Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.795832 4878 scope.go:117] "RemoveContainer" containerID="e2356328ff7de8c147ca517657ee980b6f970ba9f6c370539c717bddc162a7de" Dec 02 18:18:16 crc kubenswrapper[4878]: E1202 18:18:16.796350 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2356328ff7de8c147ca517657ee980b6f970ba9f6c370539c717bddc162a7de\": container with ID starting with e2356328ff7de8c147ca517657ee980b6f970ba9f6c370539c717bddc162a7de not found: ID does not exist" containerID="e2356328ff7de8c147ca517657ee980b6f970ba9f6c370539c717bddc162a7de" Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.796422 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2356328ff7de8c147ca517657ee980b6f970ba9f6c370539c717bddc162a7de"} err="failed to get container status \"e2356328ff7de8c147ca517657ee980b6f970ba9f6c370539c717bddc162a7de\": rpc error: code = NotFound desc = could not find container \"e2356328ff7de8c147ca517657ee980b6f970ba9f6c370539c717bddc162a7de\": container with ID starting with e2356328ff7de8c147ca517657ee980b6f970ba9f6c370539c717bddc162a7de not found: ID does not exist" Dec 02 18:18:16 crc kubenswrapper[4878]: I1202 18:18:16.959768 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa" path="/var/lib/kubelet/pods/85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa/volumes" Dec 02 18:18:18 crc kubenswrapper[4878]: I1202 18:18:18.729004 4878 generic.go:334] "Generic (PLEG): container finished" podID="c44c93ec-f58d-410e-8d64-1888c470cffe" 
containerID="27bba871958df83fb86d9c7af464fe199b59c85f7eae81af9cdf8c2aa24d77e4" exitCode=0 Dec 02 18:18:18 crc kubenswrapper[4878]: I1202 18:18:18.729471 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hp2k" event={"ID":"c44c93ec-f58d-410e-8d64-1888c470cffe","Type":"ContainerDied","Data":"27bba871958df83fb86d9c7af464fe199b59c85f7eae81af9cdf8c2aa24d77e4"} Dec 02 18:18:19 crc kubenswrapper[4878]: I1202 18:18:19.739589 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hp2k" event={"ID":"c44c93ec-f58d-410e-8d64-1888c470cffe","Type":"ContainerStarted","Data":"9dfd046b0eb04d24558b0196278d64cdabe6e283c68a5454c863acfb18f0d959"} Dec 02 18:18:19 crc kubenswrapper[4878]: I1202 18:18:19.743285 4878 generic.go:334] "Generic (PLEG): container finished" podID="bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd" containerID="7b81eafded7cfdfee0ba6c06461465f5e89793cef07b223845ed91e1b3187c9d" exitCode=0 Dec 02 18:18:19 crc kubenswrapper[4878]: I1202 18:18:19.743348 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdp4z" event={"ID":"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd","Type":"ContainerDied","Data":"7b81eafded7cfdfee0ba6c06461465f5e89793cef07b223845ed91e1b3187c9d"} Dec 02 18:18:19 crc kubenswrapper[4878]: I1202 18:18:19.745859 4878 generic.go:334] "Generic (PLEG): container finished" podID="e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8" containerID="6656d76310ea3083e0912302a18a81128e753210f7305fc7e5f115c07d32185a" exitCode=0 Dec 02 18:18:19 crc kubenswrapper[4878]: I1202 18:18:19.745951 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2c8h" event={"ID":"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8","Type":"ContainerDied","Data":"6656d76310ea3083e0912302a18a81128e753210f7305fc7e5f115c07d32185a"} Dec 02 18:18:19 crc kubenswrapper[4878]: I1202 18:18:19.748324 4878 generic.go:334] "Generic (PLEG): container 
finished" podID="75d2e014-a578-4394-969c-109c2a260296" containerID="d9730355b9e8f808f0b72c022c8230b7d633a7b2036bc72e1f83efcaab62f932" exitCode=0 Dec 02 18:18:19 crc kubenswrapper[4878]: I1202 18:18:19.748400 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb7zn" event={"ID":"75d2e014-a578-4394-969c-109c2a260296","Type":"ContainerDied","Data":"d9730355b9e8f808f0b72c022c8230b7d633a7b2036bc72e1f83efcaab62f932"} Dec 02 18:18:19 crc kubenswrapper[4878]: I1202 18:18:19.759510 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5hp2k" podStartSLOduration=2.664351297 podStartE2EDuration="55.759486481s" podCreationTimestamp="2025-12-02 18:17:24 +0000 UTC" firstStartedPulling="2025-12-02 18:17:26.152387003 +0000 UTC m=+155.842005884" lastFinishedPulling="2025-12-02 18:18:19.247522197 +0000 UTC m=+208.937141068" observedRunningTime="2025-12-02 18:18:19.758404315 +0000 UTC m=+209.448023206" watchObservedRunningTime="2025-12-02 18:18:19.759486481 +0000 UTC m=+209.449105362" Dec 02 18:18:19 crc kubenswrapper[4878]: I1202 18:18:19.763501 4878 generic.go:334] "Generic (PLEG): container finished" podID="c3d67205-fea8-475d-b3da-bd4fc55a58c4" containerID="0629ebb673a14bf094bd667066d2bd3788746440a908e4104f16e1332bcf7ad0" exitCode=0 Dec 02 18:18:19 crc kubenswrapper[4878]: I1202 18:18:19.763557 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7m7gk" event={"ID":"c3d67205-fea8-475d-b3da-bd4fc55a58c4","Type":"ContainerDied","Data":"0629ebb673a14bf094bd667066d2bd3788746440a908e4104f16e1332bcf7ad0"} Dec 02 18:18:20 crc kubenswrapper[4878]: I1202 18:18:20.776456 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb7zn" event={"ID":"75d2e014-a578-4394-969c-109c2a260296","Type":"ContainerStarted","Data":"2570250897e42d2c32efa1125e626614c36dc21001f7c2f484724d8b36f5ea8a"} 
Dec 02 18:18:20 crc kubenswrapper[4878]: I1202 18:18:20.779809 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7m7gk" event={"ID":"c3d67205-fea8-475d-b3da-bd4fc55a58c4","Type":"ContainerStarted","Data":"6951d51d80c8fdd4f2edb0740e074a13d1358c666025922626cff77a47fe77d2"} Dec 02 18:18:20 crc kubenswrapper[4878]: I1202 18:18:20.783469 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdp4z" event={"ID":"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd","Type":"ContainerStarted","Data":"870233d9a4b555255aa70cb744a4fb1f07236f1995fc0cf2197f17ee1133a1c6"} Dec 02 18:18:20 crc kubenswrapper[4878]: I1202 18:18:20.786476 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2c8h" event={"ID":"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8","Type":"ContainerStarted","Data":"19899a9328cfe34054cb3ed9596b9fad20af11c32473723375c1644424d79ebd"} Dec 02 18:18:20 crc kubenswrapper[4878]: I1202 18:18:20.801690 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sb7zn" podStartSLOduration=3.717821055 podStartE2EDuration="58.801662542s" podCreationTimestamp="2025-12-02 18:17:22 +0000 UTC" firstStartedPulling="2025-12-02 18:17:25.129869694 +0000 UTC m=+154.819488585" lastFinishedPulling="2025-12-02 18:18:20.213711191 +0000 UTC m=+209.903330072" observedRunningTime="2025-12-02 18:18:20.798519438 +0000 UTC m=+210.488138319" watchObservedRunningTime="2025-12-02 18:18:20.801662542 +0000 UTC m=+210.491281423" Dec 02 18:18:20 crc kubenswrapper[4878]: I1202 18:18:20.823033 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cdp4z" podStartSLOduration=3.773886624 podStartE2EDuration="58.823011809s" podCreationTimestamp="2025-12-02 18:17:22 +0000 UTC" firstStartedPulling="2025-12-02 18:17:25.072563187 +0000 UTC m=+154.762182068" 
lastFinishedPulling="2025-12-02 18:18:20.121688362 +0000 UTC m=+209.811307253" observedRunningTime="2025-12-02 18:18:20.821302113 +0000 UTC m=+210.510921004" watchObservedRunningTime="2025-12-02 18:18:20.823011809 +0000 UTC m=+210.512630690" Dec 02 18:18:20 crc kubenswrapper[4878]: I1202 18:18:20.853131 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7m7gk" podStartSLOduration=3.835574897 podStartE2EDuration="57.853098187s" podCreationTimestamp="2025-12-02 18:17:23 +0000 UTC" firstStartedPulling="2025-12-02 18:17:26.1478789 +0000 UTC m=+155.837497781" lastFinishedPulling="2025-12-02 18:18:20.16540218 +0000 UTC m=+209.855021071" observedRunningTime="2025-12-02 18:18:20.852494557 +0000 UTC m=+210.542113448" watchObservedRunningTime="2025-12-02 18:18:20.853098187 +0000 UTC m=+210.542717058" Dec 02 18:18:20 crc kubenswrapper[4878]: I1202 18:18:20.892783 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v2c8h" podStartSLOduration=2.946886196 podStartE2EDuration="55.89276355s" podCreationTimestamp="2025-12-02 18:17:25 +0000 UTC" firstStartedPulling="2025-12-02 18:17:27.188745586 +0000 UTC m=+156.878364467" lastFinishedPulling="2025-12-02 18:18:20.13462292 +0000 UTC m=+209.824241821" observedRunningTime="2025-12-02 18:18:20.890877878 +0000 UTC m=+210.580496749" watchObservedRunningTime="2025-12-02 18:18:20.89276355 +0000 UTC m=+210.582382431" Dec 02 18:18:21 crc kubenswrapper[4878]: I1202 18:18:21.795076 4878 generic.go:334] "Generic (PLEG): container finished" podID="df758941-afd5-4770-b93b-001f267dfcbf" containerID="b16427d05159326f3996f4c941d38d633470237868aca71823180d9adf707777" exitCode=0 Dec 02 18:18:21 crc kubenswrapper[4878]: I1202 18:18:21.795206 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8qts" 
event={"ID":"df758941-afd5-4770-b93b-001f267dfcbf","Type":"ContainerDied","Data":"b16427d05159326f3996f4c941d38d633470237868aca71823180d9adf707777"} Dec 02 18:18:22 crc kubenswrapper[4878]: I1202 18:18:22.378931 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sb7zn" Dec 02 18:18:22 crc kubenswrapper[4878]: I1202 18:18:22.379227 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sb7zn" Dec 02 18:18:22 crc kubenswrapper[4878]: I1202 18:18:22.765130 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cdp4z" Dec 02 18:18:22 crc kubenswrapper[4878]: I1202 18:18:22.765215 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cdp4z" Dec 02 18:18:22 crc kubenswrapper[4878]: I1202 18:18:22.821793 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cdp4z" Dec 02 18:18:23 crc kubenswrapper[4878]: I1202 18:18:23.423018 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-sb7zn" podUID="75d2e014-a578-4394-969c-109c2a260296" containerName="registry-server" probeResult="failure" output=< Dec 02 18:18:23 crc kubenswrapper[4878]: timeout: failed to connect service ":50051" within 1s Dec 02 18:18:23 crc kubenswrapper[4878]: > Dec 02 18:18:23 crc kubenswrapper[4878]: I1202 18:18:23.741964 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:18:23 crc kubenswrapper[4878]: I1202 18:18:23.742077 4878 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:18:23 crc kubenswrapper[4878]: I1202 18:18:23.742159 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:18:23 crc kubenswrapper[4878]: I1202 18:18:23.743066 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b7709f4d969504362d28f3f061837bbc41e477c91ac9e2a7abacbce612a1aa73"} pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 18:18:23 crc kubenswrapper[4878]: I1202 18:18:23.743184 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://b7709f4d969504362d28f3f061837bbc41e477c91ac9e2a7abacbce612a1aa73" gracePeriod=600 Dec 02 18:18:24 crc kubenswrapper[4878]: I1202 18:18:24.454426 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7m7gk" Dec 02 18:18:24 crc kubenswrapper[4878]: I1202 18:18:24.454824 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7m7gk" Dec 02 18:18:24 crc kubenswrapper[4878]: I1202 18:18:24.513819 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7m7gk" Dec 02 18:18:24 crc kubenswrapper[4878]: I1202 18:18:24.763603 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5hp2k" Dec 02 18:18:24 crc kubenswrapper[4878]: I1202 18:18:24.763681 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5hp2k" Dec 02 18:18:24 crc kubenswrapper[4878]: I1202 18:18:24.809689 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5hp2k" Dec 02 18:18:24 crc kubenswrapper[4878]: I1202 18:18:24.813655 4878 generic.go:334] "Generic (PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" containerID="b7709f4d969504362d28f3f061837bbc41e477c91ac9e2a7abacbce612a1aa73" exitCode=0 Dec 02 18:18:24 crc kubenswrapper[4878]: I1202 18:18:24.813742 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"b7709f4d969504362d28f3f061837bbc41e477c91ac9e2a7abacbce612a1aa73"} Dec 02 18:18:24 crc kubenswrapper[4878]: I1202 18:18:24.858073 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5hp2k" Dec 02 18:18:25 crc kubenswrapper[4878]: I1202 18:18:25.777654 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v2c8h" Dec 02 18:18:25 crc kubenswrapper[4878]: I1202 18:18:25.777735 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v2c8h" Dec 02 18:18:25 crc kubenswrapper[4878]: I1202 18:18:25.854116 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7m7gk" Dec 02 18:18:26 crc kubenswrapper[4878]: I1202 18:18:26.822777 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v2c8h" 
podUID="e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8" containerName="registry-server" probeResult="failure" output=< Dec 02 18:18:26 crc kubenswrapper[4878]: timeout: failed to connect service ":50051" within 1s Dec 02 18:18:26 crc kubenswrapper[4878]: > Dec 02 18:18:27 crc kubenswrapper[4878]: I1202 18:18:27.254534 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hp2k"] Dec 02 18:18:27 crc kubenswrapper[4878]: I1202 18:18:27.254853 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5hp2k" podUID="c44c93ec-f58d-410e-8d64-1888c470cffe" containerName="registry-server" containerID="cri-o://9dfd046b0eb04d24558b0196278d64cdabe6e283c68a5454c863acfb18f0d959" gracePeriod=2 Dec 02 18:18:28 crc kubenswrapper[4878]: I1202 18:18:28.841319 4878 generic.go:334] "Generic (PLEG): container finished" podID="c44c93ec-f58d-410e-8d64-1888c470cffe" containerID="9dfd046b0eb04d24558b0196278d64cdabe6e283c68a5454c863acfb18f0d959" exitCode=0 Dec 02 18:18:28 crc kubenswrapper[4878]: I1202 18:18:28.841702 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hp2k" event={"ID":"c44c93ec-f58d-410e-8d64-1888c470cffe","Type":"ContainerDied","Data":"9dfd046b0eb04d24558b0196278d64cdabe6e283c68a5454c863acfb18f0d959"} Dec 02 18:18:29 crc kubenswrapper[4878]: I1202 18:18:29.777939 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5hp2k" Dec 02 18:18:29 crc kubenswrapper[4878]: I1202 18:18:29.850316 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5hp2k" event={"ID":"c44c93ec-f58d-410e-8d64-1888c470cffe","Type":"ContainerDied","Data":"a44fe74f386affb8d095ecfaf04695f4b43dcf5e6cc12ad6205b72d90aaeac94"} Dec 02 18:18:29 crc kubenswrapper[4878]: I1202 18:18:29.850400 4878 scope.go:117] "RemoveContainer" containerID="9dfd046b0eb04d24558b0196278d64cdabe6e283c68a5454c863acfb18f0d959" Dec 02 18:18:29 crc kubenswrapper[4878]: I1202 18:18:29.850551 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5hp2k" Dec 02 18:18:29 crc kubenswrapper[4878]: I1202 18:18:29.855962 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8qts" event={"ID":"df758941-afd5-4770-b93b-001f267dfcbf","Type":"ContainerStarted","Data":"1e05e505304b332c65f6070744b3304a10e03c7e86b2d8ceb9cc8db3670a801d"} Dec 02 18:18:29 crc kubenswrapper[4878]: I1202 18:18:29.858897 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"4a12b6a48d5fa299bcb38ae1b9a61925e1420e83b36412a67e4692078c7172bd"} Dec 02 18:18:29 crc kubenswrapper[4878]: I1202 18:18:29.868874 4878 scope.go:117] "RemoveContainer" containerID="27bba871958df83fb86d9c7af464fe199b59c85f7eae81af9cdf8c2aa24d77e4" Dec 02 18:18:29 crc kubenswrapper[4878]: I1202 18:18:29.884874 4878 scope.go:117] "RemoveContainer" containerID="51d6562bb9b60008a914dab28a9f77739ce9cc88c8027bcf2104a26e3628d92c" Dec 02 18:18:29 crc kubenswrapper[4878]: I1202 18:18:29.974853 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wk5k\" (UniqueName: 
\"kubernetes.io/projected/c44c93ec-f58d-410e-8d64-1888c470cffe-kube-api-access-9wk5k\") pod \"c44c93ec-f58d-410e-8d64-1888c470cffe\" (UID: \"c44c93ec-f58d-410e-8d64-1888c470cffe\") " Dec 02 18:18:29 crc kubenswrapper[4878]: I1202 18:18:29.974932 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c44c93ec-f58d-410e-8d64-1888c470cffe-catalog-content\") pod \"c44c93ec-f58d-410e-8d64-1888c470cffe\" (UID: \"c44c93ec-f58d-410e-8d64-1888c470cffe\") " Dec 02 18:18:29 crc kubenswrapper[4878]: I1202 18:18:29.975006 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c44c93ec-f58d-410e-8d64-1888c470cffe-utilities\") pod \"c44c93ec-f58d-410e-8d64-1888c470cffe\" (UID: \"c44c93ec-f58d-410e-8d64-1888c470cffe\") " Dec 02 18:18:29 crc kubenswrapper[4878]: I1202 18:18:29.976530 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c44c93ec-f58d-410e-8d64-1888c470cffe-utilities" (OuterVolumeSpecName: "utilities") pod "c44c93ec-f58d-410e-8d64-1888c470cffe" (UID: "c44c93ec-f58d-410e-8d64-1888c470cffe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:18:29 crc kubenswrapper[4878]: I1202 18:18:29.987942 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c44c93ec-f58d-410e-8d64-1888c470cffe-kube-api-access-9wk5k" (OuterVolumeSpecName: "kube-api-access-9wk5k") pod "c44c93ec-f58d-410e-8d64-1888c470cffe" (UID: "c44c93ec-f58d-410e-8d64-1888c470cffe"). InnerVolumeSpecName "kube-api-access-9wk5k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:18:30 crc kubenswrapper[4878]: I1202 18:18:30.017160 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c44c93ec-f58d-410e-8d64-1888c470cffe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c44c93ec-f58d-410e-8d64-1888c470cffe" (UID: "c44c93ec-f58d-410e-8d64-1888c470cffe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:18:30 crc kubenswrapper[4878]: I1202 18:18:30.076631 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wk5k\" (UniqueName: \"kubernetes.io/projected/c44c93ec-f58d-410e-8d64-1888c470cffe-kube-api-access-9wk5k\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:30 crc kubenswrapper[4878]: I1202 18:18:30.076664 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c44c93ec-f58d-410e-8d64-1888c470cffe-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:30 crc kubenswrapper[4878]: I1202 18:18:30.076673 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c44c93ec-f58d-410e-8d64-1888c470cffe-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:30 crc kubenswrapper[4878]: I1202 18:18:30.187904 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hp2k"] Dec 02 18:18:30 crc kubenswrapper[4878]: I1202 18:18:30.195801 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5hp2k"] Dec 02 18:18:30 crc kubenswrapper[4878]: I1202 18:18:30.897120 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n8qts" podStartSLOduration=5.093928702 podStartE2EDuration="1m5.897099866s" podCreationTimestamp="2025-12-02 18:17:25 +0000 UTC" firstStartedPulling="2025-12-02 
18:17:27.212631432 +0000 UTC m=+156.902250313" lastFinishedPulling="2025-12-02 18:18:28.015802586 +0000 UTC m=+217.705421477" observedRunningTime="2025-12-02 18:18:30.89361758 +0000 UTC m=+220.583236461" watchObservedRunningTime="2025-12-02 18:18:30.897099866 +0000 UTC m=+220.586718757" Dec 02 18:18:30 crc kubenswrapper[4878]: I1202 18:18:30.947775 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c44c93ec-f58d-410e-8d64-1888c470cffe" path="/var/lib/kubelet/pods/c44c93ec-f58d-410e-8d64-1888c470cffe/volumes" Dec 02 18:18:32 crc kubenswrapper[4878]: I1202 18:18:32.453164 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sb7zn" Dec 02 18:18:32 crc kubenswrapper[4878]: I1202 18:18:32.518835 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sb7zn" Dec 02 18:18:32 crc kubenswrapper[4878]: I1202 18:18:32.814058 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cdp4z" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.118806 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" podUID="ef66151f-c39c-4f4d-bbc4-8c86555e41ea" containerName="oauth-openshift" containerID="cri-o://c819d0674e2029b7249d1720e0e41b23a4bf04ca21c5c5a8cc6def2258dfe2a0" gracePeriod=15 Dec 02 18:18:34 crc kubenswrapper[4878]: E1202 18:18:34.205041 4878 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef66151f_c39c_4f4d_bbc4_8c86555e41ea.slice/crio-c819d0674e2029b7249d1720e0e41b23a4bf04ca21c5c5a8cc6def2258dfe2a0.scope\": RecentStats: unable to find data in memory cache]" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.557057 4878 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.601935 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-76f84477b-qthqg"] Dec 02 18:18:34 crc kubenswrapper[4878]: E1202 18:18:34.602281 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa" containerName="registry-server" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.602298 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa" containerName="registry-server" Dec 02 18:18:34 crc kubenswrapper[4878]: E1202 18:18:34.602308 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c44c93ec-f58d-410e-8d64-1888c470cffe" containerName="extract-content" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.602315 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="c44c93ec-f58d-410e-8d64-1888c470cffe" containerName="extract-content" Dec 02 18:18:34 crc kubenswrapper[4878]: E1202 18:18:34.602323 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa" containerName="extract-utilities" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.602329 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa" containerName="extract-utilities" Dec 02 18:18:34 crc kubenswrapper[4878]: E1202 18:18:34.602343 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c410f69d-3f96-47f4-bfa9-ca8728f17056" containerName="pruner" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.602349 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="c410f69d-3f96-47f4-bfa9-ca8728f17056" containerName="pruner" Dec 02 18:18:34 crc kubenswrapper[4878]: E1202 18:18:34.602358 4878 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c44c93ec-f58d-410e-8d64-1888c470cffe" containerName="extract-utilities" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.602363 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="c44c93ec-f58d-410e-8d64-1888c470cffe" containerName="extract-utilities" Dec 02 18:18:34 crc kubenswrapper[4878]: E1202 18:18:34.602376 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c44c93ec-f58d-410e-8d64-1888c470cffe" containerName="registry-server" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.602383 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="c44c93ec-f58d-410e-8d64-1888c470cffe" containerName="registry-server" Dec 02 18:18:34 crc kubenswrapper[4878]: E1202 18:18:34.602394 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef66151f-c39c-4f4d-bbc4-8c86555e41ea" containerName="oauth-openshift" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.602401 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef66151f-c39c-4f4d-bbc4-8c86555e41ea" containerName="oauth-openshift" Dec 02 18:18:34 crc kubenswrapper[4878]: E1202 18:18:34.602410 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa" containerName="extract-content" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.602416 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa" containerName="extract-content" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.602512 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="c410f69d-3f96-47f4-bfa9-ca8728f17056" containerName="pruner" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.602523 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="c44c93ec-f58d-410e-8d64-1888c470cffe" containerName="registry-server" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.602531 4878 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ef66151f-c39c-4f4d-bbc4-8c86555e41ea" containerName="oauth-openshift" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.602541 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d4afe8-89af-4ad1-8d8a-b6f28fbe68fa" containerName="registry-server" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.603026 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.661624 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-serving-cert\") pod \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.661720 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-audit-policies\") pod \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.661757 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp7tg\" (UniqueName: \"kubernetes.io/projected/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-kube-api-access-fp7tg\") pod \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.661810 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-router-certs\") pod \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " Dec 02 18:18:34 crc 
kubenswrapper[4878]: I1202 18:18:34.661845 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-ocp-branding-template\") pod \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.661885 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-template-error\") pod \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.661927 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-idp-0-file-data\") pod \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.661987 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-template-login\") pod \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.662027 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-cliconfig\") pod \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.662076 4878 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-service-ca\") pod \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.662114 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-template-provider-selection\") pod \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.662184 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-audit-dir\") pod \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.662256 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-session\") pod \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.662313 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-trusted-ca-bundle\") pod \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\" (UID: \"ef66151f-c39c-4f4d-bbc4-8c86555e41ea\") " Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.663648 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "ef66151f-c39c-4f4d-bbc4-8c86555e41ea" (UID: "ef66151f-c39c-4f4d-bbc4-8c86555e41ea"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.664311 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "ef66151f-c39c-4f4d-bbc4-8c86555e41ea" (UID: "ef66151f-c39c-4f4d-bbc4-8c86555e41ea"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.664795 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "ef66151f-c39c-4f4d-bbc4-8c86555e41ea" (UID: "ef66151f-c39c-4f4d-bbc4-8c86555e41ea"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.675612 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "ef66151f-c39c-4f4d-bbc4-8c86555e41ea" (UID: "ef66151f-c39c-4f4d-bbc4-8c86555e41ea"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.690229 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-76f84477b-qthqg"] Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.691075 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "ef66151f-c39c-4f4d-bbc4-8c86555e41ea" (UID: "ef66151f-c39c-4f4d-bbc4-8c86555e41ea"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.692548 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "ef66151f-c39c-4f4d-bbc4-8c86555e41ea" (UID: "ef66151f-c39c-4f4d-bbc4-8c86555e41ea"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.693184 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "ef66151f-c39c-4f4d-bbc4-8c86555e41ea" (UID: "ef66151f-c39c-4f4d-bbc4-8c86555e41ea"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.693883 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-kube-api-access-fp7tg" (OuterVolumeSpecName: "kube-api-access-fp7tg") pod "ef66151f-c39c-4f4d-bbc4-8c86555e41ea" (UID: "ef66151f-c39c-4f4d-bbc4-8c86555e41ea"). InnerVolumeSpecName "kube-api-access-fp7tg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.695565 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "ef66151f-c39c-4f4d-bbc4-8c86555e41ea" (UID: "ef66151f-c39c-4f4d-bbc4-8c86555e41ea"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.695669 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "ef66151f-c39c-4f4d-bbc4-8c86555e41ea" (UID: "ef66151f-c39c-4f4d-bbc4-8c86555e41ea"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.696586 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "ef66151f-c39c-4f4d-bbc4-8c86555e41ea" (UID: "ef66151f-c39c-4f4d-bbc4-8c86555e41ea"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.705642 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "ef66151f-c39c-4f4d-bbc4-8c86555e41ea" (UID: "ef66151f-c39c-4f4d-bbc4-8c86555e41ea"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.706320 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "ef66151f-c39c-4f4d-bbc4-8c86555e41ea" (UID: "ef66151f-c39c-4f4d-bbc4-8c86555e41ea"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.707614 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "ef66151f-c39c-4f4d-bbc4-8c86555e41ea" (UID: "ef66151f-c39c-4f4d-bbc4-8c86555e41ea"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.764138 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.764210 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-audit-policies\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.764311 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.764346 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-user-template-login\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.764377 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.764514 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-user-template-error\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.764760 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-router-certs\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.764789 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.764824 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wcq2\" (UniqueName: 
\"kubernetes.io/projected/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-kube-api-access-2wcq2\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.764853 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-service-ca\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.764904 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.764935 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-session\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.764956 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: 
\"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.765076 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-audit-dir\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.765550 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.765581 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.765596 4878 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.765610 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp7tg\" (UniqueName: \"kubernetes.io/projected/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-kube-api-access-fp7tg\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.765623 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 02 
18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.765637 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.765649 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.765671 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.765682 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.765696 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.765709 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.765721 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.765734 4878 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.765747 4878 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef66151f-c39c-4f4d-bbc4-8c86555e41ea-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.867014 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-router-certs\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.867071 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.867094 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wcq2\" (UniqueName: \"kubernetes.io/projected/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-kube-api-access-2wcq2\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " 
pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.867114 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-service-ca\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.867132 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.867153 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-session\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.867171 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.867204 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-audit-dir\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.867281 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.867299 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-audit-policies\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.867320 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.867342 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-user-template-login\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc 
kubenswrapper[4878]: I1202 18:18:34.867365 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.867381 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-user-template-error\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.867704 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-audit-dir\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.868051 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-service-ca\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.868563 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-audit-policies\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: 
\"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.868585 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.868601 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.871490 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.871537 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.871607 4878 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.871642 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-user-template-login\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.871840 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-user-template-error\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.871920 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-session\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.871987 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " 
pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.874535 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-v4-0-config-system-router-certs\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.885574 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wcq2\" (UniqueName: \"kubernetes.io/projected/0f352f2a-21c0-437d-9e7a-dd2e6436f06d-kube-api-access-2wcq2\") pod \"oauth-openshift-76f84477b-qthqg\" (UID: \"0f352f2a-21c0-437d-9e7a-dd2e6436f06d\") " pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.892799 4878 generic.go:334] "Generic (PLEG): container finished" podID="ef66151f-c39c-4f4d-bbc4-8c86555e41ea" containerID="c819d0674e2029b7249d1720e0e41b23a4bf04ca21c5c5a8cc6def2258dfe2a0" exitCode=0 Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.892839 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" event={"ID":"ef66151f-c39c-4f4d-bbc4-8c86555e41ea","Type":"ContainerDied","Data":"c819d0674e2029b7249d1720e0e41b23a4bf04ca21c5c5a8cc6def2258dfe2a0"} Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.892867 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" event={"ID":"ef66151f-c39c-4f4d-bbc4-8c86555e41ea","Type":"ContainerDied","Data":"f69bc6b23a513da467436ad21857c408d3f92279f6e92f9d2a369f02d4c2445a"} Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.892885 4878 scope.go:117] "RemoveContainer" 
containerID="c819d0674e2029b7249d1720e0e41b23a4bf04ca21c5c5a8cc6def2258dfe2a0" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.892988 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wvrhn" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.922314 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wvrhn"] Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.923300 4878 scope.go:117] "RemoveContainer" containerID="c819d0674e2029b7249d1720e0e41b23a4bf04ca21c5c5a8cc6def2258dfe2a0" Dec 02 18:18:34 crc kubenswrapper[4878]: E1202 18:18:34.923894 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c819d0674e2029b7249d1720e0e41b23a4bf04ca21c5c5a8cc6def2258dfe2a0\": container with ID starting with c819d0674e2029b7249d1720e0e41b23a4bf04ca21c5c5a8cc6def2258dfe2a0 not found: ID does not exist" containerID="c819d0674e2029b7249d1720e0e41b23a4bf04ca21c5c5a8cc6def2258dfe2a0" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.923941 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c819d0674e2029b7249d1720e0e41b23a4bf04ca21c5c5a8cc6def2258dfe2a0"} err="failed to get container status \"c819d0674e2029b7249d1720e0e41b23a4bf04ca21c5c5a8cc6def2258dfe2a0\": rpc error: code = NotFound desc = could not find container \"c819d0674e2029b7249d1720e0e41b23a4bf04ca21c5c5a8cc6def2258dfe2a0\": container with ID starting with c819d0674e2029b7249d1720e0e41b23a4bf04ca21c5c5a8cc6def2258dfe2a0 not found: ID does not exist" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.927133 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wvrhn"] Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.932088 4878 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:34 crc kubenswrapper[4878]: I1202 18:18:34.945495 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef66151f-c39c-4f4d-bbc4-8c86555e41ea" path="/var/lib/kubelet/pods/ef66151f-c39c-4f4d-bbc4-8c86555e41ea/volumes" Dec 02 18:18:35 crc kubenswrapper[4878]: I1202 18:18:35.358889 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-76f84477b-qthqg"] Dec 02 18:18:35 crc kubenswrapper[4878]: I1202 18:18:35.836896 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v2c8h" Dec 02 18:18:35 crc kubenswrapper[4878]: I1202 18:18:35.894184 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v2c8h" Dec 02 18:18:35 crc kubenswrapper[4878]: I1202 18:18:35.903647 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" event={"ID":"0f352f2a-21c0-437d-9e7a-dd2e6436f06d","Type":"ContainerStarted","Data":"a11de6a8dd5fec1235a8a5bd464a2d1ca5d8102bbb6a47b0f4905d98b630d119"} Dec 02 18:18:36 crc kubenswrapper[4878]: I1202 18:18:36.160413 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n8qts" Dec 02 18:18:36 crc kubenswrapper[4878]: I1202 18:18:36.161214 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n8qts" Dec 02 18:18:36 crc kubenswrapper[4878]: I1202 18:18:36.226900 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n8qts" Dec 02 18:18:36 crc kubenswrapper[4878]: I1202 18:18:36.854877 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cdp4z"] Dec 02 18:18:36 crc 
kubenswrapper[4878]: I1202 18:18:36.855181 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cdp4z" podUID="bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd" containerName="registry-server" containerID="cri-o://870233d9a4b555255aa70cb744a4fb1f07236f1995fc0cf2197f17ee1133a1c6" gracePeriod=2 Dec 02 18:18:36 crc kubenswrapper[4878]: I1202 18:18:36.987492 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n8qts" Dec 02 18:18:37 crc kubenswrapper[4878]: I1202 18:18:37.921853 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" event={"ID":"0f352f2a-21c0-437d-9e7a-dd2e6436f06d","Type":"ContainerStarted","Data":"592357fdfd58ee40c0c9f0df4e10d07c7bd97c240446b9b8becdf903a25cf0d5"} Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.658208 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n8qts"] Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.714167 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cdp4z" Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.824105 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q77sh\" (UniqueName: \"kubernetes.io/projected/bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd-kube-api-access-q77sh\") pod \"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd\" (UID: \"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd\") " Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.824192 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd-catalog-content\") pod \"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd\" (UID: \"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd\") " Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.824378 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd-utilities\") pod \"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd\" (UID: \"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd\") " Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.826336 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd-utilities" (OuterVolumeSpecName: "utilities") pod "bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd" (UID: "bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.826659 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.835283 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd-kube-api-access-q77sh" (OuterVolumeSpecName: "kube-api-access-q77sh") pod "bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd" (UID: "bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd"). InnerVolumeSpecName "kube-api-access-q77sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.893336 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd" (UID: "bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.927511 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q77sh\" (UniqueName: \"kubernetes.io/projected/bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd-kube-api-access-q77sh\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.927543 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.928725 4878 generic.go:334] "Generic (PLEG): container finished" podID="bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd" containerID="870233d9a4b555255aa70cb744a4fb1f07236f1995fc0cf2197f17ee1133a1c6" exitCode=0 Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.929675 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdp4z" event={"ID":"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd","Type":"ContainerDied","Data":"870233d9a4b555255aa70cb744a4fb1f07236f1995fc0cf2197f17ee1133a1c6"} Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.929711 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cdp4z" event={"ID":"bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd","Type":"ContainerDied","Data":"2b79cd772e2a4756ac0cfc5cf91f82f4d3c8d4e0155d2538c51346b02752ab1a"} Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.929727 4878 scope.go:117] "RemoveContainer" containerID="870233d9a4b555255aa70cb744a4fb1f07236f1995fc0cf2197f17ee1133a1c6" Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.930422 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cdp4z" Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.930668 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.935622 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.946742 4878 scope.go:117] "RemoveContainer" containerID="7b81eafded7cfdfee0ba6c06461465f5e89793cef07b223845ed91e1b3187c9d" Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.954834 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-76f84477b-qthqg" podStartSLOduration=29.95481582 podStartE2EDuration="29.95481582s" podCreationTimestamp="2025-12-02 18:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:18:38.951739668 +0000 UTC m=+228.641358559" watchObservedRunningTime="2025-12-02 18:18:38.95481582 +0000 UTC m=+228.644434701" Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.965193 4878 scope.go:117] "RemoveContainer" containerID="ce82bc6c05be0819e6dc9a4e46a74988b85b735ddc8c7ceaf1c6cd0416b876cd" Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.996730 4878 scope.go:117] "RemoveContainer" containerID="870233d9a4b555255aa70cb744a4fb1f07236f1995fc0cf2197f17ee1133a1c6" Dec 02 18:18:38 crc kubenswrapper[4878]: E1202 18:18:38.997897 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"870233d9a4b555255aa70cb744a4fb1f07236f1995fc0cf2197f17ee1133a1c6\": container with ID starting with 870233d9a4b555255aa70cb744a4fb1f07236f1995fc0cf2197f17ee1133a1c6 not found: ID does not exist" 
containerID="870233d9a4b555255aa70cb744a4fb1f07236f1995fc0cf2197f17ee1133a1c6" Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.998024 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"870233d9a4b555255aa70cb744a4fb1f07236f1995fc0cf2197f17ee1133a1c6"} err="failed to get container status \"870233d9a4b555255aa70cb744a4fb1f07236f1995fc0cf2197f17ee1133a1c6\": rpc error: code = NotFound desc = could not find container \"870233d9a4b555255aa70cb744a4fb1f07236f1995fc0cf2197f17ee1133a1c6\": container with ID starting with 870233d9a4b555255aa70cb744a4fb1f07236f1995fc0cf2197f17ee1133a1c6 not found: ID does not exist" Dec 02 18:18:38 crc kubenswrapper[4878]: I1202 18:18:38.998126 4878 scope.go:117] "RemoveContainer" containerID="7b81eafded7cfdfee0ba6c06461465f5e89793cef07b223845ed91e1b3187c9d" Dec 02 18:18:39 crc kubenswrapper[4878]: E1202 18:18:39.000522 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b81eafded7cfdfee0ba6c06461465f5e89793cef07b223845ed91e1b3187c9d\": container with ID starting with 7b81eafded7cfdfee0ba6c06461465f5e89793cef07b223845ed91e1b3187c9d not found: ID does not exist" containerID="7b81eafded7cfdfee0ba6c06461465f5e89793cef07b223845ed91e1b3187c9d" Dec 02 18:18:39 crc kubenswrapper[4878]: I1202 18:18:39.000581 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b81eafded7cfdfee0ba6c06461465f5e89793cef07b223845ed91e1b3187c9d"} err="failed to get container status \"7b81eafded7cfdfee0ba6c06461465f5e89793cef07b223845ed91e1b3187c9d\": rpc error: code = NotFound desc = could not find container \"7b81eafded7cfdfee0ba6c06461465f5e89793cef07b223845ed91e1b3187c9d\": container with ID starting with 7b81eafded7cfdfee0ba6c06461465f5e89793cef07b223845ed91e1b3187c9d not found: ID does not exist" Dec 02 18:18:39 crc kubenswrapper[4878]: I1202 18:18:39.000623 4878 scope.go:117] 
"RemoveContainer" containerID="ce82bc6c05be0819e6dc9a4e46a74988b85b735ddc8c7ceaf1c6cd0416b876cd" Dec 02 18:18:39 crc kubenswrapper[4878]: E1202 18:18:39.000955 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce82bc6c05be0819e6dc9a4e46a74988b85b735ddc8c7ceaf1c6cd0416b876cd\": container with ID starting with ce82bc6c05be0819e6dc9a4e46a74988b85b735ddc8c7ceaf1c6cd0416b876cd not found: ID does not exist" containerID="ce82bc6c05be0819e6dc9a4e46a74988b85b735ddc8c7ceaf1c6cd0416b876cd" Dec 02 18:18:39 crc kubenswrapper[4878]: I1202 18:18:39.001030 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce82bc6c05be0819e6dc9a4e46a74988b85b735ddc8c7ceaf1c6cd0416b876cd"} err="failed to get container status \"ce82bc6c05be0819e6dc9a4e46a74988b85b735ddc8c7ceaf1c6cd0416b876cd\": rpc error: code = NotFound desc = could not find container \"ce82bc6c05be0819e6dc9a4e46a74988b85b735ddc8c7ceaf1c6cd0416b876cd\": container with ID starting with ce82bc6c05be0819e6dc9a4e46a74988b85b735ddc8c7ceaf1c6cd0416b876cd not found: ID does not exist" Dec 02 18:18:39 crc kubenswrapper[4878]: I1202 18:18:39.005514 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cdp4z"] Dec 02 18:18:39 crc kubenswrapper[4878]: I1202 18:18:39.016817 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cdp4z"] Dec 02 18:18:39 crc kubenswrapper[4878]: I1202 18:18:39.940687 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n8qts" podUID="df758941-afd5-4770-b93b-001f267dfcbf" containerName="registry-server" containerID="cri-o://1e05e505304b332c65f6070744b3304a10e03c7e86b2d8ceb9cc8db3670a801d" gracePeriod=2 Dec 02 18:18:40 crc kubenswrapper[4878]: I1202 18:18:40.351376 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n8qts" Dec 02 18:18:40 crc kubenswrapper[4878]: I1202 18:18:40.450354 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2fqc\" (UniqueName: \"kubernetes.io/projected/df758941-afd5-4770-b93b-001f267dfcbf-kube-api-access-t2fqc\") pod \"df758941-afd5-4770-b93b-001f267dfcbf\" (UID: \"df758941-afd5-4770-b93b-001f267dfcbf\") " Dec 02 18:18:40 crc kubenswrapper[4878]: I1202 18:18:40.450449 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df758941-afd5-4770-b93b-001f267dfcbf-utilities\") pod \"df758941-afd5-4770-b93b-001f267dfcbf\" (UID: \"df758941-afd5-4770-b93b-001f267dfcbf\") " Dec 02 18:18:40 crc kubenswrapper[4878]: I1202 18:18:40.450535 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df758941-afd5-4770-b93b-001f267dfcbf-catalog-content\") pod \"df758941-afd5-4770-b93b-001f267dfcbf\" (UID: \"df758941-afd5-4770-b93b-001f267dfcbf\") " Dec 02 18:18:40 crc kubenswrapper[4878]: I1202 18:18:40.451905 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df758941-afd5-4770-b93b-001f267dfcbf-utilities" (OuterVolumeSpecName: "utilities") pod "df758941-afd5-4770-b93b-001f267dfcbf" (UID: "df758941-afd5-4770-b93b-001f267dfcbf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:18:40 crc kubenswrapper[4878]: I1202 18:18:40.471094 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df758941-afd5-4770-b93b-001f267dfcbf-kube-api-access-t2fqc" (OuterVolumeSpecName: "kube-api-access-t2fqc") pod "df758941-afd5-4770-b93b-001f267dfcbf" (UID: "df758941-afd5-4770-b93b-001f267dfcbf"). InnerVolumeSpecName "kube-api-access-t2fqc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:18:40 crc kubenswrapper[4878]: I1202 18:18:40.552844 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2fqc\" (UniqueName: \"kubernetes.io/projected/df758941-afd5-4770-b93b-001f267dfcbf-kube-api-access-t2fqc\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:40 crc kubenswrapper[4878]: I1202 18:18:40.552886 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df758941-afd5-4770-b93b-001f267dfcbf-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:40 crc kubenswrapper[4878]: I1202 18:18:40.601336 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df758941-afd5-4770-b93b-001f267dfcbf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df758941-afd5-4770-b93b-001f267dfcbf" (UID: "df758941-afd5-4770-b93b-001f267dfcbf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:18:40 crc kubenswrapper[4878]: I1202 18:18:40.655023 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df758941-afd5-4770-b93b-001f267dfcbf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:40 crc kubenswrapper[4878]: I1202 18:18:40.951886 4878 generic.go:334] "Generic (PLEG): container finished" podID="df758941-afd5-4770-b93b-001f267dfcbf" containerID="1e05e505304b332c65f6070744b3304a10e03c7e86b2d8ceb9cc8db3670a801d" exitCode=0 Dec 02 18:18:40 crc kubenswrapper[4878]: I1202 18:18:40.952196 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n8qts" Dec 02 18:18:40 crc kubenswrapper[4878]: I1202 18:18:40.957521 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd" path="/var/lib/kubelet/pods/bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd/volumes" Dec 02 18:18:40 crc kubenswrapper[4878]: I1202 18:18:40.958847 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8qts" event={"ID":"df758941-afd5-4770-b93b-001f267dfcbf","Type":"ContainerDied","Data":"1e05e505304b332c65f6070744b3304a10e03c7e86b2d8ceb9cc8db3670a801d"} Dec 02 18:18:40 crc kubenswrapper[4878]: I1202 18:18:40.958903 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8qts" event={"ID":"df758941-afd5-4770-b93b-001f267dfcbf","Type":"ContainerDied","Data":"a4c02bdd65ccbbf52073074f7ae9d9a27e1750aacf108d8c065569008564009b"} Dec 02 18:18:40 crc kubenswrapper[4878]: I1202 18:18:40.958940 4878 scope.go:117] "RemoveContainer" containerID="1e05e505304b332c65f6070744b3304a10e03c7e86b2d8ceb9cc8db3670a801d" Dec 02 18:18:40 crc kubenswrapper[4878]: I1202 18:18:40.991558 4878 scope.go:117] "RemoveContainer" containerID="b16427d05159326f3996f4c941d38d633470237868aca71823180d9adf707777" Dec 02 18:18:41 crc kubenswrapper[4878]: I1202 18:18:41.014585 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n8qts"] Dec 02 18:18:41 crc kubenswrapper[4878]: I1202 18:18:41.018121 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n8qts"] Dec 02 18:18:41 crc kubenswrapper[4878]: I1202 18:18:41.019515 4878 scope.go:117] "RemoveContainer" containerID="bee773e0670e518819e6933926ac0b4e8750db8f65f29e208f0f69010055d7d7" Dec 02 18:18:41 crc kubenswrapper[4878]: I1202 18:18:41.051494 4878 scope.go:117] "RemoveContainer" 
containerID="1e05e505304b332c65f6070744b3304a10e03c7e86b2d8ceb9cc8db3670a801d" Dec 02 18:18:41 crc kubenswrapper[4878]: E1202 18:18:41.052140 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e05e505304b332c65f6070744b3304a10e03c7e86b2d8ceb9cc8db3670a801d\": container with ID starting with 1e05e505304b332c65f6070744b3304a10e03c7e86b2d8ceb9cc8db3670a801d not found: ID does not exist" containerID="1e05e505304b332c65f6070744b3304a10e03c7e86b2d8ceb9cc8db3670a801d" Dec 02 18:18:41 crc kubenswrapper[4878]: I1202 18:18:41.052205 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e05e505304b332c65f6070744b3304a10e03c7e86b2d8ceb9cc8db3670a801d"} err="failed to get container status \"1e05e505304b332c65f6070744b3304a10e03c7e86b2d8ceb9cc8db3670a801d\": rpc error: code = NotFound desc = could not find container \"1e05e505304b332c65f6070744b3304a10e03c7e86b2d8ceb9cc8db3670a801d\": container with ID starting with 1e05e505304b332c65f6070744b3304a10e03c7e86b2d8ceb9cc8db3670a801d not found: ID does not exist" Dec 02 18:18:41 crc kubenswrapper[4878]: I1202 18:18:41.052269 4878 scope.go:117] "RemoveContainer" containerID="b16427d05159326f3996f4c941d38d633470237868aca71823180d9adf707777" Dec 02 18:18:41 crc kubenswrapper[4878]: E1202 18:18:41.052850 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b16427d05159326f3996f4c941d38d633470237868aca71823180d9adf707777\": container with ID starting with b16427d05159326f3996f4c941d38d633470237868aca71823180d9adf707777 not found: ID does not exist" containerID="b16427d05159326f3996f4c941d38d633470237868aca71823180d9adf707777" Dec 02 18:18:41 crc kubenswrapper[4878]: I1202 18:18:41.052968 4878 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b16427d05159326f3996f4c941d38d633470237868aca71823180d9adf707777"} err="failed to get container status \"b16427d05159326f3996f4c941d38d633470237868aca71823180d9adf707777\": rpc error: code = NotFound desc = could not find container \"b16427d05159326f3996f4c941d38d633470237868aca71823180d9adf707777\": container with ID starting with b16427d05159326f3996f4c941d38d633470237868aca71823180d9adf707777 not found: ID does not exist" Dec 02 18:18:41 crc kubenswrapper[4878]: I1202 18:18:41.053102 4878 scope.go:117] "RemoveContainer" containerID="bee773e0670e518819e6933926ac0b4e8750db8f65f29e208f0f69010055d7d7" Dec 02 18:18:41 crc kubenswrapper[4878]: E1202 18:18:41.053796 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bee773e0670e518819e6933926ac0b4e8750db8f65f29e208f0f69010055d7d7\": container with ID starting with bee773e0670e518819e6933926ac0b4e8750db8f65f29e208f0f69010055d7d7 not found: ID does not exist" containerID="bee773e0670e518819e6933926ac0b4e8750db8f65f29e208f0f69010055d7d7" Dec 02 18:18:41 crc kubenswrapper[4878]: I1202 18:18:41.053919 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bee773e0670e518819e6933926ac0b4e8750db8f65f29e208f0f69010055d7d7"} err="failed to get container status \"bee773e0670e518819e6933926ac0b4e8750db8f65f29e208f0f69010055d7d7\": rpc error: code = NotFound desc = could not find container \"bee773e0670e518819e6933926ac0b4e8750db8f65f29e208f0f69010055d7d7\": container with ID starting with bee773e0670e518819e6933926ac0b4e8750db8f65f29e208f0f69010055d7d7 not found: ID does not exist" Dec 02 18:18:42 crc kubenswrapper[4878]: I1202 18:18:42.951485 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df758941-afd5-4770-b93b-001f267dfcbf" path="/var/lib/kubelet/pods/df758941-afd5-4770-b93b-001f267dfcbf/volumes" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 
18:18:45.677068 4878 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.678353 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864" gracePeriod=15 Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.678611 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1" gracePeriod=15 Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.678728 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab" gracePeriod=15 Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.678821 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc" gracePeriod=15 Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.678913 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003" 
gracePeriod=15 Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.679882 4878 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 18:18:45 crc kubenswrapper[4878]: E1202 18:18:45.680476 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.680688 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 02 18:18:45 crc kubenswrapper[4878]: E1202 18:18:45.680853 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.681006 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 18:18:45 crc kubenswrapper[4878]: E1202 18:18:45.681139 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.681362 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 02 18:18:45 crc kubenswrapper[4878]: E1202 18:18:45.681501 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df758941-afd5-4770-b93b-001f267dfcbf" containerName="extract-utilities" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.681617 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="df758941-afd5-4770-b93b-001f267dfcbf" containerName="extract-utilities" Dec 02 18:18:45 crc kubenswrapper[4878]: E1202 18:18:45.681736 4878 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd" containerName="registry-server" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.681912 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd" containerName="registry-server" Dec 02 18:18:45 crc kubenswrapper[4878]: E1202 18:18:45.682060 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd" containerName="extract-content" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.682196 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd" containerName="extract-content" Dec 02 18:18:45 crc kubenswrapper[4878]: E1202 18:18:45.682357 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.682477 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 18:18:45 crc kubenswrapper[4878]: E1202 18:18:45.682600 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd" containerName="extract-utilities" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.682717 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd" containerName="extract-utilities" Dec 02 18:18:45 crc kubenswrapper[4878]: E1202 18:18:45.683765 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df758941-afd5-4770-b93b-001f267dfcbf" containerName="extract-content" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.683801 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="df758941-afd5-4770-b93b-001f267dfcbf" containerName="extract-content" Dec 02 18:18:45 crc kubenswrapper[4878]: E1202 18:18:45.683813 4878 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.683822 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 02 18:18:45 crc kubenswrapper[4878]: E1202 18:18:45.683840 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df758941-afd5-4770-b93b-001f267dfcbf" containerName="registry-server" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.683850 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="df758941-afd5-4770-b93b-001f267dfcbf" containerName="registry-server" Dec 02 18:18:45 crc kubenswrapper[4878]: E1202 18:18:45.683861 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.683871 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 18:18:45 crc kubenswrapper[4878]: E1202 18:18:45.683882 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.683891 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.684155 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.684179 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.684194 4878 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.684206 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.684219 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc6a12eb-f6ee-4d7c-a9f6-ba2f64f371cd" containerName="registry-server" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.684228 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.684258 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="df758941-afd5-4770-b93b-001f267dfcbf" containerName="registry-server" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.684498 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.685794 4878 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.686373 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.697818 4878 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.746665 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.746799 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.746854 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.746915 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:18:45 crc 
kubenswrapper[4878]: I1202 18:18:45.747013 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.747063 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.747115 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.747174 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.847930 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.848091 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.848899 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.848805 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.849145 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.849401 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.849579 4878 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.849309 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.849449 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.849931 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.850036 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.850145 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.850249 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.850071 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.850500 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.850562 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:18:45 crc kubenswrapper[4878]: E1202 18:18:45.949606 4878 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC 
openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.159:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" volumeName="registry-storage" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.991903 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.993366 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.994089 4878 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1" exitCode=0 Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.994114 4878 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab" exitCode=0 Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.994121 4878 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc" exitCode=0 Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.994130 4878 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003" exitCode=2 Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.994203 4878 scope.go:117] "RemoveContainer" 
containerID="9e63c61202fcae047166f45ad78993607f6c001771216fb3b427a37aa75941ba" Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.996323 4878 generic.go:334] "Generic (PLEG): container finished" podID="cc4e7b9e-998b-4eae-aa91-c20228510717" containerID="2220f4d4a23db0f367782190a1fa537a938e0e0e97762631e7620631c3fe45b4" exitCode=0 Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.996365 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cc4e7b9e-998b-4eae-aa91-c20228510717","Type":"ContainerDied","Data":"2220f4d4a23db0f367782190a1fa537a938e0e0e97762631e7620631c3fe45b4"} Dec 02 18:18:45 crc kubenswrapper[4878]: I1202 18:18:45.997045 4878 status_manager.go:851] "Failed to get status for pod" podUID="cc4e7b9e-998b-4eae-aa91-c20228510717" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:46 crc kubenswrapper[4878]: E1202 18:18:46.725032 4878 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:46 crc kubenswrapper[4878]: E1202 18:18:46.725825 4878 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:46 crc kubenswrapper[4878]: E1202 18:18:46.726517 4878 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:46 crc kubenswrapper[4878]: E1202 18:18:46.726828 4878 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:46 crc kubenswrapper[4878]: E1202 18:18:46.727165 4878 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:46 crc kubenswrapper[4878]: I1202 18:18:46.727308 4878 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 02 18:18:46 crc kubenswrapper[4878]: E1202 18:18:46.727676 4878 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" interval="200ms" Dec 02 18:18:46 crc kubenswrapper[4878]: E1202 18:18:46.928817 4878 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" interval="400ms" Dec 02 18:18:47 crc kubenswrapper[4878]: I1202 18:18:47.009080 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 18:18:47 crc kubenswrapper[4878]: E1202 18:18:47.330264 4878 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" interval="800ms" Dec 02 18:18:47 crc kubenswrapper[4878]: I1202 
18:18:47.383576 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 18:18:47 crc kubenswrapper[4878]: I1202 18:18:47.385210 4878 status_manager.go:851] "Failed to get status for pod" podUID="cc4e7b9e-998b-4eae-aa91-c20228510717" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:47 crc kubenswrapper[4878]: I1202 18:18:47.490370 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc4e7b9e-998b-4eae-aa91-c20228510717-kube-api-access\") pod \"cc4e7b9e-998b-4eae-aa91-c20228510717\" (UID: \"cc4e7b9e-998b-4eae-aa91-c20228510717\") " Dec 02 18:18:47 crc kubenswrapper[4878]: I1202 18:18:47.490444 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc4e7b9e-998b-4eae-aa91-c20228510717-kubelet-dir\") pod \"cc4e7b9e-998b-4eae-aa91-c20228510717\" (UID: \"cc4e7b9e-998b-4eae-aa91-c20228510717\") " Dec 02 18:18:47 crc kubenswrapper[4878]: I1202 18:18:47.490478 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cc4e7b9e-998b-4eae-aa91-c20228510717-var-lock\") pod \"cc4e7b9e-998b-4eae-aa91-c20228510717\" (UID: \"cc4e7b9e-998b-4eae-aa91-c20228510717\") " Dec 02 18:18:47 crc kubenswrapper[4878]: I1202 18:18:47.490588 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc4e7b9e-998b-4eae-aa91-c20228510717-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cc4e7b9e-998b-4eae-aa91-c20228510717" (UID: "cc4e7b9e-998b-4eae-aa91-c20228510717"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:18:47 crc kubenswrapper[4878]: I1202 18:18:47.490721 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc4e7b9e-998b-4eae-aa91-c20228510717-var-lock" (OuterVolumeSpecName: "var-lock") pod "cc4e7b9e-998b-4eae-aa91-c20228510717" (UID: "cc4e7b9e-998b-4eae-aa91-c20228510717"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:18:47 crc kubenswrapper[4878]: I1202 18:18:47.490760 4878 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc4e7b9e-998b-4eae-aa91-c20228510717-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:47 crc kubenswrapper[4878]: I1202 18:18:47.500984 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc4e7b9e-998b-4eae-aa91-c20228510717-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cc4e7b9e-998b-4eae-aa91-c20228510717" (UID: "cc4e7b9e-998b-4eae-aa91-c20228510717"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:18:47 crc kubenswrapper[4878]: I1202 18:18:47.592484 4878 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cc4e7b9e-998b-4eae-aa91-c20228510717-var-lock\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:47 crc kubenswrapper[4878]: I1202 18:18:47.592534 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc4e7b9e-998b-4eae-aa91-c20228510717-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:48 crc kubenswrapper[4878]: I1202 18:18:48.017857 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cc4e7b9e-998b-4eae-aa91-c20228510717","Type":"ContainerDied","Data":"dc77a7db2dfdf339994ba86b35e26c92eef90ef49ebf83460a016d72cdf7a1b0"} Dec 02 18:18:48 crc kubenswrapper[4878]: I1202 18:18:48.018231 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc77a7db2dfdf339994ba86b35e26c92eef90ef49ebf83460a016d72cdf7a1b0" Dec 02 18:18:48 crc kubenswrapper[4878]: I1202 18:18:48.018316 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 18:18:48 crc kubenswrapper[4878]: E1202 18:18:48.131271 4878 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" interval="1.6s" Dec 02 18:18:48 crc kubenswrapper[4878]: I1202 18:18:48.164521 4878 status_manager.go:851] "Failed to get status for pod" podUID="cc4e7b9e-998b-4eae-aa91-c20228510717" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:48 crc kubenswrapper[4878]: I1202 18:18:48.170761 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 18:18:48 crc kubenswrapper[4878]: I1202 18:18:48.171809 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:18:48 crc kubenswrapper[4878]: I1202 18:18:48.172697 4878 status_manager.go:851] "Failed to get status for pod" podUID="cc4e7b9e-998b-4eae-aa91-c20228510717" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:48 crc kubenswrapper[4878]: I1202 18:18:48.173327 4878 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:48 crc kubenswrapper[4878]: I1202 18:18:48.202545 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 18:18:48 crc kubenswrapper[4878]: I1202 18:18:48.202674 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 18:18:48 crc kubenswrapper[4878]: I1202 18:18:48.202712 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:18:48 crc kubenswrapper[4878]: I1202 18:18:48.202722 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 18:18:48 crc kubenswrapper[4878]: I1202 18:18:48.202760 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:18:48 crc kubenswrapper[4878]: I1202 18:18:48.202838 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:18:48 crc kubenswrapper[4878]: I1202 18:18:48.203157 4878 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:48 crc kubenswrapper[4878]: I1202 18:18:48.203187 4878 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:48 crc kubenswrapper[4878]: I1202 18:18:48.203201 4878 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 02 18:18:48 crc kubenswrapper[4878]: I1202 18:18:48.945736 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.028351 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.029097 4878 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864" exitCode=0 Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.029174 4878 scope.go:117] "RemoveContainer" containerID="42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.029407 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.031206 4878 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.031745 4878 status_manager.go:851] "Failed to get status for pod" podUID="cc4e7b9e-998b-4eae-aa91-c20228510717" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.033882 4878 status_manager.go:851] "Failed to get status for pod" podUID="cc4e7b9e-998b-4eae-aa91-c20228510717" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.034585 4878 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.061735 4878 scope.go:117] "RemoveContainer" containerID="410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.082025 4878 scope.go:117] "RemoveContainer" containerID="e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc" Dec 02 18:18:49 crc 
kubenswrapper[4878]: I1202 18:18:49.101557 4878 scope.go:117] "RemoveContainer" containerID="5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.126592 4878 scope.go:117] "RemoveContainer" containerID="ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.144713 4878 scope.go:117] "RemoveContainer" containerID="150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.178964 4878 scope.go:117] "RemoveContainer" containerID="42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1" Dec 02 18:18:49 crc kubenswrapper[4878]: E1202 18:18:49.180342 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\": container with ID starting with 42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1 not found: ID does not exist" containerID="42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.180385 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1"} err="failed to get container status \"42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\": rpc error: code = NotFound desc = could not find container \"42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1\": container with ID starting with 42e8a7706a5a4377adb359d77464cef1b4e44819f2a2bd76a158fd9bf51d3ec1 not found: ID does not exist" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.180431 4878 scope.go:117] "RemoveContainer" containerID="410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab" Dec 02 18:18:49 crc kubenswrapper[4878]: E1202 18:18:49.182149 
4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\": container with ID starting with 410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab not found: ID does not exist" containerID="410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.182592 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab"} err="failed to get container status \"410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\": rpc error: code = NotFound desc = could not find container \"410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab\": container with ID starting with 410b07b7c10411c23d88716f644b01436142f20f14ad7c46340653d2c1935cab not found: ID does not exist" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.183099 4878 scope.go:117] "RemoveContainer" containerID="e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc" Dec 02 18:18:49 crc kubenswrapper[4878]: E1202 18:18:49.184457 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\": container with ID starting with e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc not found: ID does not exist" containerID="e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.185153 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc"} err="failed to get container status \"e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\": rpc error: code = 
NotFound desc = could not find container \"e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc\": container with ID starting with e9a3f89192a088a5bae2fac03f43ec3cccaa622fca3667ea1e757b711689a0dc not found: ID does not exist" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.186023 4878 scope.go:117] "RemoveContainer" containerID="5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003" Dec 02 18:18:49 crc kubenswrapper[4878]: E1202 18:18:49.186700 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\": container with ID starting with 5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003 not found: ID does not exist" containerID="5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.186753 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003"} err="failed to get container status \"5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\": rpc error: code = NotFound desc = could not find container \"5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003\": container with ID starting with 5ede408e020eb9fcf52d5e708e8cab6e2ef6a131910a01b8feb3bcf47c3dd003 not found: ID does not exist" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.186785 4878 scope.go:117] "RemoveContainer" containerID="ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864" Dec 02 18:18:49 crc kubenswrapper[4878]: E1202 18:18:49.187116 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\": container with ID starting with 
ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864 not found: ID does not exist" containerID="ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.187179 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864"} err="failed to get container status \"ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\": rpc error: code = NotFound desc = could not find container \"ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864\": container with ID starting with ece3ecf95aac884d6eee01491252a20db09d9e8e431cb8428cb3ad91510cb864 not found: ID does not exist" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.187206 4878 scope.go:117] "RemoveContainer" containerID="150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5" Dec 02 18:18:49 crc kubenswrapper[4878]: E1202 18:18:49.187729 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\": container with ID starting with 150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5 not found: ID does not exist" containerID="150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5" Dec 02 18:18:49 crc kubenswrapper[4878]: I1202 18:18:49.187751 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5"} err="failed to get container status \"150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\": rpc error: code = NotFound desc = could not find container \"150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5\": container with ID starting with 150fa2f80b234b974cb5eec22e6a59ec30733776e12ad16c6befba54e130dcc5 not found: ID does not 
exist" Dec 02 18:18:49 crc kubenswrapper[4878]: E1202 18:18:49.733114 4878 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" interval="3.2s" Dec 02 18:18:50 crc kubenswrapper[4878]: E1202 18:18:50.736357 4878 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.159:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:18:50 crc kubenswrapper[4878]: I1202 18:18:50.736919 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:18:50 crc kubenswrapper[4878]: E1202 18:18:50.773118 4878 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.159:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d78df9afe1187 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 18:18:50.772484487 +0000 UTC m=+240.462103378,LastTimestamp:2025-12-02 18:18:50.772484487 +0000 UTC m=+240.462103378,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 02 18:18:50 crc kubenswrapper[4878]: I1202 18:18:50.940969 4878 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:50 crc kubenswrapper[4878]: I1202 18:18:50.941310 4878 status_manager.go:851] "Failed to get status for pod" podUID="cc4e7b9e-998b-4eae-aa91-c20228510717" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:51 crc kubenswrapper[4878]: I1202 18:18:51.046722 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"9aef6ac073ae85a01b8df88b56045fa3f3e922cb9992d6b704643e62bd733bc8"} Dec 02 18:18:51 crc kubenswrapper[4878]: I1202 18:18:51.047053 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f27ca7ea929ea5ddd13037fef86def7bcfb2abb9a60e3ba4f995dd1e3a898ddd"} Dec 02 18:18:51 crc kubenswrapper[4878]: E1202 18:18:51.047592 4878 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.159:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:18:51 crc kubenswrapper[4878]: I1202 18:18:51.047681 4878 status_manager.go:851] "Failed to get status for pod" 
podUID="cc4e7b9e-998b-4eae-aa91-c20228510717" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:52 crc kubenswrapper[4878]: E1202 18:18:52.934480 4878 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" interval="6.4s" Dec 02 18:18:55 crc kubenswrapper[4878]: E1202 18:18:55.216387 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:18:55Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:18:55Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:18:55Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T18:18:55Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:55 crc kubenswrapper[4878]: E1202 18:18:55.217884 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="error 
getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:55 crc kubenswrapper[4878]: E1202 18:18:55.218222 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:55 crc kubenswrapper[4878]: E1202 18:18:55.218779 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:55 crc kubenswrapper[4878]: E1202 18:18:55.220151 4878 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:55 crc kubenswrapper[4878]: E1202 18:18:55.220312 4878 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 18:18:58 crc kubenswrapper[4878]: I1202 18:18:58.937069 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:18:58 crc kubenswrapper[4878]: I1202 18:18:58.938639 4878 status_manager.go:851] "Failed to get status for pod" podUID="cc4e7b9e-998b-4eae-aa91-c20228510717" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:18:58 crc kubenswrapper[4878]: I1202 18:18:58.967486 4878 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="129f687d-443c-456f-bc63-99a4784f1464" Dec 02 18:18:58 crc kubenswrapper[4878]: I1202 18:18:58.967556 4878 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="129f687d-443c-456f-bc63-99a4784f1464" Dec 02 18:18:58 crc kubenswrapper[4878]: E1202 18:18:58.968471 4878 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:18:58 crc kubenswrapper[4878]: I1202 18:18:58.969423 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:18:59 crc kubenswrapper[4878]: W1202 18:18:59.004494 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-fc1375ab0a1de6d6a1e2df6f75d68e33a452ac9f8adbddcfc97b52cac07691b7 WatchSource:0}: Error finding container fc1375ab0a1de6d6a1e2df6f75d68e33a452ac9f8adbddcfc97b52cac07691b7: Status 404 returned error can't find the container with id fc1375ab0a1de6d6a1e2df6f75d68e33a452ac9f8adbddcfc97b52cac07691b7 Dec 02 18:18:59 crc kubenswrapper[4878]: I1202 18:18:59.107845 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fc1375ab0a1de6d6a1e2df6f75d68e33a452ac9f8adbddcfc97b52cac07691b7"} Dec 02 18:18:59 crc kubenswrapper[4878]: E1202 18:18:59.335816 4878 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" interval="7s" Dec 02 18:19:00 crc kubenswrapper[4878]: I1202 18:19:00.116149 4878 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="15b753f1633f90b39edfcc17fe961dbaa14918642e169279f29668208adc2290" exitCode=0 Dec 02 18:19:00 crc kubenswrapper[4878]: I1202 18:19:00.116302 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"15b753f1633f90b39edfcc17fe961dbaa14918642e169279f29668208adc2290"} Dec 02 18:19:00 crc kubenswrapper[4878]: I1202 18:19:00.116627 4878 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="129f687d-443c-456f-bc63-99a4784f1464" Dec 02 18:19:00 crc kubenswrapper[4878]: I1202 18:19:00.116657 4878 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="129f687d-443c-456f-bc63-99a4784f1464" Dec 02 18:19:00 crc kubenswrapper[4878]: I1202 18:19:00.117080 4878 status_manager.go:851] "Failed to get status for pod" podUID="cc4e7b9e-998b-4eae-aa91-c20228510717" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:19:00 crc kubenswrapper[4878]: E1202 18:19:00.117220 4878 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:19:00 crc kubenswrapper[4878]: I1202 18:19:00.120724 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 02 18:19:00 crc kubenswrapper[4878]: I1202 18:19:00.120785 4878 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418" exitCode=1 Dec 02 18:19:00 crc kubenswrapper[4878]: I1202 18:19:00.120825 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418"} Dec 02 18:19:00 crc kubenswrapper[4878]: I1202 18:19:00.121324 4878 scope.go:117] "RemoveContainer" 
containerID="17e84dd515143612c383091caec78ab72f63070606bf0b33c69f847e63f4f418" Dec 02 18:19:00 crc kubenswrapper[4878]: I1202 18:19:00.122593 4878 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:19:00 crc kubenswrapper[4878]: I1202 18:19:00.123216 4878 status_manager.go:851] "Failed to get status for pod" podUID="cc4e7b9e-998b-4eae-aa91-c20228510717" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 02 18:19:00 crc kubenswrapper[4878]: E1202 18:19:00.195387 4878 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.159:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d78df9afe1187 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 18:18:50.772484487 +0000 UTC m=+240.462103378,LastTimestamp:2025-12-02 18:18:50.772484487 +0000 UTC m=+240.462103378,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 02 18:19:01 crc kubenswrapper[4878]: I1202 18:19:01.131441 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d169c8b506985226f368d7fdfd1b5e051972af2948b0c631f7c79adf3ac3f82f"} Dec 02 18:19:01 crc kubenswrapper[4878]: I1202 18:19:01.131932 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d39c3d4c1980479691ea74f24fc344644ba05790aca5b097635c184ae6afa193"} Dec 02 18:19:01 crc kubenswrapper[4878]: I1202 18:19:01.131952 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"709adaa2482402ffefc8277b3070bce45293cc1cd3991d814bae5718edbe5c88"} Dec 02 18:19:01 crc kubenswrapper[4878]: I1202 18:19:01.134684 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 02 18:19:01 crc kubenswrapper[4878]: I1202 18:19:01.134746 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"13d6b7d18c1532e5f1adfc6ab63ff1ba9f361b24e392bf45686e33aac173ef86"} Dec 02 18:19:02 crc kubenswrapper[4878]: I1202 18:19:02.144200 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fa4553119d166e27806aad895e6754c8c6c6cf349f1a491d666d4ac50aa93abf"} Dec 02 18:19:02 crc kubenswrapper[4878]: 
I1202 18:19:02.144491 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bfd34e2ccc547b826e5318a8d2533fd5f7c11f8d5851dfd63ab9c40f50e96a06"} Dec 02 18:19:02 crc kubenswrapper[4878]: I1202 18:19:02.144507 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:19:02 crc kubenswrapper[4878]: I1202 18:19:02.144525 4878 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="129f687d-443c-456f-bc63-99a4784f1464" Dec 02 18:19:02 crc kubenswrapper[4878]: I1202 18:19:02.144544 4878 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="129f687d-443c-456f-bc63-99a4784f1464" Dec 02 18:19:03 crc kubenswrapper[4878]: I1202 18:19:03.710467 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 18:19:03 crc kubenswrapper[4878]: I1202 18:19:03.970655 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:19:03 crc kubenswrapper[4878]: I1202 18:19:03.970954 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:19:03 crc kubenswrapper[4878]: I1202 18:19:03.976845 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:19:04 crc kubenswrapper[4878]: I1202 18:19:04.477628 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 18:19:04 crc kubenswrapper[4878]: I1202 18:19:04.484351 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 18:19:07 crc kubenswrapper[4878]: I1202 18:19:07.161548 4878 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:19:07 crc kubenswrapper[4878]: I1202 18:19:07.174938 4878 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"129f687d-443c-456f-bc63-99a4784f1464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709adaa2482402ffefc8277b3070bce45293cc1cd3991d814bae5718edbe5c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:19:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://d169c8b506985226f368d7fdfd1b5e051972af2948b0c631f7c79adf3ac3f82f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:19:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d39c3d4c1980479691ea74f24fc344644ba05790aca5b097635c184ae6afa193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:19:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4553119d166e27806aad895e6754c8c6c6cf349f1a491d666d4ac50aa93abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:19:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd34e2ccc547b826e5318a8d2533fd5f7c11f8d5851dfd63ab9c40f50e96a06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T18:19:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": pods \"kube-apiserver-crc\" not found" Dec 02 18:19:08 crc kubenswrapper[4878]: I1202 18:19:08.177982 4878 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="129f687d-443c-456f-bc63-99a4784f1464" Dec 02 18:19:08 crc kubenswrapper[4878]: I1202 18:19:08.178027 4878 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="129f687d-443c-456f-bc63-99a4784f1464" Dec 02 18:19:08 crc kubenswrapper[4878]: I1202 18:19:08.186503 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:19:08 crc kubenswrapper[4878]: I1202 18:19:08.190914 4878 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" 
pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="db38c33a-f773-404c-8902-ac28352e5221" Dec 02 18:19:09 crc kubenswrapper[4878]: I1202 18:19:09.183503 4878 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="129f687d-443c-456f-bc63-99a4784f1464" Dec 02 18:19:09 crc kubenswrapper[4878]: I1202 18:19:09.183544 4878 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="129f687d-443c-456f-bc63-99a4784f1464" Dec 02 18:19:10 crc kubenswrapper[4878]: I1202 18:19:10.956148 4878 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="db38c33a-f773-404c-8902-ac28352e5221" Dec 02 18:19:13 crc kubenswrapper[4878]: I1202 18:19:13.715476 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 18:19:17 crc kubenswrapper[4878]: I1202 18:19:17.259840 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 18:19:17 crc kubenswrapper[4878]: I1202 18:19:17.264022 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 02 18:19:17 crc kubenswrapper[4878]: I1202 18:19:17.291485 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 02 18:19:18 crc kubenswrapper[4878]: I1202 18:19:18.230534 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 02 18:19:18 crc kubenswrapper[4878]: I1202 18:19:18.700598 4878 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"kube-root-ca.crt" Dec 02 18:19:18 crc kubenswrapper[4878]: I1202 18:19:18.874754 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 02 18:19:18 crc kubenswrapper[4878]: I1202 18:19:18.947670 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 02 18:19:19 crc kubenswrapper[4878]: I1202 18:19:19.035124 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 02 18:19:19 crc kubenswrapper[4878]: I1202 18:19:19.075937 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 02 18:19:19 crc kubenswrapper[4878]: I1202 18:19:19.130380 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 02 18:19:19 crc kubenswrapper[4878]: I1202 18:19:19.189099 4878 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 02 18:19:19 crc kubenswrapper[4878]: I1202 18:19:19.257976 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 02 18:19:19 crc kubenswrapper[4878]: I1202 18:19:19.476476 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 02 18:19:19 crc kubenswrapper[4878]: I1202 18:19:19.663675 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 02 18:19:19 crc kubenswrapper[4878]: I1202 18:19:19.829109 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 18:19:20 crc kubenswrapper[4878]: I1202 18:19:20.175557 4878 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"kube-root-ca.crt" Dec 02 18:19:20 crc kubenswrapper[4878]: I1202 18:19:20.187632 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 02 18:19:20 crc kubenswrapper[4878]: I1202 18:19:20.208527 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 02 18:19:20 crc kubenswrapper[4878]: I1202 18:19:20.240120 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 02 18:19:20 crc kubenswrapper[4878]: I1202 18:19:20.330802 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 02 18:19:20 crc kubenswrapper[4878]: I1202 18:19:20.388564 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 02 18:19:20 crc kubenswrapper[4878]: I1202 18:19:20.406057 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 02 18:19:20 crc kubenswrapper[4878]: I1202 18:19:20.521824 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 02 18:19:20 crc kubenswrapper[4878]: I1202 18:19:20.587151 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 02 18:19:20 crc kubenswrapper[4878]: I1202 18:19:20.655737 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 02 18:19:20 crc kubenswrapper[4878]: I1202 18:19:20.868380 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 02 18:19:20 crc kubenswrapper[4878]: I1202 18:19:20.973201 4878 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 02 18:19:21 crc kubenswrapper[4878]: I1202 18:19:21.089288 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 02 18:19:21 crc kubenswrapper[4878]: I1202 18:19:21.209643 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 02 18:19:21 crc kubenswrapper[4878]: I1202 18:19:21.214043 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 02 18:19:21 crc kubenswrapper[4878]: I1202 18:19:21.265599 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 02 18:19:21 crc kubenswrapper[4878]: I1202 18:19:21.276696 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 02 18:19:21 crc kubenswrapper[4878]: I1202 18:19:21.371609 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 02 18:19:21 crc kubenswrapper[4878]: I1202 18:19:21.387812 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 02 18:19:21 crc kubenswrapper[4878]: I1202 18:19:21.445307 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 02 18:19:21 crc kubenswrapper[4878]: I1202 18:19:21.679850 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 02 18:19:21 crc kubenswrapper[4878]: I1202 18:19:21.736316 4878 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 02 18:19:21 crc kubenswrapper[4878]: I1202 18:19:21.752929 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 02 18:19:21 crc kubenswrapper[4878]: I1202 18:19:21.753324 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 02 18:19:21 crc kubenswrapper[4878]: I1202 18:19:21.975455 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.000550 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.032352 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.042107 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.051548 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.074983 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.086088 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.086142 4878 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.164296 4878 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.210718 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.215609 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.219791 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.229459 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.267222 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.286719 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.457410 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.473288 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.496147 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.539771 4878 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.561456 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.665525 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.762227 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.786494 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.814592 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.880990 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.882049 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.894381 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 02 18:19:22 crc kubenswrapper[4878]: I1202 18:19:22.917626 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 02 18:19:23 crc kubenswrapper[4878]: I1202 18:19:23.001489 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 02 18:19:23 crc kubenswrapper[4878]: I1202 18:19:23.254386 4878 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 02 18:19:23 crc kubenswrapper[4878]: I1202 18:19:23.274275 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 02 18:19:23 crc kubenswrapper[4878]: I1202 18:19:23.382430 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 02 18:19:23 crc kubenswrapper[4878]: I1202 18:19:23.618982 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 02 18:19:23 crc kubenswrapper[4878]: I1202 18:19:23.708461 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 02 18:19:23 crc kubenswrapper[4878]: I1202 18:19:23.714470 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 02 18:19:23 crc kubenswrapper[4878]: I1202 18:19:23.832796 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 02 18:19:23 crc kubenswrapper[4878]: I1202 18:19:23.936159 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 02 18:19:23 crc kubenswrapper[4878]: I1202 18:19:23.992391 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.006742 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.045638 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" 
Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.046709 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.047156 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.052456 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.069196 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.124689 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.147523 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.174504 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.185638 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.203163 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.238394 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.275418 4878 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.287808 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.322259 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.425763 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.515373 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.584909 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.614121 4878 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.626497 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.639897 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.645187 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.677689 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 02 
18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.677762 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.734431 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.788295 4878 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.806367 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.818341 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.852151 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.898332 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.899032 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 02 18:19:24 crc kubenswrapper[4878]: I1202 18:19:24.956268 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.091194 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.127764 4878 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"kube-root-ca.crt" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.187056 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.335758 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.378367 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.518379 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.563571 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.576477 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.584772 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.588510 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.606589 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.621427 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.632668 4878 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.659357 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.679033 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.778597 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.783319 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.789398 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.910685 4878 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.915071 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.915118 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.924209 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.945552 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=18.945523989 podStartE2EDuration="18.945523989s" podCreationTimestamp="2025-12-02 18:19:07 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:19:25.94008926 +0000 UTC m=+275.629708181" watchObservedRunningTime="2025-12-02 18:19:25.945523989 +0000 UTC m=+275.635142940" Dec 02 18:19:25 crc kubenswrapper[4878]: I1202 18:19:25.971753 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 02 18:19:26 crc kubenswrapper[4878]: I1202 18:19:26.030227 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 02 18:19:26 crc kubenswrapper[4878]: I1202 18:19:26.124924 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 02 18:19:26 crc kubenswrapper[4878]: I1202 18:19:26.211853 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 02 18:19:26 crc kubenswrapper[4878]: I1202 18:19:26.227993 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 02 18:19:26 crc kubenswrapper[4878]: I1202 18:19:26.246480 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 02 18:19:26 crc kubenswrapper[4878]: I1202 18:19:26.336652 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 02 18:19:26 crc kubenswrapper[4878]: I1202 18:19:26.353632 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 02 18:19:26 crc kubenswrapper[4878]: I1202 18:19:26.602368 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 02 18:19:26 crc kubenswrapper[4878]: I1202 18:19:26.750670 4878 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 02 18:19:26 crc kubenswrapper[4878]: I1202 18:19:26.765940 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 02 18:19:26 crc kubenswrapper[4878]: I1202 18:19:26.953907 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 02 18:19:26 crc kubenswrapper[4878]: I1202 18:19:26.983190 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.048866 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.142683 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.155811 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.284744 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.291221 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.295531 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.297172 4878 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.352386 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.365648 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.403792 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.445644 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.485933 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.492418 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.567720 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.599472 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.641974 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.682621 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.733023 4878 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.840492 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.863265 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.871078 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.873710 4878 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.899663 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.902073 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 02 18:19:27 crc kubenswrapper[4878]: I1202 18:19:27.975646 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 02 18:19:28 crc kubenswrapper[4878]: I1202 18:19:28.044390 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 18:19:28 crc kubenswrapper[4878]: I1202 18:19:28.159094 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 02 18:19:28 crc kubenswrapper[4878]: I1202 18:19:28.214512 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 02 18:19:28 crc kubenswrapper[4878]: I1202 18:19:28.253130 4878 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 02 18:19:28 crc kubenswrapper[4878]: I1202 18:19:28.463791 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 02 18:19:28 crc kubenswrapper[4878]: I1202 18:19:28.542454 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 02 18:19:28 crc kubenswrapper[4878]: I1202 18:19:28.549914 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 02 18:19:28 crc kubenswrapper[4878]: I1202 18:19:28.552411 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 02 18:19:28 crc kubenswrapper[4878]: I1202 18:19:28.584823 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 02 18:19:28 crc kubenswrapper[4878]: I1202 18:19:28.680220 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 02 18:19:28 crc kubenswrapper[4878]: I1202 18:19:28.719867 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 02 18:19:28 crc kubenswrapper[4878]: I1202 18:19:28.799401 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 02 18:19:28 crc kubenswrapper[4878]: I1202 18:19:28.827568 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 02 18:19:28 crc kubenswrapper[4878]: I1202 18:19:28.864823 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 02 18:19:28 crc 
kubenswrapper[4878]: I1202 18:19:28.914075 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 02 18:19:28 crc kubenswrapper[4878]: I1202 18:19:28.924977 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 02 18:19:28 crc kubenswrapper[4878]: I1202 18:19:28.990701 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 18:19:29 crc kubenswrapper[4878]: I1202 18:19:29.011123 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 02 18:19:29 crc kubenswrapper[4878]: I1202 18:19:29.095870 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 02 18:19:29 crc kubenswrapper[4878]: I1202 18:19:29.259370 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 02 18:19:29 crc kubenswrapper[4878]: I1202 18:19:29.290990 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 02 18:19:29 crc kubenswrapper[4878]: I1202 18:19:29.315440 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 02 18:19:29 crc kubenswrapper[4878]: I1202 18:19:29.582627 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 02 18:19:29 crc kubenswrapper[4878]: I1202 18:19:29.591001 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 02 18:19:29 crc kubenswrapper[4878]: I1202 18:19:29.684852 4878 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 02 18:19:29 crc kubenswrapper[4878]: I1202 18:19:29.718900 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 02 18:19:29 crc kubenswrapper[4878]: I1202 18:19:29.764448 4878 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 18:19:29 crc kubenswrapper[4878]: I1202 18:19:29.764814 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://9aef6ac073ae85a01b8df88b56045fa3f3e922cb9992d6b704643e62bd733bc8" gracePeriod=5 Dec 02 18:19:29 crc kubenswrapper[4878]: I1202 18:19:29.796105 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 02 18:19:29 crc kubenswrapper[4878]: I1202 18:19:29.827915 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 02 18:19:29 crc kubenswrapper[4878]: I1202 18:19:29.875057 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 02 18:19:29 crc kubenswrapper[4878]: I1202 18:19:29.877335 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 02 18:19:29 crc kubenswrapper[4878]: I1202 18:19:29.905455 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 02 18:19:30 crc kubenswrapper[4878]: I1202 18:19:30.030790 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 02 18:19:30 
crc kubenswrapper[4878]: I1202 18:19:30.035120 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 02 18:19:30 crc kubenswrapper[4878]: I1202 18:19:30.119459 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 02 18:19:30 crc kubenswrapper[4878]: I1202 18:19:30.155186 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 02 18:19:30 crc kubenswrapper[4878]: I1202 18:19:30.209460 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 02 18:19:30 crc kubenswrapper[4878]: I1202 18:19:30.248725 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 02 18:19:30 crc kubenswrapper[4878]: I1202 18:19:30.255191 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 02 18:19:30 crc kubenswrapper[4878]: I1202 18:19:30.344708 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 02 18:19:30 crc kubenswrapper[4878]: I1202 18:19:30.484145 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 02 18:19:30 crc kubenswrapper[4878]: I1202 18:19:30.505834 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 02 18:19:30 crc kubenswrapper[4878]: I1202 18:19:30.545088 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 02 18:19:30 crc kubenswrapper[4878]: I1202 
18:19:30.642035 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 02 18:19:30 crc kubenswrapper[4878]: I1202 18:19:30.665533 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 02 18:19:30 crc kubenswrapper[4878]: I1202 18:19:30.758617 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 02 18:19:30 crc kubenswrapper[4878]: I1202 18:19:30.776949 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 18:19:30 crc kubenswrapper[4878]: I1202 18:19:30.838658 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 18:19:31 crc kubenswrapper[4878]: I1202 18:19:31.015993 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 02 18:19:31 crc kubenswrapper[4878]: I1202 18:19:31.065096 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 02 18:19:31 crc kubenswrapper[4878]: I1202 18:19:31.070319 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 02 18:19:31 crc kubenswrapper[4878]: I1202 18:19:31.211361 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 02 18:19:31 crc kubenswrapper[4878]: I1202 18:19:31.430617 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 02 18:19:31 crc kubenswrapper[4878]: I1202 18:19:31.447752 4878 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 02 18:19:31 crc kubenswrapper[4878]: I1202 18:19:31.480832 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 02 18:19:31 crc kubenswrapper[4878]: I1202 18:19:31.508120 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 02 18:19:31 crc kubenswrapper[4878]: I1202 18:19:31.509686 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 02 18:19:31 crc kubenswrapper[4878]: I1202 18:19:31.550032 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 02 18:19:31 crc kubenswrapper[4878]: I1202 18:19:31.558506 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 02 18:19:31 crc kubenswrapper[4878]: I1202 18:19:31.569034 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 02 18:19:31 crc kubenswrapper[4878]: I1202 18:19:31.593605 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 18:19:31 crc kubenswrapper[4878]: I1202 18:19:31.596412 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 02 18:19:31 crc kubenswrapper[4878]: I1202 18:19:31.726760 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 02 18:19:31 crc kubenswrapper[4878]: I1202 18:19:31.800530 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 02 18:19:31 crc kubenswrapper[4878]: I1202 18:19:31.847875 4878 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 02 18:19:31 crc kubenswrapper[4878]: I1202 18:19:31.853812 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 02 18:19:31 crc kubenswrapper[4878]: I1202 18:19:31.910120 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 02 18:19:32 crc kubenswrapper[4878]: I1202 18:19:32.233411 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 02 18:19:32 crc kubenswrapper[4878]: I1202 18:19:32.276885 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 02 18:19:32 crc kubenswrapper[4878]: I1202 18:19:32.452226 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 18:19:32 crc kubenswrapper[4878]: I1202 18:19:32.518725 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 02 18:19:32 crc kubenswrapper[4878]: I1202 18:19:32.518841 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 02 18:19:32 crc kubenswrapper[4878]: I1202 18:19:32.591084 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 02 18:19:32 crc kubenswrapper[4878]: I1202 18:19:32.661661 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 02 18:19:32 crc kubenswrapper[4878]: I1202 18:19:32.805551 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 02 18:19:32 crc kubenswrapper[4878]: 
I1202 18:19:32.918381 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 02 18:19:32 crc kubenswrapper[4878]: I1202 18:19:32.946125 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 02 18:19:32 crc kubenswrapper[4878]: I1202 18:19:32.967601 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 02 18:19:33 crc kubenswrapper[4878]: I1202 18:19:33.010212 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 02 18:19:33 crc kubenswrapper[4878]: I1202 18:19:33.055368 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 02 18:19:33 crc kubenswrapper[4878]: I1202 18:19:33.082323 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 02 18:19:33 crc kubenswrapper[4878]: I1202 18:19:33.135972 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 02 18:19:33 crc kubenswrapper[4878]: I1202 18:19:33.290467 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 02 18:19:33 crc kubenswrapper[4878]: I1202 18:19:33.697886 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 02 18:19:33 crc kubenswrapper[4878]: I1202 18:19:33.827455 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 02 18:19:34 crc kubenswrapper[4878]: I1202 18:19:34.206698 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 02 18:19:34 
crc kubenswrapper[4878]: I1202 18:19:34.674879 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 02 18:19:35 crc kubenswrapper[4878]: I1202 18:19:35.377401 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 02 18:19:35 crc kubenswrapper[4878]: I1202 18:19:35.377907 4878 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="9aef6ac073ae85a01b8df88b56045fa3f3e922cb9992d6b704643e62bd733bc8" exitCode=137 Dec 02 18:19:35 crc kubenswrapper[4878]: I1202 18:19:35.377965 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f27ca7ea929ea5ddd13037fef86def7bcfb2abb9a60e3ba4f995dd1e3a898ddd" Dec 02 18:19:35 crc kubenswrapper[4878]: I1202 18:19:35.394811 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 02 18:19:35 crc kubenswrapper[4878]: I1202 18:19:35.394921 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:19:35 crc kubenswrapper[4878]: I1202 18:19:35.592462 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 18:19:35 crc kubenswrapper[4878]: I1202 18:19:35.592534 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 18:19:35 crc kubenswrapper[4878]: I1202 18:19:35.592569 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 18:19:35 crc kubenswrapper[4878]: I1202 18:19:35.592645 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 18:19:35 crc kubenswrapper[4878]: I1202 18:19:35.592649 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:19:35 crc kubenswrapper[4878]: I1202 18:19:35.592774 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 18:19:35 crc kubenswrapper[4878]: I1202 18:19:35.592796 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:19:35 crc kubenswrapper[4878]: I1202 18:19:35.592819 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:19:35 crc kubenswrapper[4878]: I1202 18:19:35.592922 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:19:35 crc kubenswrapper[4878]: I1202 18:19:35.593314 4878 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 18:19:35 crc kubenswrapper[4878]: I1202 18:19:35.593366 4878 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 02 18:19:35 crc kubenswrapper[4878]: I1202 18:19:35.593391 4878 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 02 18:19:35 crc kubenswrapper[4878]: I1202 18:19:35.593414 4878 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 02 18:19:35 crc kubenswrapper[4878]: I1202 18:19:35.606015 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:19:35 crc kubenswrapper[4878]: I1202 18:19:35.694792 4878 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 18:19:36 crc kubenswrapper[4878]: I1202 18:19:36.385174 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 18:19:36 crc kubenswrapper[4878]: I1202 18:19:36.949021 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 02 18:19:56 crc kubenswrapper[4878]: I1202 18:19:56.945204 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g"] Dec 02 18:19:56 crc kubenswrapper[4878]: I1202 18:19:56.945951 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" podUID="bdf471eb-c41c-40c0-b038-5ba883d2154a" containerName="route-controller-manager" containerID="cri-o://94d633c3e08379b4b4e044cd306b879c4d738670f8387634034aacd956b38c1e" gracePeriod=30 Dec 02 18:19:56 crc kubenswrapper[4878]: I1202 18:19:56.947635 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fvq9v"] Dec 02 18:19:56 crc kubenswrapper[4878]: I1202 18:19:56.947911 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" podUID="435af086-d5fb-4f55-9c52-bfab176ee753" containerName="controller-manager" containerID="cri-o://8578fdd7425efacd7333511189c889c6a75f887c1ec1c3caab4d8d04c7c40abd" gracePeriod=30 Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.304726 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.324305 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdf471eb-c41c-40c0-b038-5ba883d2154a-config\") pod \"bdf471eb-c41c-40c0-b038-5ba883d2154a\" (UID: \"bdf471eb-c41c-40c0-b038-5ba883d2154a\") " Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.324424 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdf471eb-c41c-40c0-b038-5ba883d2154a-serving-cert\") pod \"bdf471eb-c41c-40c0-b038-5ba883d2154a\" (UID: \"bdf471eb-c41c-40c0-b038-5ba883d2154a\") " Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.324461 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg4fw\" (UniqueName: \"kubernetes.io/projected/bdf471eb-c41c-40c0-b038-5ba883d2154a-kube-api-access-sg4fw\") pod \"bdf471eb-c41c-40c0-b038-5ba883d2154a\" (UID: \"bdf471eb-c41c-40c0-b038-5ba883d2154a\") " Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.324490 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bdf471eb-c41c-40c0-b038-5ba883d2154a-client-ca\") pod \"bdf471eb-c41c-40c0-b038-5ba883d2154a\" (UID: \"bdf471eb-c41c-40c0-b038-5ba883d2154a\") " Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.325863 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdf471eb-c41c-40c0-b038-5ba883d2154a-client-ca" (OuterVolumeSpecName: "client-ca") pod "bdf471eb-c41c-40c0-b038-5ba883d2154a" (UID: "bdf471eb-c41c-40c0-b038-5ba883d2154a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.326020 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdf471eb-c41c-40c0-b038-5ba883d2154a-config" (OuterVolumeSpecName: "config") pod "bdf471eb-c41c-40c0-b038-5ba883d2154a" (UID: "bdf471eb-c41c-40c0-b038-5ba883d2154a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.334808 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdf471eb-c41c-40c0-b038-5ba883d2154a-kube-api-access-sg4fw" (OuterVolumeSpecName: "kube-api-access-sg4fw") pod "bdf471eb-c41c-40c0-b038-5ba883d2154a" (UID: "bdf471eb-c41c-40c0-b038-5ba883d2154a"). InnerVolumeSpecName "kube-api-access-sg4fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.341734 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf471eb-c41c-40c0-b038-5ba883d2154a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bdf471eb-c41c-40c0-b038-5ba883d2154a" (UID: "bdf471eb-c41c-40c0-b038-5ba883d2154a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.381385 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.425712 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/435af086-d5fb-4f55-9c52-bfab176ee753-client-ca\") pod \"435af086-d5fb-4f55-9c52-bfab176ee753\" (UID: \"435af086-d5fb-4f55-9c52-bfab176ee753\") " Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.425783 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4gg7\" (UniqueName: \"kubernetes.io/projected/435af086-d5fb-4f55-9c52-bfab176ee753-kube-api-access-j4gg7\") pod \"435af086-d5fb-4f55-9c52-bfab176ee753\" (UID: \"435af086-d5fb-4f55-9c52-bfab176ee753\") " Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.425816 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/435af086-d5fb-4f55-9c52-bfab176ee753-config\") pod \"435af086-d5fb-4f55-9c52-bfab176ee753\" (UID: \"435af086-d5fb-4f55-9c52-bfab176ee753\") " Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.425856 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/435af086-d5fb-4f55-9c52-bfab176ee753-serving-cert\") pod \"435af086-d5fb-4f55-9c52-bfab176ee753\" (UID: \"435af086-d5fb-4f55-9c52-bfab176ee753\") " Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.425888 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/435af086-d5fb-4f55-9c52-bfab176ee753-proxy-ca-bundles\") pod \"435af086-d5fb-4f55-9c52-bfab176ee753\" (UID: \"435af086-d5fb-4f55-9c52-bfab176ee753\") " Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.426978 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/435af086-d5fb-4f55-9c52-bfab176ee753-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "435af086-d5fb-4f55-9c52-bfab176ee753" (UID: "435af086-d5fb-4f55-9c52-bfab176ee753"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.427118 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/435af086-d5fb-4f55-9c52-bfab176ee753-config" (OuterVolumeSpecName: "config") pod "435af086-d5fb-4f55-9c52-bfab176ee753" (UID: "435af086-d5fb-4f55-9c52-bfab176ee753"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.427161 4878 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdf471eb-c41c-40c0-b038-5ba883d2154a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.427185 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg4fw\" (UniqueName: \"kubernetes.io/projected/bdf471eb-c41c-40c0-b038-5ba883d2154a-kube-api-access-sg4fw\") on node \"crc\" DevicePath \"\"" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.427201 4878 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bdf471eb-c41c-40c0-b038-5ba883d2154a-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.427218 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdf471eb-c41c-40c0-b038-5ba883d2154a-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.427985 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/435af086-d5fb-4f55-9c52-bfab176ee753-client-ca" 
(OuterVolumeSpecName: "client-ca") pod "435af086-d5fb-4f55-9c52-bfab176ee753" (UID: "435af086-d5fb-4f55-9c52-bfab176ee753"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.429820 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/435af086-d5fb-4f55-9c52-bfab176ee753-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "435af086-d5fb-4f55-9c52-bfab176ee753" (UID: "435af086-d5fb-4f55-9c52-bfab176ee753"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.430362 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/435af086-d5fb-4f55-9c52-bfab176ee753-kube-api-access-j4gg7" (OuterVolumeSpecName: "kube-api-access-j4gg7") pod "435af086-d5fb-4f55-9c52-bfab176ee753" (UID: "435af086-d5fb-4f55-9c52-bfab176ee753"). InnerVolumeSpecName "kube-api-access-j4gg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.528112 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/435af086-d5fb-4f55-9c52-bfab176ee753-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.528153 4878 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/435af086-d5fb-4f55-9c52-bfab176ee753-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.528164 4878 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/435af086-d5fb-4f55-9c52-bfab176ee753-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.528177 4878 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/435af086-d5fb-4f55-9c52-bfab176ee753-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.528187 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4gg7\" (UniqueName: \"kubernetes.io/projected/435af086-d5fb-4f55-9c52-bfab176ee753-kube-api-access-j4gg7\") on node \"crc\" DevicePath \"\"" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.528321 4878 generic.go:334] "Generic (PLEG): container finished" podID="435af086-d5fb-4f55-9c52-bfab176ee753" containerID="8578fdd7425efacd7333511189c889c6a75f887c1ec1c3caab4d8d04c7c40abd" exitCode=0 Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.528389 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.528391 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" event={"ID":"435af086-d5fb-4f55-9c52-bfab176ee753","Type":"ContainerDied","Data":"8578fdd7425efacd7333511189c889c6a75f887c1ec1c3caab4d8d04c7c40abd"} Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.528796 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fvq9v" event={"ID":"435af086-d5fb-4f55-9c52-bfab176ee753","Type":"ContainerDied","Data":"0f90b3743bbf5489db2b37a30cae5a59611feb48fe7412fba22fca819cd9957b"} Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.528823 4878 scope.go:117] "RemoveContainer" containerID="8578fdd7425efacd7333511189c889c6a75f887c1ec1c3caab4d8d04c7c40abd" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.532680 4878 generic.go:334] "Generic (PLEG): container finished" podID="bdf471eb-c41c-40c0-b038-5ba883d2154a" containerID="94d633c3e08379b4b4e044cd306b879c4d738670f8387634034aacd956b38c1e" exitCode=0 Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.532733 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.532733 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" event={"ID":"bdf471eb-c41c-40c0-b038-5ba883d2154a","Type":"ContainerDied","Data":"94d633c3e08379b4b4e044cd306b879c4d738670f8387634034aacd956b38c1e"} Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.532861 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g" event={"ID":"bdf471eb-c41c-40c0-b038-5ba883d2154a","Type":"ContainerDied","Data":"9847c6b9b3b4dd4ebf1efacb725ef7a7796166f8f32ca9e155986dab0e4287e7"} Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.555372 4878 scope.go:117] "RemoveContainer" containerID="8578fdd7425efacd7333511189c889c6a75f887c1ec1c3caab4d8d04c7c40abd" Dec 02 18:19:57 crc kubenswrapper[4878]: E1202 18:19:57.559513 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8578fdd7425efacd7333511189c889c6a75f887c1ec1c3caab4d8d04c7c40abd\": container with ID starting with 8578fdd7425efacd7333511189c889c6a75f887c1ec1c3caab4d8d04c7c40abd not found: ID does not exist" containerID="8578fdd7425efacd7333511189c889c6a75f887c1ec1c3caab4d8d04c7c40abd" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.559575 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8578fdd7425efacd7333511189c889c6a75f887c1ec1c3caab4d8d04c7c40abd"} err="failed to get container status \"8578fdd7425efacd7333511189c889c6a75f887c1ec1c3caab4d8d04c7c40abd\": rpc error: code = NotFound desc = could not find container \"8578fdd7425efacd7333511189c889c6a75f887c1ec1c3caab4d8d04c7c40abd\": container with ID starting with 
8578fdd7425efacd7333511189c889c6a75f887c1ec1c3caab4d8d04c7c40abd not found: ID does not exist" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.559618 4878 scope.go:117] "RemoveContainer" containerID="94d633c3e08379b4b4e044cd306b879c4d738670f8387634034aacd956b38c1e" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.567555 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fvq9v"] Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.587284 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fvq9v"] Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.593610 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g"] Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.599135 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l8z4g"] Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.599762 4878 scope.go:117] "RemoveContainer" containerID="94d633c3e08379b4b4e044cd306b879c4d738670f8387634034aacd956b38c1e" Dec 02 18:19:57 crc kubenswrapper[4878]: E1202 18:19:57.602439 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94d633c3e08379b4b4e044cd306b879c4d738670f8387634034aacd956b38c1e\": container with ID starting with 94d633c3e08379b4b4e044cd306b879c4d738670f8387634034aacd956b38c1e not found: ID does not exist" containerID="94d633c3e08379b4b4e044cd306b879c4d738670f8387634034aacd956b38c1e" Dec 02 18:19:57 crc kubenswrapper[4878]: I1202 18:19:57.602580 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94d633c3e08379b4b4e044cd306b879c4d738670f8387634034aacd956b38c1e"} err="failed to get container status 
\"94d633c3e08379b4b4e044cd306b879c4d738670f8387634034aacd956b38c1e\": rpc error: code = NotFound desc = could not find container \"94d633c3e08379b4b4e044cd306b879c4d738670f8387634034aacd956b38c1e\": container with ID starting with 94d633c3e08379b4b4e044cd306b879c4d738670f8387634034aacd956b38c1e not found: ID does not exist" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.768452 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66494666dc-k5nzt"] Dec 02 18:19:58 crc kubenswrapper[4878]: E1202 18:19:58.769472 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.769499 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 02 18:19:58 crc kubenswrapper[4878]: E1202 18:19:58.769522 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc4e7b9e-998b-4eae-aa91-c20228510717" containerName="installer" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.769536 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc4e7b9e-998b-4eae-aa91-c20228510717" containerName="installer" Dec 02 18:19:58 crc kubenswrapper[4878]: E1202 18:19:58.769580 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdf471eb-c41c-40c0-b038-5ba883d2154a" containerName="route-controller-manager" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.769594 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdf471eb-c41c-40c0-b038-5ba883d2154a" containerName="route-controller-manager" Dec 02 18:19:58 crc kubenswrapper[4878]: E1202 18:19:58.769613 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="435af086-d5fb-4f55-9c52-bfab176ee753" containerName="controller-manager" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.769625 4878 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="435af086-d5fb-4f55-9c52-bfab176ee753" containerName="controller-manager" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.769805 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.769828 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc4e7b9e-998b-4eae-aa91-c20228510717" containerName="installer" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.769857 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdf471eb-c41c-40c0-b038-5ba883d2154a" containerName="route-controller-manager" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.769874 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="435af086-d5fb-4f55-9c52-bfab176ee753" containerName="controller-manager" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.770695 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.773093 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.773655 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.773927 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.774203 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.774453 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.776094 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj"] Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.777507 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.781420 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.781921 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.781958 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.782124 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.782277 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.782814 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.785317 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.785585 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.798662 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66494666dc-k5nzt"] Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.820219 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj"] Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.947212 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dc55246a-ef67-4c0c-a1bd-e3b753ca014b-client-ca\") pod \"route-controller-manager-65ff6c5d6f-lrsfj\" (UID: \"dc55246a-ef67-4c0c-a1bd-e3b753ca014b\") " pod="openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.947303 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc55246a-ef67-4c0c-a1bd-e3b753ca014b-config\") pod \"route-controller-manager-65ff6c5d6f-lrsfj\" (UID: \"dc55246a-ef67-4c0c-a1bd-e3b753ca014b\") " pod="openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.947360 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8ct9\" (UniqueName: \"kubernetes.io/projected/dc55246a-ef67-4c0c-a1bd-e3b753ca014b-kube-api-access-r8ct9\") pod \"route-controller-manager-65ff6c5d6f-lrsfj\" (UID: \"dc55246a-ef67-4c0c-a1bd-e3b753ca014b\") " pod="openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.947402 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/093bd41a-5b89-48e8-a3a2-36835fe785c7-serving-cert\") pod \"controller-manager-66494666dc-k5nzt\" (UID: \"093bd41a-5b89-48e8-a3a2-36835fe785c7\") " pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.947436 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc55246a-ef67-4c0c-a1bd-e3b753ca014b-serving-cert\") pod \"route-controller-manager-65ff6c5d6f-lrsfj\" (UID: \"dc55246a-ef67-4c0c-a1bd-e3b753ca014b\") " pod="openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.947493 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/093bd41a-5b89-48e8-a3a2-36835fe785c7-proxy-ca-bundles\") pod \"controller-manager-66494666dc-k5nzt\" (UID: \"093bd41a-5b89-48e8-a3a2-36835fe785c7\") " pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.947525 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59896\" (UniqueName: \"kubernetes.io/projected/093bd41a-5b89-48e8-a3a2-36835fe785c7-kube-api-access-59896\") pod \"controller-manager-66494666dc-k5nzt\" (UID: \"093bd41a-5b89-48e8-a3a2-36835fe785c7\") " pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.947712 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/093bd41a-5b89-48e8-a3a2-36835fe785c7-config\") pod \"controller-manager-66494666dc-k5nzt\" (UID: \"093bd41a-5b89-48e8-a3a2-36835fe785c7\") " pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.947760 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/093bd41a-5b89-48e8-a3a2-36835fe785c7-client-ca\") pod \"controller-manager-66494666dc-k5nzt\" (UID: 
\"093bd41a-5b89-48e8-a3a2-36835fe785c7\") " pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.948801 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="435af086-d5fb-4f55-9c52-bfab176ee753" path="/var/lib/kubelet/pods/435af086-d5fb-4f55-9c52-bfab176ee753/volumes" Dec 02 18:19:58 crc kubenswrapper[4878]: I1202 18:19:58.949755 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdf471eb-c41c-40c0-b038-5ba883d2154a" path="/var/lib/kubelet/pods/bdf471eb-c41c-40c0-b038-5ba883d2154a/volumes" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.049456 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8ct9\" (UniqueName: \"kubernetes.io/projected/dc55246a-ef67-4c0c-a1bd-e3b753ca014b-kube-api-access-r8ct9\") pod \"route-controller-manager-65ff6c5d6f-lrsfj\" (UID: \"dc55246a-ef67-4c0c-a1bd-e3b753ca014b\") " pod="openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.049607 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/093bd41a-5b89-48e8-a3a2-36835fe785c7-serving-cert\") pod \"controller-manager-66494666dc-k5nzt\" (UID: \"093bd41a-5b89-48e8-a3a2-36835fe785c7\") " pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.049655 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc55246a-ef67-4c0c-a1bd-e3b753ca014b-serving-cert\") pod \"route-controller-manager-65ff6c5d6f-lrsfj\" (UID: \"dc55246a-ef67-4c0c-a1bd-e3b753ca014b\") " pod="openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.049694 4878 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/093bd41a-5b89-48e8-a3a2-36835fe785c7-proxy-ca-bundles\") pod \"controller-manager-66494666dc-k5nzt\" (UID: \"093bd41a-5b89-48e8-a3a2-36835fe785c7\") " pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.049729 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59896\" (UniqueName: \"kubernetes.io/projected/093bd41a-5b89-48e8-a3a2-36835fe785c7-kube-api-access-59896\") pod \"controller-manager-66494666dc-k5nzt\" (UID: \"093bd41a-5b89-48e8-a3a2-36835fe785c7\") " pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.049767 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/093bd41a-5b89-48e8-a3a2-36835fe785c7-config\") pod \"controller-manager-66494666dc-k5nzt\" (UID: \"093bd41a-5b89-48e8-a3a2-36835fe785c7\") " pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.049811 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/093bd41a-5b89-48e8-a3a2-36835fe785c7-client-ca\") pod \"controller-manager-66494666dc-k5nzt\" (UID: \"093bd41a-5b89-48e8-a3a2-36835fe785c7\") " pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.049927 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dc55246a-ef67-4c0c-a1bd-e3b753ca014b-client-ca\") pod \"route-controller-manager-65ff6c5d6f-lrsfj\" (UID: \"dc55246a-ef67-4c0c-a1bd-e3b753ca014b\") " 
pod="openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.049971 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc55246a-ef67-4c0c-a1bd-e3b753ca014b-config\") pod \"route-controller-manager-65ff6c5d6f-lrsfj\" (UID: \"dc55246a-ef67-4c0c-a1bd-e3b753ca014b\") " pod="openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.051600 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/093bd41a-5b89-48e8-a3a2-36835fe785c7-client-ca\") pod \"controller-manager-66494666dc-k5nzt\" (UID: \"093bd41a-5b89-48e8-a3a2-36835fe785c7\") " pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.051816 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/093bd41a-5b89-48e8-a3a2-36835fe785c7-proxy-ca-bundles\") pod \"controller-manager-66494666dc-k5nzt\" (UID: \"093bd41a-5b89-48e8-a3a2-36835fe785c7\") " pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.052358 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dc55246a-ef67-4c0c-a1bd-e3b753ca014b-client-ca\") pod \"route-controller-manager-65ff6c5d6f-lrsfj\" (UID: \"dc55246a-ef67-4c0c-a1bd-e3b753ca014b\") " pod="openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.052511 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc55246a-ef67-4c0c-a1bd-e3b753ca014b-config\") pod 
\"route-controller-manager-65ff6c5d6f-lrsfj\" (UID: \"dc55246a-ef67-4c0c-a1bd-e3b753ca014b\") " pod="openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.052658 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/093bd41a-5b89-48e8-a3a2-36835fe785c7-config\") pod \"controller-manager-66494666dc-k5nzt\" (UID: \"093bd41a-5b89-48e8-a3a2-36835fe785c7\") " pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.060060 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/093bd41a-5b89-48e8-a3a2-36835fe785c7-serving-cert\") pod \"controller-manager-66494666dc-k5nzt\" (UID: \"093bd41a-5b89-48e8-a3a2-36835fe785c7\") " pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.065812 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc55246a-ef67-4c0c-a1bd-e3b753ca014b-serving-cert\") pod \"route-controller-manager-65ff6c5d6f-lrsfj\" (UID: \"dc55246a-ef67-4c0c-a1bd-e3b753ca014b\") " pod="openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.077656 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8ct9\" (UniqueName: \"kubernetes.io/projected/dc55246a-ef67-4c0c-a1bd-e3b753ca014b-kube-api-access-r8ct9\") pod \"route-controller-manager-65ff6c5d6f-lrsfj\" (UID: \"dc55246a-ef67-4c0c-a1bd-e3b753ca014b\") " pod="openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.085825 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-59896\" (UniqueName: \"kubernetes.io/projected/093bd41a-5b89-48e8-a3a2-36835fe785c7-kube-api-access-59896\") pod \"controller-manager-66494666dc-k5nzt\" (UID: \"093bd41a-5b89-48e8-a3a2-36835fe785c7\") " pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.102083 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.116724 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.356078 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66494666dc-k5nzt"] Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.418155 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj"] Dec 02 18:19:59 crc kubenswrapper[4878]: W1202 18:19:59.440434 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc55246a_ef67_4c0c_a1bd_e3b753ca014b.slice/crio-98c1e5e2efc32661978eab409407c5ff6a38c0bbd8dae6ae0c69669e4f798a68 WatchSource:0}: Error finding container 98c1e5e2efc32661978eab409407c5ff6a38c0bbd8dae6ae0c69669e4f798a68: Status 404 returned error can't find the container with id 98c1e5e2efc32661978eab409407c5ff6a38c0bbd8dae6ae0c69669e4f798a68 Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.550781 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" event={"ID":"093bd41a-5b89-48e8-a3a2-36835fe785c7","Type":"ContainerStarted","Data":"853fb6ab3c548a60599a979e9c58aaa205334d73773120ed6fd0878bc0d8716a"} Dec 
02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.550824 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" event={"ID":"093bd41a-5b89-48e8-a3a2-36835fe785c7","Type":"ContainerStarted","Data":"2d28102952556f4655b13d323513a6495ef3ec415a5eec4649278b3bb10b63b2"} Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.551816 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.552987 4878 patch_prober.go:28] interesting pod/controller-manager-66494666dc-k5nzt container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" start-of-body= Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.553040 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" podUID="093bd41a-5b89-48e8-a3a2-36835fe785c7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.555285 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj" event={"ID":"dc55246a-ef67-4c0c-a1bd-e3b753ca014b","Type":"ContainerStarted","Data":"ade71123538a41f4f9d517a65cdbabd217fcd27e9bb72034537b19817589eb3c"} Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.555315 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj" 
event={"ID":"dc55246a-ef67-4c0c-a1bd-e3b753ca014b","Type":"ContainerStarted","Data":"98c1e5e2efc32661978eab409407c5ff6a38c0bbd8dae6ae0c69669e4f798a68"} Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.555556 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.556722 4878 patch_prober.go:28] interesting pod/route-controller-manager-65ff6c5d6f-lrsfj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.556755 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj" podUID="dc55246a-ef67-4c0c-a1bd-e3b753ca014b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.570987 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" podStartSLOduration=2.570957076 podStartE2EDuration="2.570957076s" podCreationTimestamp="2025-12-02 18:19:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:19:59.567023375 +0000 UTC m=+309.256642256" watchObservedRunningTime="2025-12-02 18:19:59.570957076 +0000 UTC m=+309.260575957" Dec 02 18:19:59 crc kubenswrapper[4878]: I1202 18:19:59.587098 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj" 
podStartSLOduration=2.587072579 podStartE2EDuration="2.587072579s" podCreationTimestamp="2025-12-02 18:19:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:19:59.582397095 +0000 UTC m=+309.272015976" watchObservedRunningTime="2025-12-02 18:19:59.587072579 +0000 UTC m=+309.276691450" Dec 02 18:20:00 crc kubenswrapper[4878]: I1202 18:20:00.569174 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:20:00 crc kubenswrapper[4878]: I1202 18:20:00.570867 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-65ff6c5d6f-lrsfj" Dec 02 18:20:16 crc kubenswrapper[4878]: I1202 18:20:16.852198 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66494666dc-k5nzt"] Dec 02 18:20:16 crc kubenswrapper[4878]: I1202 18:20:16.852782 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" podUID="093bd41a-5b89-48e8-a3a2-36835fe785c7" containerName="controller-manager" containerID="cri-o://853fb6ab3c548a60599a979e9c58aaa205334d73773120ed6fd0878bc0d8716a" gracePeriod=30 Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.421048 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.614708 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/093bd41a-5b89-48e8-a3a2-36835fe785c7-config\") pod \"093bd41a-5b89-48e8-a3a2-36835fe785c7\" (UID: \"093bd41a-5b89-48e8-a3a2-36835fe785c7\") " Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.615441 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59896\" (UniqueName: \"kubernetes.io/projected/093bd41a-5b89-48e8-a3a2-36835fe785c7-kube-api-access-59896\") pod \"093bd41a-5b89-48e8-a3a2-36835fe785c7\" (UID: \"093bd41a-5b89-48e8-a3a2-36835fe785c7\") " Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.615553 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/093bd41a-5b89-48e8-a3a2-36835fe785c7-client-ca\") pod \"093bd41a-5b89-48e8-a3a2-36835fe785c7\" (UID: \"093bd41a-5b89-48e8-a3a2-36835fe785c7\") " Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.615588 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/093bd41a-5b89-48e8-a3a2-36835fe785c7-proxy-ca-bundles\") pod \"093bd41a-5b89-48e8-a3a2-36835fe785c7\" (UID: \"093bd41a-5b89-48e8-a3a2-36835fe785c7\") " Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.615626 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/093bd41a-5b89-48e8-a3a2-36835fe785c7-serving-cert\") pod \"093bd41a-5b89-48e8-a3a2-36835fe785c7\" (UID: \"093bd41a-5b89-48e8-a3a2-36835fe785c7\") " Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.616098 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/093bd41a-5b89-48e8-a3a2-36835fe785c7-client-ca" (OuterVolumeSpecName: "client-ca") pod "093bd41a-5b89-48e8-a3a2-36835fe785c7" (UID: "093bd41a-5b89-48e8-a3a2-36835fe785c7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.616505 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/093bd41a-5b89-48e8-a3a2-36835fe785c7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "093bd41a-5b89-48e8-a3a2-36835fe785c7" (UID: "093bd41a-5b89-48e8-a3a2-36835fe785c7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.616641 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/093bd41a-5b89-48e8-a3a2-36835fe785c7-config" (OuterVolumeSpecName: "config") pod "093bd41a-5b89-48e8-a3a2-36835fe785c7" (UID: "093bd41a-5b89-48e8-a3a2-36835fe785c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.621174 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/093bd41a-5b89-48e8-a3a2-36835fe785c7-kube-api-access-59896" (OuterVolumeSpecName: "kube-api-access-59896") pod "093bd41a-5b89-48e8-a3a2-36835fe785c7" (UID: "093bd41a-5b89-48e8-a3a2-36835fe785c7"). InnerVolumeSpecName "kube-api-access-59896". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.621299 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093bd41a-5b89-48e8-a3a2-36835fe785c7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "093bd41a-5b89-48e8-a3a2-36835fe785c7" (UID: "093bd41a-5b89-48e8-a3a2-36835fe785c7"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.664827 4878 generic.go:334] "Generic (PLEG): container finished" podID="093bd41a-5b89-48e8-a3a2-36835fe785c7" containerID="853fb6ab3c548a60599a979e9c58aaa205334d73773120ed6fd0878bc0d8716a" exitCode=0 Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.664892 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" event={"ID":"093bd41a-5b89-48e8-a3a2-36835fe785c7","Type":"ContainerDied","Data":"853fb6ab3c548a60599a979e9c58aaa205334d73773120ed6fd0878bc0d8716a"} Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.664932 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" event={"ID":"093bd41a-5b89-48e8-a3a2-36835fe785c7","Type":"ContainerDied","Data":"2d28102952556f4655b13d323513a6495ef3ec415a5eec4649278b3bb10b63b2"} Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.664961 4878 scope.go:117] "RemoveContainer" containerID="853fb6ab3c548a60599a979e9c58aaa205334d73773120ed6fd0878bc0d8716a" Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.665133 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66494666dc-k5nzt" Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.700771 4878 scope.go:117] "RemoveContainer" containerID="853fb6ab3c548a60599a979e9c58aaa205334d73773120ed6fd0878bc0d8716a" Dec 02 18:20:17 crc kubenswrapper[4878]: E1202 18:20:17.701886 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"853fb6ab3c548a60599a979e9c58aaa205334d73773120ed6fd0878bc0d8716a\": container with ID starting with 853fb6ab3c548a60599a979e9c58aaa205334d73773120ed6fd0878bc0d8716a not found: ID does not exist" containerID="853fb6ab3c548a60599a979e9c58aaa205334d73773120ed6fd0878bc0d8716a" Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.701951 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"853fb6ab3c548a60599a979e9c58aaa205334d73773120ed6fd0878bc0d8716a"} err="failed to get container status \"853fb6ab3c548a60599a979e9c58aaa205334d73773120ed6fd0878bc0d8716a\": rpc error: code = NotFound desc = could not find container \"853fb6ab3c548a60599a979e9c58aaa205334d73773120ed6fd0878bc0d8716a\": container with ID starting with 853fb6ab3c548a60599a979e9c58aaa205334d73773120ed6fd0878bc0d8716a not found: ID does not exist" Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.716726 4878 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/093bd41a-5b89-48e8-a3a2-36835fe785c7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.716764 4878 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/093bd41a-5b89-48e8-a3a2-36835fe785c7-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.716777 4878 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/093bd41a-5b89-48e8-a3a2-36835fe785c7-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.716791 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59896\" (UniqueName: \"kubernetes.io/projected/093bd41a-5b89-48e8-a3a2-36835fe785c7-kube-api-access-59896\") on node \"crc\" DevicePath \"\"" Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.716806 4878 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/093bd41a-5b89-48e8-a3a2-36835fe785c7-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.721322 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66494666dc-k5nzt"] Dec 02 18:20:17 crc kubenswrapper[4878]: I1202 18:20:17.730854 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-66494666dc-k5nzt"] Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.781770 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75f5955786-h95l5"] Dec 02 18:20:18 crc kubenswrapper[4878]: E1202 18:20:18.782196 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="093bd41a-5b89-48e8-a3a2-36835fe785c7" containerName="controller-manager" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.782226 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="093bd41a-5b89-48e8-a3a2-36835fe785c7" containerName="controller-manager" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.782528 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="093bd41a-5b89-48e8-a3a2-36835fe785c7" containerName="controller-manager" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.783385 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.786914 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.788855 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.789117 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.789467 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.789725 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.795041 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75f5955786-h95l5"] Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.797398 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.799850 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.843833 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fdb0ce33-6c16-4b78-8e98-014b9b9af34f-proxy-ca-bundles\") pod \"controller-manager-75f5955786-h95l5\" (UID: \"fdb0ce33-6c16-4b78-8e98-014b9b9af34f\") " 
pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.843895 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fdb0ce33-6c16-4b78-8e98-014b9b9af34f-client-ca\") pod \"controller-manager-75f5955786-h95l5\" (UID: \"fdb0ce33-6c16-4b78-8e98-014b9b9af34f\") " pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.843967 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb0ce33-6c16-4b78-8e98-014b9b9af34f-config\") pod \"controller-manager-75f5955786-h95l5\" (UID: \"fdb0ce33-6c16-4b78-8e98-014b9b9af34f\") " pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.843995 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hknt4\" (UniqueName: \"kubernetes.io/projected/fdb0ce33-6c16-4b78-8e98-014b9b9af34f-kube-api-access-hknt4\") pod \"controller-manager-75f5955786-h95l5\" (UID: \"fdb0ce33-6c16-4b78-8e98-014b9b9af34f\") " pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.844035 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdb0ce33-6c16-4b78-8e98-014b9b9af34f-serving-cert\") pod \"controller-manager-75f5955786-h95l5\" (UID: \"fdb0ce33-6c16-4b78-8e98-014b9b9af34f\") " pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.945684 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fdb0ce33-6c16-4b78-8e98-014b9b9af34f-client-ca\") pod \"controller-manager-75f5955786-h95l5\" (UID: \"fdb0ce33-6c16-4b78-8e98-014b9b9af34f\") " pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.945851 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb0ce33-6c16-4b78-8e98-014b9b9af34f-config\") pod \"controller-manager-75f5955786-h95l5\" (UID: \"fdb0ce33-6c16-4b78-8e98-014b9b9af34f\") " pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.945921 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hknt4\" (UniqueName: \"kubernetes.io/projected/fdb0ce33-6c16-4b78-8e98-014b9b9af34f-kube-api-access-hknt4\") pod \"controller-manager-75f5955786-h95l5\" (UID: \"fdb0ce33-6c16-4b78-8e98-014b9b9af34f\") " pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.945974 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdb0ce33-6c16-4b78-8e98-014b9b9af34f-serving-cert\") pod \"controller-manager-75f5955786-h95l5\" (UID: \"fdb0ce33-6c16-4b78-8e98-014b9b9af34f\") " pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.946030 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fdb0ce33-6c16-4b78-8e98-014b9b9af34f-proxy-ca-bundles\") pod \"controller-manager-75f5955786-h95l5\" (UID: \"fdb0ce33-6c16-4b78-8e98-014b9b9af34f\") " pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.946646 4878 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="093bd41a-5b89-48e8-a3a2-36835fe785c7" path="/var/lib/kubelet/pods/093bd41a-5b89-48e8-a3a2-36835fe785c7/volumes" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.948226 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fdb0ce33-6c16-4b78-8e98-014b9b9af34f-proxy-ca-bundles\") pod \"controller-manager-75f5955786-h95l5\" (UID: \"fdb0ce33-6c16-4b78-8e98-014b9b9af34f\") " pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.948534 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb0ce33-6c16-4b78-8e98-014b9b9af34f-config\") pod \"controller-manager-75f5955786-h95l5\" (UID: \"fdb0ce33-6c16-4b78-8e98-014b9b9af34f\") " pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.948601 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fdb0ce33-6c16-4b78-8e98-014b9b9af34f-client-ca\") pod \"controller-manager-75f5955786-h95l5\" (UID: \"fdb0ce33-6c16-4b78-8e98-014b9b9af34f\") " pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.951803 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdb0ce33-6c16-4b78-8e98-014b9b9af34f-serving-cert\") pod \"controller-manager-75f5955786-h95l5\" (UID: \"fdb0ce33-6c16-4b78-8e98-014b9b9af34f\") " pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" Dec 02 18:20:18 crc kubenswrapper[4878]: I1202 18:20:18.975609 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hknt4\" (UniqueName: 
\"kubernetes.io/projected/fdb0ce33-6c16-4b78-8e98-014b9b9af34f-kube-api-access-hknt4\") pod \"controller-manager-75f5955786-h95l5\" (UID: \"fdb0ce33-6c16-4b78-8e98-014b9b9af34f\") " pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" Dec 02 18:20:19 crc kubenswrapper[4878]: I1202 18:20:19.162072 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" Dec 02 18:20:19 crc kubenswrapper[4878]: I1202 18:20:19.657729 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75f5955786-h95l5"] Dec 02 18:20:19 crc kubenswrapper[4878]: I1202 18:20:19.681482 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" event={"ID":"fdb0ce33-6c16-4b78-8e98-014b9b9af34f","Type":"ContainerStarted","Data":"f5e47ec93c4b24fe989ecaab3908f562b126234f9ebdbf42282cbfdbadeda6b9"} Dec 02 18:20:20 crc kubenswrapper[4878]: I1202 18:20:20.690734 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" event={"ID":"fdb0ce33-6c16-4b78-8e98-014b9b9af34f","Type":"ContainerStarted","Data":"f5d1354e736167588ad1ccd471f249889bcaf4863da1c664fb70dd3a0197da60"} Dec 02 18:20:20 crc kubenswrapper[4878]: I1202 18:20:20.691201 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" Dec 02 18:20:20 crc kubenswrapper[4878]: I1202 18:20:20.700196 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" Dec 02 18:20:20 crc kubenswrapper[4878]: I1202 18:20:20.711648 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75f5955786-h95l5" podStartSLOduration=4.711620907 
podStartE2EDuration="4.711620907s" podCreationTimestamp="2025-12-02 18:20:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:20:20.709766097 +0000 UTC m=+330.399385068" watchObservedRunningTime="2025-12-02 18:20:20.711620907 +0000 UTC m=+330.401239828" Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.306488 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kwf9n"] Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.307548 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kwf9n" podUID="e80c5f56-57a3-4778-8382-473cd7678252" containerName="registry-server" containerID="cri-o://ab9ef5d41dec1b3f0042d9140829b692e172af30bebcfc7dda5c9c8612d7cadf" gracePeriod=30 Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.320003 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sb7zn"] Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.320538 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sb7zn" podUID="75d2e014-a578-4394-969c-109c2a260296" containerName="registry-server" containerID="cri-o://2570250897e42d2c32efa1125e626614c36dc21001f7c2f484724d8b36f5ea8a" gracePeriod=30 Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.333626 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5p678"] Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.333872 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-5p678" podUID="38d2716d-f5de-4242-a170-624490092b98" containerName="marketplace-operator" 
containerID="cri-o://65416adb2286d96d2e3dde6449c968fd6c8744a0af2ff339a7a5f9ac81c70a61" gracePeriod=30 Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.350451 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7m7gk"] Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.350795 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7m7gk" podUID="c3d67205-fea8-475d-b3da-bd4fc55a58c4" containerName="registry-server" containerID="cri-o://6951d51d80c8fdd4f2edb0740e074a13d1358c666025922626cff77a47fe77d2" gracePeriod=30 Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.365122 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v2c8h"] Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.365485 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v2c8h" podUID="e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8" containerName="registry-server" containerID="cri-o://19899a9328cfe34054cb3ed9596b9fad20af11c32473723375c1644424d79ebd" gracePeriod=30 Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.371601 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8xkxg"] Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.372565 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8xkxg" Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.377589 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8xkxg"] Dec 02 18:20:32 crc kubenswrapper[4878]: E1202 18:20:32.389683 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2570250897e42d2c32efa1125e626614c36dc21001f7c2f484724d8b36f5ea8a is running failed: container process not found" containerID="2570250897e42d2c32efa1125e626614c36dc21001f7c2f484724d8b36f5ea8a" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 18:20:32 crc kubenswrapper[4878]: E1202 18:20:32.390330 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2570250897e42d2c32efa1125e626614c36dc21001f7c2f484724d8b36f5ea8a is running failed: container process not found" containerID="2570250897e42d2c32efa1125e626614c36dc21001f7c2f484724d8b36f5ea8a" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 18:20:32 crc kubenswrapper[4878]: E1202 18:20:32.390693 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2570250897e42d2c32efa1125e626614c36dc21001f7c2f484724d8b36f5ea8a is running failed: container process not found" containerID="2570250897e42d2c32efa1125e626614c36dc21001f7c2f484724d8b36f5ea8a" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 18:20:32 crc kubenswrapper[4878]: E1202 18:20:32.390734 4878 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2570250897e42d2c32efa1125e626614c36dc21001f7c2f484724d8b36f5ea8a is running failed: container process not found" probeType="Readiness" 
pod="openshift-marketplace/community-operators-sb7zn" podUID="75d2e014-a578-4394-969c-109c2a260296" containerName="registry-server" Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.547812 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6g55\" (UniqueName: \"kubernetes.io/projected/11c10f5a-0137-467d-a749-1bce1c6210ed-kube-api-access-s6g55\") pod \"marketplace-operator-79b997595-8xkxg\" (UID: \"11c10f5a-0137-467d-a749-1bce1c6210ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-8xkxg" Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.547897 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/11c10f5a-0137-467d-a749-1bce1c6210ed-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8xkxg\" (UID: \"11c10f5a-0137-467d-a749-1bce1c6210ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-8xkxg" Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.547957 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11c10f5a-0137-467d-a749-1bce1c6210ed-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8xkxg\" (UID: \"11c10f5a-0137-467d-a749-1bce1c6210ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-8xkxg" Dec 02 18:20:32 crc kubenswrapper[4878]: E1202 18:20:32.576572 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ab9ef5d41dec1b3f0042d9140829b692e172af30bebcfc7dda5c9c8612d7cadf is running failed: container process not found" containerID="ab9ef5d41dec1b3f0042d9140829b692e172af30bebcfc7dda5c9c8612d7cadf" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 18:20:32 crc kubenswrapper[4878]: E1202 
18:20:32.577104 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ab9ef5d41dec1b3f0042d9140829b692e172af30bebcfc7dda5c9c8612d7cadf is running failed: container process not found" containerID="ab9ef5d41dec1b3f0042d9140829b692e172af30bebcfc7dda5c9c8612d7cadf" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 18:20:32 crc kubenswrapper[4878]: E1202 18:20:32.578074 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ab9ef5d41dec1b3f0042d9140829b692e172af30bebcfc7dda5c9c8612d7cadf is running failed: container process not found" containerID="ab9ef5d41dec1b3f0042d9140829b692e172af30bebcfc7dda5c9c8612d7cadf" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 18:20:32 crc kubenswrapper[4878]: E1202 18:20:32.578186 4878 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ab9ef5d41dec1b3f0042d9140829b692e172af30bebcfc7dda5c9c8612d7cadf is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-kwf9n" podUID="e80c5f56-57a3-4778-8382-473cd7678252" containerName="registry-server" Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.649085 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6g55\" (UniqueName: \"kubernetes.io/projected/11c10f5a-0137-467d-a749-1bce1c6210ed-kube-api-access-s6g55\") pod \"marketplace-operator-79b997595-8xkxg\" (UID: \"11c10f5a-0137-467d-a749-1bce1c6210ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-8xkxg" Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.649130 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/11c10f5a-0137-467d-a749-1bce1c6210ed-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8xkxg\" (UID: \"11c10f5a-0137-467d-a749-1bce1c6210ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-8xkxg" Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.649165 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11c10f5a-0137-467d-a749-1bce1c6210ed-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8xkxg\" (UID: \"11c10f5a-0137-467d-a749-1bce1c6210ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-8xkxg" Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.650217 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11c10f5a-0137-467d-a749-1bce1c6210ed-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8xkxg\" (UID: \"11c10f5a-0137-467d-a749-1bce1c6210ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-8xkxg" Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.657300 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/11c10f5a-0137-467d-a749-1bce1c6210ed-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8xkxg\" (UID: \"11c10f5a-0137-467d-a749-1bce1c6210ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-8xkxg" Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.669456 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6g55\" (UniqueName: \"kubernetes.io/projected/11c10f5a-0137-467d-a749-1bce1c6210ed-kube-api-access-s6g55\") pod \"marketplace-operator-79b997595-8xkxg\" (UID: \"11c10f5a-0137-467d-a749-1bce1c6210ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-8xkxg" Dec 02 18:20:32 crc 
kubenswrapper[4878]: I1202 18:20:32.777879 4878 generic.go:334] "Generic (PLEG): container finished" podID="e80c5f56-57a3-4778-8382-473cd7678252" containerID="ab9ef5d41dec1b3f0042d9140829b692e172af30bebcfc7dda5c9c8612d7cadf" exitCode=0 Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.777961 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwf9n" event={"ID":"e80c5f56-57a3-4778-8382-473cd7678252","Type":"ContainerDied","Data":"ab9ef5d41dec1b3f0042d9140829b692e172af30bebcfc7dda5c9c8612d7cadf"} Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.779585 4878 generic.go:334] "Generic (PLEG): container finished" podID="38d2716d-f5de-4242-a170-624490092b98" containerID="65416adb2286d96d2e3dde6449c968fd6c8744a0af2ff339a7a5f9ac81c70a61" exitCode=0 Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.779639 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5p678" event={"ID":"38d2716d-f5de-4242-a170-624490092b98","Type":"ContainerDied","Data":"65416adb2286d96d2e3dde6449c968fd6c8744a0af2ff339a7a5f9ac81c70a61"} Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.781886 4878 generic.go:334] "Generic (PLEG): container finished" podID="e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8" containerID="19899a9328cfe34054cb3ed9596b9fad20af11c32473723375c1644424d79ebd" exitCode=0 Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.781947 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2c8h" event={"ID":"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8","Type":"ContainerDied","Data":"19899a9328cfe34054cb3ed9596b9fad20af11c32473723375c1644424d79ebd"} Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.785370 4878 generic.go:334] "Generic (PLEG): container finished" podID="75d2e014-a578-4394-969c-109c2a260296" containerID="2570250897e42d2c32efa1125e626614c36dc21001f7c2f484724d8b36f5ea8a" exitCode=0 Dec 02 18:20:32 crc 
kubenswrapper[4878]: I1202 18:20:32.785439 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb7zn" event={"ID":"75d2e014-a578-4394-969c-109c2a260296","Type":"ContainerDied","Data":"2570250897e42d2c32efa1125e626614c36dc21001f7c2f484724d8b36f5ea8a"} Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.787228 4878 generic.go:334] "Generic (PLEG): container finished" podID="c3d67205-fea8-475d-b3da-bd4fc55a58c4" containerID="6951d51d80c8fdd4f2edb0740e074a13d1358c666025922626cff77a47fe77d2" exitCode=0 Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.787273 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7m7gk" event={"ID":"c3d67205-fea8-475d-b3da-bd4fc55a58c4","Type":"ContainerDied","Data":"6951d51d80c8fdd4f2edb0740e074a13d1358c666025922626cff77a47fe77d2"} Dec 02 18:20:32 crc kubenswrapper[4878]: I1202 18:20:32.826399 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8xkxg" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.003543 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kwf9n" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.117976 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5p678" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.124075 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sb7zn" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.128030 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v2c8h" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.164047 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsxxr\" (UniqueName: \"kubernetes.io/projected/e80c5f56-57a3-4778-8382-473cd7678252-kube-api-access-bsxxr\") pod \"e80c5f56-57a3-4778-8382-473cd7678252\" (UID: \"e80c5f56-57a3-4778-8382-473cd7678252\") " Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.164108 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/38d2716d-f5de-4242-a170-624490092b98-marketplace-operator-metrics\") pod \"38d2716d-f5de-4242-a170-624490092b98\" (UID: \"38d2716d-f5de-4242-a170-624490092b98\") " Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.164171 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e80c5f56-57a3-4778-8382-473cd7678252-utilities\") pod \"e80c5f56-57a3-4778-8382-473cd7678252\" (UID: \"e80c5f56-57a3-4778-8382-473cd7678252\") " Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.164201 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q44qr\" (UniqueName: \"kubernetes.io/projected/75d2e014-a578-4394-969c-109c2a260296-kube-api-access-q44qr\") pod \"75d2e014-a578-4394-969c-109c2a260296\" (UID: \"75d2e014-a578-4394-969c-109c2a260296\") " Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.164267 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkmws\" (UniqueName: \"kubernetes.io/projected/38d2716d-f5de-4242-a170-624490092b98-kube-api-access-fkmws\") pod \"38d2716d-f5de-4242-a170-624490092b98\" (UID: \"38d2716d-f5de-4242-a170-624490092b98\") " Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.164318 4878 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8-catalog-content\") pod \"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8\" (UID: \"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8\") " Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.164487 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38d2716d-f5de-4242-a170-624490092b98-marketplace-trusted-ca\") pod \"38d2716d-f5de-4242-a170-624490092b98\" (UID: \"38d2716d-f5de-4242-a170-624490092b98\") " Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.164580 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e80c5f56-57a3-4778-8382-473cd7678252-catalog-content\") pod \"e80c5f56-57a3-4778-8382-473cd7678252\" (UID: \"e80c5f56-57a3-4778-8382-473cd7678252\") " Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.164657 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8-utilities\") pod \"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8\" (UID: \"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8\") " Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.164714 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d2e014-a578-4394-969c-109c2a260296-utilities\") pod \"75d2e014-a578-4394-969c-109c2a260296\" (UID: \"75d2e014-a578-4394-969c-109c2a260296\") " Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.165809 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e80c5f56-57a3-4778-8382-473cd7678252-utilities" (OuterVolumeSpecName: "utilities") pod "e80c5f56-57a3-4778-8382-473cd7678252" (UID: 
"e80c5f56-57a3-4778-8382-473cd7678252"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.167162 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8-utilities" (OuterVolumeSpecName: "utilities") pod "e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8" (UID: "e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.167639 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75d2e014-a578-4394-969c-109c2a260296-utilities" (OuterVolumeSpecName: "utilities") pod "75d2e014-a578-4394-969c-109c2a260296" (UID: "75d2e014-a578-4394-969c-109c2a260296"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.172584 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75d2e014-a578-4394-969c-109c2a260296-kube-api-access-q44qr" (OuterVolumeSpecName: "kube-api-access-q44qr") pod "75d2e014-a578-4394-969c-109c2a260296" (UID: "75d2e014-a578-4394-969c-109c2a260296"). InnerVolumeSpecName "kube-api-access-q44qr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.174672 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38d2716d-f5de-4242-a170-624490092b98-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "38d2716d-f5de-4242-a170-624490092b98" (UID: "38d2716d-f5de-4242-a170-624490092b98"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.175937 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38d2716d-f5de-4242-a170-624490092b98-kube-api-access-fkmws" (OuterVolumeSpecName: "kube-api-access-fkmws") pod "38d2716d-f5de-4242-a170-624490092b98" (UID: "38d2716d-f5de-4242-a170-624490092b98"). InnerVolumeSpecName "kube-api-access-fkmws". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.176826 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e80c5f56-57a3-4778-8382-473cd7678252-kube-api-access-bsxxr" (OuterVolumeSpecName: "kube-api-access-bsxxr") pod "e80c5f56-57a3-4778-8382-473cd7678252" (UID: "e80c5f56-57a3-4778-8382-473cd7678252"). InnerVolumeSpecName "kube-api-access-bsxxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.188831 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38d2716d-f5de-4242-a170-624490092b98-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "38d2716d-f5de-4242-a170-624490092b98" (UID: "38d2716d-f5de-4242-a170-624490092b98"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.198862 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7m7gk" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.247109 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e80c5f56-57a3-4778-8382-473cd7678252-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e80c5f56-57a3-4778-8382-473cd7678252" (UID: "e80c5f56-57a3-4778-8382-473cd7678252"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.265370 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw6dd\" (UniqueName: \"kubernetes.io/projected/c3d67205-fea8-475d-b3da-bd4fc55a58c4-kube-api-access-rw6dd\") pod \"c3d67205-fea8-475d-b3da-bd4fc55a58c4\" (UID: \"c3d67205-fea8-475d-b3da-bd4fc55a58c4\") " Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.265438 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3d67205-fea8-475d-b3da-bd4fc55a58c4-catalog-content\") pod \"c3d67205-fea8-475d-b3da-bd4fc55a58c4\" (UID: \"c3d67205-fea8-475d-b3da-bd4fc55a58c4\") " Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.265470 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqbgq\" (UniqueName: \"kubernetes.io/projected/e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8-kube-api-access-sqbgq\") pod \"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8\" (UID: \"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8\") " Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.265489 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3d67205-fea8-475d-b3da-bd4fc55a58c4-utilities\") pod \"c3d67205-fea8-475d-b3da-bd4fc55a58c4\" (UID: \"c3d67205-fea8-475d-b3da-bd4fc55a58c4\") " Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.265509 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d2e014-a578-4394-969c-109c2a260296-catalog-content\") pod \"75d2e014-a578-4394-969c-109c2a260296\" (UID: \"75d2e014-a578-4394-969c-109c2a260296\") " Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.265643 4878 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-fkmws\" (UniqueName: \"kubernetes.io/projected/38d2716d-f5de-4242-a170-624490092b98-kube-api-access-fkmws\") on node \"crc\" DevicePath \"\"" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.265655 4878 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38d2716d-f5de-4242-a170-624490092b98-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.265665 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e80c5f56-57a3-4778-8382-473cd7678252-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.265674 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.265683 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d2e014-a578-4394-969c-109c2a260296-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.265691 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsxxr\" (UniqueName: \"kubernetes.io/projected/e80c5f56-57a3-4778-8382-473cd7678252-kube-api-access-bsxxr\") on node \"crc\" DevicePath \"\"" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.265700 4878 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/38d2716d-f5de-4242-a170-624490092b98-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.265709 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e80c5f56-57a3-4778-8382-473cd7678252-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.265717 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q44qr\" (UniqueName: \"kubernetes.io/projected/75d2e014-a578-4394-969c-109c2a260296-kube-api-access-q44qr\") on node \"crc\" DevicePath \"\"" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.266176 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3d67205-fea8-475d-b3da-bd4fc55a58c4-utilities" (OuterVolumeSpecName: "utilities") pod "c3d67205-fea8-475d-b3da-bd4fc55a58c4" (UID: "c3d67205-fea8-475d-b3da-bd4fc55a58c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.268780 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3d67205-fea8-475d-b3da-bd4fc55a58c4-kube-api-access-rw6dd" (OuterVolumeSpecName: "kube-api-access-rw6dd") pod "c3d67205-fea8-475d-b3da-bd4fc55a58c4" (UID: "c3d67205-fea8-475d-b3da-bd4fc55a58c4"). InnerVolumeSpecName "kube-api-access-rw6dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.269400 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8-kube-api-access-sqbgq" (OuterVolumeSpecName: "kube-api-access-sqbgq") pod "e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8" (UID: "e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8"). InnerVolumeSpecName "kube-api-access-sqbgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.287450 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3d67205-fea8-475d-b3da-bd4fc55a58c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3d67205-fea8-475d-b3da-bd4fc55a58c4" (UID: "c3d67205-fea8-475d-b3da-bd4fc55a58c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.332350 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75d2e014-a578-4394-969c-109c2a260296-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75d2e014-a578-4394-969c-109c2a260296" (UID: "75d2e014-a578-4394-969c-109c2a260296"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.344657 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8" (UID: "e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.367592 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3d67205-fea8-475d-b3da-bd4fc55a58c4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.367667 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqbgq\" (UniqueName: \"kubernetes.io/projected/e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8-kube-api-access-sqbgq\") on node \"crc\" DevicePath \"\"" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.367710 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3d67205-fea8-475d-b3da-bd4fc55a58c4-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.367725 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d2e014-a578-4394-969c-109c2a260296-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.367739 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.367752 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw6dd\" (UniqueName: \"kubernetes.io/projected/c3d67205-fea8-475d-b3da-bd4fc55a58c4-kube-api-access-rw6dd\") on node \"crc\" DevicePath \"\"" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.472980 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8xkxg"] Dec 02 18:20:33 crc kubenswrapper[4878]: W1202 18:20:33.490528 4878 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11c10f5a_0137_467d_a749_1bce1c6210ed.slice/crio-33f1d649e857394c65988cf9e8854d74e09ef8f6386815220a3c10882bd8f7f3 WatchSource:0}: Error finding container 33f1d649e857394c65988cf9e8854d74e09ef8f6386815220a3c10882bd8f7f3: Status 404 returned error can't find the container with id 33f1d649e857394c65988cf9e8854d74e09ef8f6386815220a3c10882bd8f7f3 Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.796865 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8xkxg" event={"ID":"11c10f5a-0137-467d-a749-1bce1c6210ed","Type":"ContainerStarted","Data":"0dc66a777c9131c4c7025dca18e2efbd009fa87a7568710d2de7c931d73e09ae"} Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.797491 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8xkxg" event={"ID":"11c10f5a-0137-467d-a749-1bce1c6210ed","Type":"ContainerStarted","Data":"33f1d649e857394c65988cf9e8854d74e09ef8f6386815220a3c10882bd8f7f3"} Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.797636 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8xkxg" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.799297 4878 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8xkxg container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.61:8080/healthz\": dial tcp 10.217.0.61:8080: connect: connection refused" start-of-body= Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.799424 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8xkxg" podUID="11c10f5a-0137-467d-a749-1bce1c6210ed" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.61:8080/healthz\": dial tcp 10.217.0.61:8080: 
connect: connection refused" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.800508 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwf9n" event={"ID":"e80c5f56-57a3-4778-8382-473cd7678252","Type":"ContainerDied","Data":"1640ce8adf5c9b2566eb4114ad2f33a2e9a9acdeb1e91a22ecbf83b5215b7952"} Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.800536 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kwf9n" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.800619 4878 scope.go:117] "RemoveContainer" containerID="ab9ef5d41dec1b3f0042d9140829b692e172af30bebcfc7dda5c9c8612d7cadf" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.802443 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5p678" event={"ID":"38d2716d-f5de-4242-a170-624490092b98","Type":"ContainerDied","Data":"7a41ee2a2168b5f3f054c8cefb238051568904083b38278b2ee90dae542cf3c8"} Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.802457 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5p678" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.805553 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2c8h" event={"ID":"e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8","Type":"ContainerDied","Data":"fcf0059d7f7087a35f5e3588f1d976ff4863cf53bb43045ad4d24ccb32a2b3e7"} Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.805631 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v2c8h" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.822498 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb7zn" event={"ID":"75d2e014-a578-4394-969c-109c2a260296","Type":"ContainerDied","Data":"eeb30a1d45928d1454cf44ddbe0e664f63a38a8827e69fc5f2d37e1e94277cd3"} Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.822966 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sb7zn" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.830999 4878 scope.go:117] "RemoveContainer" containerID="4ae69f9012dbb1ccdfb85cca33a1eeae8f73a5bdfd500d72b35c33099cc420ef" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.838579 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7m7gk" event={"ID":"c3d67205-fea8-475d-b3da-bd4fc55a58c4","Type":"ContainerDied","Data":"0c6147f198d32f8567d7e4397b90137ee1c7d4ebc6a7ef03c22b1b76f33659b2"} Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.839428 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7m7gk" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.844866 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8xkxg" podStartSLOduration=1.844848531 podStartE2EDuration="1.844848531s" podCreationTimestamp="2025-12-02 18:20:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:20:33.818871161 +0000 UTC m=+343.508490042" watchObservedRunningTime="2025-12-02 18:20:33.844848531 +0000 UTC m=+343.534467412" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.870839 4878 scope.go:117] "RemoveContainer" containerID="e0b035add7241ae6198680289ec4704519f33d9125d60f3c131feb0dab6f0127" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.915800 4878 scope.go:117] "RemoveContainer" containerID="65416adb2286d96d2e3dde6449c968fd6c8744a0af2ff339a7a5f9ac81c70a61" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.916925 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kwf9n"] Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.923965 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kwf9n"] Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.934339 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5p678"] Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.938961 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5p678"] Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.946164 4878 scope.go:117] "RemoveContainer" containerID="19899a9328cfe34054cb3ed9596b9fad20af11c32473723375c1644424d79ebd" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.950187 4878 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v2c8h"] Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.967345 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v2c8h"] Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.969847 4878 scope.go:117] "RemoveContainer" containerID="6656d76310ea3083e0912302a18a81128e753210f7305fc7e5f115c07d32185a" Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.972768 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7m7gk"] Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.977567 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7m7gk"] Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.981549 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sb7zn"] Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.987600 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sb7zn"] Dec 02 18:20:33 crc kubenswrapper[4878]: I1202 18:20:33.990747 4878 scope.go:117] "RemoveContainer" containerID="27e28c5e8a69fe690b429046479f38c7c2cc65a05c91689380e4961c4582c061" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.009326 4878 scope.go:117] "RemoveContainer" containerID="2570250897e42d2c32efa1125e626614c36dc21001f7c2f484724d8b36f5ea8a" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.025110 4878 scope.go:117] "RemoveContainer" containerID="d9730355b9e8f808f0b72c022c8230b7d633a7b2036bc72e1f83efcaab62f932" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.047422 4878 scope.go:117] "RemoveContainer" containerID="ea5183a479858660c55e92b6da435fab977b3b0d13347558f4f87daa740bbeb2" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.068175 4878 scope.go:117] "RemoveContainer" 
containerID="6951d51d80c8fdd4f2edb0740e074a13d1358c666025922626cff77a47fe77d2" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.083988 4878 scope.go:117] "RemoveContainer" containerID="0629ebb673a14bf094bd667066d2bd3788746440a908e4104f16e1332bcf7ad0" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.103501 4878 scope.go:117] "RemoveContainer" containerID="11ad334364d0957b94c3922ca1eb918aabbc6c1c2a869bfa05d4a7516b1ce4c3" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.675296 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vmxwf"] Dec 02 18:20:34 crc kubenswrapper[4878]: E1202 18:20:34.675618 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d2e014-a578-4394-969c-109c2a260296" containerName="extract-content" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.675639 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d2e014-a578-4394-969c-109c2a260296" containerName="extract-content" Dec 02 18:20:34 crc kubenswrapper[4878]: E1202 18:20:34.675664 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d2e014-a578-4394-969c-109c2a260296" containerName="extract-utilities" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.675681 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d2e014-a578-4394-969c-109c2a260296" containerName="extract-utilities" Dec 02 18:20:34 crc kubenswrapper[4878]: E1202 18:20:34.675702 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3d67205-fea8-475d-b3da-bd4fc55a58c4" containerName="extract-utilities" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.675717 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d67205-fea8-475d-b3da-bd4fc55a58c4" containerName="extract-utilities" Dec 02 18:20:34 crc kubenswrapper[4878]: E1202 18:20:34.675732 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8" 
containerName="extract-content" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.675745 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8" containerName="extract-content" Dec 02 18:20:34 crc kubenswrapper[4878]: E1202 18:20:34.675763 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e80c5f56-57a3-4778-8382-473cd7678252" containerName="extract-utilities" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.675776 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="e80c5f56-57a3-4778-8382-473cd7678252" containerName="extract-utilities" Dec 02 18:20:34 crc kubenswrapper[4878]: E1202 18:20:34.675795 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3d67205-fea8-475d-b3da-bd4fc55a58c4" containerName="registry-server" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.675808 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d67205-fea8-475d-b3da-bd4fc55a58c4" containerName="registry-server" Dec 02 18:20:34 crc kubenswrapper[4878]: E1202 18:20:34.675828 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8" containerName="registry-server" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.675842 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8" containerName="registry-server" Dec 02 18:20:34 crc kubenswrapper[4878]: E1202 18:20:34.675869 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e80c5f56-57a3-4778-8382-473cd7678252" containerName="registry-server" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.675884 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="e80c5f56-57a3-4778-8382-473cd7678252" containerName="registry-server" Dec 02 18:20:34 crc kubenswrapper[4878]: E1202 18:20:34.675902 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3d67205-fea8-475d-b3da-bd4fc55a58c4" 
containerName="extract-content" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.675916 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d67205-fea8-475d-b3da-bd4fc55a58c4" containerName="extract-content" Dec 02 18:20:34 crc kubenswrapper[4878]: E1202 18:20:34.675934 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d2e014-a578-4394-969c-109c2a260296" containerName="registry-server" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.675946 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d2e014-a578-4394-969c-109c2a260296" containerName="registry-server" Dec 02 18:20:34 crc kubenswrapper[4878]: E1202 18:20:34.675963 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8" containerName="extract-utilities" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.675978 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8" containerName="extract-utilities" Dec 02 18:20:34 crc kubenswrapper[4878]: E1202 18:20:34.675995 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d2716d-f5de-4242-a170-624490092b98" containerName="marketplace-operator" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.676008 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d2716d-f5de-4242-a170-624490092b98" containerName="marketplace-operator" Dec 02 18:20:34 crc kubenswrapper[4878]: E1202 18:20:34.676028 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e80c5f56-57a3-4778-8382-473cd7678252" containerName="extract-content" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.676042 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="e80c5f56-57a3-4778-8382-473cd7678252" containerName="extract-content" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.676226 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="38d2716d-f5de-4242-a170-624490092b98" 
containerName="marketplace-operator" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.676281 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8" containerName="registry-server" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.676750 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3d67205-fea8-475d-b3da-bd4fc55a58c4" containerName="registry-server" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.676771 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="75d2e014-a578-4394-969c-109c2a260296" containerName="registry-server" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.676789 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="e80c5f56-57a3-4778-8382-473cd7678252" containerName="registry-server" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.678212 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.695183 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vmxwf"] Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.785108 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/60d04a89-a7ce-4865-9b1f-d2779905717c-registry-certificates\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.785204 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/60d04a89-a7ce-4865-9b1f-d2779905717c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vmxwf\" 
(UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.785276 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60d04a89-a7ce-4865-9b1f-d2779905717c-registry-tls\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.785302 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60d04a89-a7ce-4865-9b1f-d2779905717c-bound-sa-token\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.785338 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/60d04a89-a7ce-4865-9b1f-d2779905717c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.785399 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.785447 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60d04a89-a7ce-4865-9b1f-d2779905717c-trusted-ca\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.785488 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tphwc\" (UniqueName: \"kubernetes.io/projected/60d04a89-a7ce-4865-9b1f-d2779905717c-kube-api-access-tphwc\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.827569 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.854598 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8xkxg" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.887257 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/60d04a89-a7ce-4865-9b1f-d2779905717c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.887350 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/60d04a89-a7ce-4865-9b1f-d2779905717c-registry-tls\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.887380 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60d04a89-a7ce-4865-9b1f-d2779905717c-bound-sa-token\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.887411 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/60d04a89-a7ce-4865-9b1f-d2779905717c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.887438 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60d04a89-a7ce-4865-9b1f-d2779905717c-trusted-ca\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.887462 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tphwc\" (UniqueName: \"kubernetes.io/projected/60d04a89-a7ce-4865-9b1f-d2779905717c-kube-api-access-tphwc\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.887482 4878 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/60d04a89-a7ce-4865-9b1f-d2779905717c-registry-certificates\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.887759 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/60d04a89-a7ce-4865-9b1f-d2779905717c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.889018 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/60d04a89-a7ce-4865-9b1f-d2779905717c-registry-certificates\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.891343 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60d04a89-a7ce-4865-9b1f-d2779905717c-trusted-ca\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.924035 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/60d04a89-a7ce-4865-9b1f-d2779905717c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" 
Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.928856 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60d04a89-a7ce-4865-9b1f-d2779905717c-registry-tls\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.944780 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tphwc\" (UniqueName: \"kubernetes.io/projected/60d04a89-a7ce-4865-9b1f-d2779905717c-kube-api-access-tphwc\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.949094 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60d04a89-a7ce-4865-9b1f-d2779905717c-bound-sa-token\") pod \"image-registry-66df7c8f76-vmxwf\" (UID: \"60d04a89-a7ce-4865-9b1f-d2779905717c\") " pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.950119 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38d2716d-f5de-4242-a170-624490092b98" path="/var/lib/kubelet/pods/38d2716d-f5de-4242-a170-624490092b98/volumes" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.950694 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75d2e014-a578-4394-969c-109c2a260296" path="/var/lib/kubelet/pods/75d2e014-a578-4394-969c-109c2a260296/volumes" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.951422 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3d67205-fea8-475d-b3da-bd4fc55a58c4" path="/var/lib/kubelet/pods/c3d67205-fea8-475d-b3da-bd4fc55a58c4/volumes" Dec 02 18:20:34 crc 
kubenswrapper[4878]: I1202 18:20:34.952554 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8" path="/var/lib/kubelet/pods/e2f6bebd-ceea-4c7a-8b33-4b8c7aa6b1a8/volumes" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.953113 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e80c5f56-57a3-4778-8382-473cd7678252" path="/var/lib/kubelet/pods/e80c5f56-57a3-4778-8382-473cd7678252/volumes" Dec 02 18:20:34 crc kubenswrapper[4878]: I1202 18:20:34.996994 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:35 crc kubenswrapper[4878]: I1202 18:20:35.118450 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qxh8k"] Dec 02 18:20:35 crc kubenswrapper[4878]: I1202 18:20:35.119791 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qxh8k" Dec 02 18:20:35 crc kubenswrapper[4878]: I1202 18:20:35.121889 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 02 18:20:35 crc kubenswrapper[4878]: I1202 18:20:35.130987 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qxh8k"] Dec 02 18:20:35 crc kubenswrapper[4878]: I1202 18:20:35.191620 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vstdc\" (UniqueName: \"kubernetes.io/projected/169b29b3-0fc5-499b-b05c-83469da6c269-kube-api-access-vstdc\") pod \"certified-operators-qxh8k\" (UID: \"169b29b3-0fc5-499b-b05c-83469da6c269\") " pod="openshift-marketplace/certified-operators-qxh8k" Dec 02 18:20:35 crc kubenswrapper[4878]: I1202 18:20:35.191676 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/169b29b3-0fc5-499b-b05c-83469da6c269-catalog-content\") pod \"certified-operators-qxh8k\" (UID: \"169b29b3-0fc5-499b-b05c-83469da6c269\") " pod="openshift-marketplace/certified-operators-qxh8k" Dec 02 18:20:35 crc kubenswrapper[4878]: I1202 18:20:35.191699 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/169b29b3-0fc5-499b-b05c-83469da6c269-utilities\") pod \"certified-operators-qxh8k\" (UID: \"169b29b3-0fc5-499b-b05c-83469da6c269\") " pod="openshift-marketplace/certified-operators-qxh8k" Dec 02 18:20:35 crc kubenswrapper[4878]: I1202 18:20:35.293082 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/169b29b3-0fc5-499b-b05c-83469da6c269-catalog-content\") pod \"certified-operators-qxh8k\" (UID: \"169b29b3-0fc5-499b-b05c-83469da6c269\") " pod="openshift-marketplace/certified-operators-qxh8k" Dec 02 18:20:35 crc kubenswrapper[4878]: I1202 18:20:35.293141 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/169b29b3-0fc5-499b-b05c-83469da6c269-utilities\") pod \"certified-operators-qxh8k\" (UID: \"169b29b3-0fc5-499b-b05c-83469da6c269\") " pod="openshift-marketplace/certified-operators-qxh8k" Dec 02 18:20:35 crc kubenswrapper[4878]: I1202 18:20:35.293295 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vstdc\" (UniqueName: \"kubernetes.io/projected/169b29b3-0fc5-499b-b05c-83469da6c269-kube-api-access-vstdc\") pod \"certified-operators-qxh8k\" (UID: \"169b29b3-0fc5-499b-b05c-83469da6c269\") " pod="openshift-marketplace/certified-operators-qxh8k" Dec 02 18:20:35 crc kubenswrapper[4878]: I1202 18:20:35.294055 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/169b29b3-0fc5-499b-b05c-83469da6c269-catalog-content\") pod \"certified-operators-qxh8k\" (UID: \"169b29b3-0fc5-499b-b05c-83469da6c269\") " pod="openshift-marketplace/certified-operators-qxh8k" Dec 02 18:20:35 crc kubenswrapper[4878]: I1202 18:20:35.295919 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/169b29b3-0fc5-499b-b05c-83469da6c269-utilities\") pod \"certified-operators-qxh8k\" (UID: \"169b29b3-0fc5-499b-b05c-83469da6c269\") " pod="openshift-marketplace/certified-operators-qxh8k" Dec 02 18:20:35 crc kubenswrapper[4878]: I1202 18:20:35.318672 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vstdc\" (UniqueName: \"kubernetes.io/projected/169b29b3-0fc5-499b-b05c-83469da6c269-kube-api-access-vstdc\") pod \"certified-operators-qxh8k\" (UID: \"169b29b3-0fc5-499b-b05c-83469da6c269\") " pod="openshift-marketplace/certified-operators-qxh8k" Dec 02 18:20:35 crc kubenswrapper[4878]: I1202 18:20:35.428069 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vmxwf"] Dec 02 18:20:35 crc kubenswrapper[4878]: I1202 18:20:35.450868 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qxh8k" Dec 02 18:20:35 crc kubenswrapper[4878]: I1202 18:20:35.864799 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" event={"ID":"60d04a89-a7ce-4865-9b1f-d2779905717c","Type":"ContainerStarted","Data":"53c27604115517f04d4816aa924e50824910a7f901599fdb81f242602a77ea2c"} Dec 02 18:20:35 crc kubenswrapper[4878]: I1202 18:20:35.865300 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:35 crc kubenswrapper[4878]: I1202 18:20:35.865324 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" event={"ID":"60d04a89-a7ce-4865-9b1f-d2779905717c","Type":"ContainerStarted","Data":"9580a8791bc86a2f1046c63a7806590b30115a220fd28e0d43f06f510d7235dd"} Dec 02 18:20:35 crc kubenswrapper[4878]: I1202 18:20:35.868056 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qxh8k"] Dec 02 18:20:36 crc kubenswrapper[4878]: I1202 18:20:36.112387 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" podStartSLOduration=2.112363521 podStartE2EDuration="2.112363521s" podCreationTimestamp="2025-12-02 18:20:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:20:35.898843525 +0000 UTC m=+345.588462436" watchObservedRunningTime="2025-12-02 18:20:36.112363521 +0000 UTC m=+345.801982412" Dec 02 18:20:36 crc kubenswrapper[4878]: I1202 18:20:36.114980 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j4f2d"] Dec 02 18:20:36 crc kubenswrapper[4878]: I1202 18:20:36.116358 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j4f2d" Dec 02 18:20:36 crc kubenswrapper[4878]: I1202 18:20:36.119664 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 02 18:20:36 crc kubenswrapper[4878]: I1202 18:20:36.138033 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j4f2d"] Dec 02 18:20:36 crc kubenswrapper[4878]: I1202 18:20:36.208645 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/651270f6-1566-4359-b11c-561ee744e88f-utilities\") pod \"redhat-operators-j4f2d\" (UID: \"651270f6-1566-4359-b11c-561ee744e88f\") " pod="openshift-marketplace/redhat-operators-j4f2d" Dec 02 18:20:36 crc kubenswrapper[4878]: I1202 18:20:36.208720 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz2qw\" (UniqueName: \"kubernetes.io/projected/651270f6-1566-4359-b11c-561ee744e88f-kube-api-access-rz2qw\") pod \"redhat-operators-j4f2d\" (UID: \"651270f6-1566-4359-b11c-561ee744e88f\") " pod="openshift-marketplace/redhat-operators-j4f2d" Dec 02 18:20:36 crc kubenswrapper[4878]: I1202 18:20:36.208752 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/651270f6-1566-4359-b11c-561ee744e88f-catalog-content\") pod \"redhat-operators-j4f2d\" (UID: \"651270f6-1566-4359-b11c-561ee744e88f\") " pod="openshift-marketplace/redhat-operators-j4f2d" Dec 02 18:20:36 crc kubenswrapper[4878]: I1202 18:20:36.309661 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/651270f6-1566-4359-b11c-561ee744e88f-utilities\") pod \"redhat-operators-j4f2d\" (UID: \"651270f6-1566-4359-b11c-561ee744e88f\") " 
pod="openshift-marketplace/redhat-operators-j4f2d" Dec 02 18:20:36 crc kubenswrapper[4878]: I1202 18:20:36.309776 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz2qw\" (UniqueName: \"kubernetes.io/projected/651270f6-1566-4359-b11c-561ee744e88f-kube-api-access-rz2qw\") pod \"redhat-operators-j4f2d\" (UID: \"651270f6-1566-4359-b11c-561ee744e88f\") " pod="openshift-marketplace/redhat-operators-j4f2d" Dec 02 18:20:36 crc kubenswrapper[4878]: I1202 18:20:36.309814 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/651270f6-1566-4359-b11c-561ee744e88f-catalog-content\") pod \"redhat-operators-j4f2d\" (UID: \"651270f6-1566-4359-b11c-561ee744e88f\") " pod="openshift-marketplace/redhat-operators-j4f2d" Dec 02 18:20:36 crc kubenswrapper[4878]: I1202 18:20:36.310384 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/651270f6-1566-4359-b11c-561ee744e88f-utilities\") pod \"redhat-operators-j4f2d\" (UID: \"651270f6-1566-4359-b11c-561ee744e88f\") " pod="openshift-marketplace/redhat-operators-j4f2d" Dec 02 18:20:36 crc kubenswrapper[4878]: I1202 18:20:36.310458 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/651270f6-1566-4359-b11c-561ee744e88f-catalog-content\") pod \"redhat-operators-j4f2d\" (UID: \"651270f6-1566-4359-b11c-561ee744e88f\") " pod="openshift-marketplace/redhat-operators-j4f2d" Dec 02 18:20:36 crc kubenswrapper[4878]: I1202 18:20:36.330137 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz2qw\" (UniqueName: \"kubernetes.io/projected/651270f6-1566-4359-b11c-561ee744e88f-kube-api-access-rz2qw\") pod \"redhat-operators-j4f2d\" (UID: \"651270f6-1566-4359-b11c-561ee744e88f\") " pod="openshift-marketplace/redhat-operators-j4f2d" Dec 
02 18:20:36 crc kubenswrapper[4878]: I1202 18:20:36.457609 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j4f2d" Dec 02 18:20:36 crc kubenswrapper[4878]: I1202 18:20:36.882096 4878 generic.go:334] "Generic (PLEG): container finished" podID="169b29b3-0fc5-499b-b05c-83469da6c269" containerID="a745b163e0975d1281679da37e1c812e6ca50732bbb34d26d131f56959f85647" exitCode=0 Dec 02 18:20:36 crc kubenswrapper[4878]: I1202 18:20:36.882189 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxh8k" event={"ID":"169b29b3-0fc5-499b-b05c-83469da6c269","Type":"ContainerDied","Data":"a745b163e0975d1281679da37e1c812e6ca50732bbb34d26d131f56959f85647"} Dec 02 18:20:36 crc kubenswrapper[4878]: I1202 18:20:36.882707 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxh8k" event={"ID":"169b29b3-0fc5-499b-b05c-83469da6c269","Type":"ContainerStarted","Data":"c2fbe4b2ecaf0e5be153529da17e46a74905a990c5c56e3323b74ca028a80a87"} Dec 02 18:20:36 crc kubenswrapper[4878]: I1202 18:20:36.905035 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j4f2d"] Dec 02 18:20:37 crc kubenswrapper[4878]: I1202 18:20:37.520887 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-46kr4"] Dec 02 18:20:37 crc kubenswrapper[4878]: I1202 18:20:37.525110 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-46kr4" Dec 02 18:20:37 crc kubenswrapper[4878]: I1202 18:20:37.530029 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 02 18:20:37 crc kubenswrapper[4878]: I1202 18:20:37.548209 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-46kr4"] Dec 02 18:20:37 crc kubenswrapper[4878]: I1202 18:20:37.627352 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfcdn\" (UniqueName: \"kubernetes.io/projected/2b1d265e-ec12-4052-83ec-5db3bd68b034-kube-api-access-jfcdn\") pod \"community-operators-46kr4\" (UID: \"2b1d265e-ec12-4052-83ec-5db3bd68b034\") " pod="openshift-marketplace/community-operators-46kr4" Dec 02 18:20:37 crc kubenswrapper[4878]: I1202 18:20:37.627400 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b1d265e-ec12-4052-83ec-5db3bd68b034-catalog-content\") pod \"community-operators-46kr4\" (UID: \"2b1d265e-ec12-4052-83ec-5db3bd68b034\") " pod="openshift-marketplace/community-operators-46kr4" Dec 02 18:20:37 crc kubenswrapper[4878]: I1202 18:20:37.627429 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b1d265e-ec12-4052-83ec-5db3bd68b034-utilities\") pod \"community-operators-46kr4\" (UID: \"2b1d265e-ec12-4052-83ec-5db3bd68b034\") " pod="openshift-marketplace/community-operators-46kr4" Dec 02 18:20:37 crc kubenswrapper[4878]: I1202 18:20:37.728294 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfcdn\" (UniqueName: \"kubernetes.io/projected/2b1d265e-ec12-4052-83ec-5db3bd68b034-kube-api-access-jfcdn\") pod \"community-operators-46kr4\" 
(UID: \"2b1d265e-ec12-4052-83ec-5db3bd68b034\") " pod="openshift-marketplace/community-operators-46kr4" Dec 02 18:20:37 crc kubenswrapper[4878]: I1202 18:20:37.728354 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b1d265e-ec12-4052-83ec-5db3bd68b034-catalog-content\") pod \"community-operators-46kr4\" (UID: \"2b1d265e-ec12-4052-83ec-5db3bd68b034\") " pod="openshift-marketplace/community-operators-46kr4" Dec 02 18:20:37 crc kubenswrapper[4878]: I1202 18:20:37.728384 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b1d265e-ec12-4052-83ec-5db3bd68b034-utilities\") pod \"community-operators-46kr4\" (UID: \"2b1d265e-ec12-4052-83ec-5db3bd68b034\") " pod="openshift-marketplace/community-operators-46kr4" Dec 02 18:20:37 crc kubenswrapper[4878]: I1202 18:20:37.728988 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b1d265e-ec12-4052-83ec-5db3bd68b034-utilities\") pod \"community-operators-46kr4\" (UID: \"2b1d265e-ec12-4052-83ec-5db3bd68b034\") " pod="openshift-marketplace/community-operators-46kr4" Dec 02 18:20:37 crc kubenswrapper[4878]: I1202 18:20:37.729390 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b1d265e-ec12-4052-83ec-5db3bd68b034-catalog-content\") pod \"community-operators-46kr4\" (UID: \"2b1d265e-ec12-4052-83ec-5db3bd68b034\") " pod="openshift-marketplace/community-operators-46kr4" Dec 02 18:20:37 crc kubenswrapper[4878]: I1202 18:20:37.755528 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfcdn\" (UniqueName: \"kubernetes.io/projected/2b1d265e-ec12-4052-83ec-5db3bd68b034-kube-api-access-jfcdn\") pod \"community-operators-46kr4\" (UID: \"2b1d265e-ec12-4052-83ec-5db3bd68b034\") " 
pod="openshift-marketplace/community-operators-46kr4" Dec 02 18:20:37 crc kubenswrapper[4878]: I1202 18:20:37.844324 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-46kr4" Dec 02 18:20:38 crc kubenswrapper[4878]: I1202 18:20:38.011140 4878 generic.go:334] "Generic (PLEG): container finished" podID="651270f6-1566-4359-b11c-561ee744e88f" containerID="44f7fa6a3fa30696547e0dedac64bd978bd808ac164985318a25e7136325e573" exitCode=0 Dec 02 18:20:38 crc kubenswrapper[4878]: I1202 18:20:38.011212 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4f2d" event={"ID":"651270f6-1566-4359-b11c-561ee744e88f","Type":"ContainerDied","Data":"44f7fa6a3fa30696547e0dedac64bd978bd808ac164985318a25e7136325e573"} Dec 02 18:20:38 crc kubenswrapper[4878]: I1202 18:20:38.011807 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4f2d" event={"ID":"651270f6-1566-4359-b11c-561ee744e88f","Type":"ContainerStarted","Data":"8184e7ca7ff0cd002ee51ad791ea4ea2db4e35489414c3a0d717c3a4a4a9fbdc"} Dec 02 18:20:38 crc kubenswrapper[4878]: I1202 18:20:38.511812 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-46kr4"] Dec 02 18:20:38 crc kubenswrapper[4878]: I1202 18:20:38.530583 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pq9p9"] Dec 02 18:20:38 crc kubenswrapper[4878]: I1202 18:20:38.532071 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pq9p9" Dec 02 18:20:38 crc kubenswrapper[4878]: W1202 18:20:38.535334 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b1d265e_ec12_4052_83ec_5db3bd68b034.slice/crio-dc88c7f96d6c9cd0edc6f707913e2ad1cd7546b40e429329f14e1a23bd89a75c WatchSource:0}: Error finding container dc88c7f96d6c9cd0edc6f707913e2ad1cd7546b40e429329f14e1a23bd89a75c: Status 404 returned error can't find the container with id dc88c7f96d6c9cd0edc6f707913e2ad1cd7546b40e429329f14e1a23bd89a75c Dec 02 18:20:38 crc kubenswrapper[4878]: I1202 18:20:38.541409 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 02 18:20:38 crc kubenswrapper[4878]: I1202 18:20:38.551116 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pq9p9"] Dec 02 18:20:38 crc kubenswrapper[4878]: I1202 18:20:38.644513 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c55a4df1-0e5a-43aa-9de4-e216aeb40407-utilities\") pod \"redhat-marketplace-pq9p9\" (UID: \"c55a4df1-0e5a-43aa-9de4-e216aeb40407\") " pod="openshift-marketplace/redhat-marketplace-pq9p9" Dec 02 18:20:38 crc kubenswrapper[4878]: I1202 18:20:38.644582 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c55a4df1-0e5a-43aa-9de4-e216aeb40407-catalog-content\") pod \"redhat-marketplace-pq9p9\" (UID: \"c55a4df1-0e5a-43aa-9de4-e216aeb40407\") " pod="openshift-marketplace/redhat-marketplace-pq9p9" Dec 02 18:20:38 crc kubenswrapper[4878]: I1202 18:20:38.644605 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trtvp\" (UniqueName: 
\"kubernetes.io/projected/c55a4df1-0e5a-43aa-9de4-e216aeb40407-kube-api-access-trtvp\") pod \"redhat-marketplace-pq9p9\" (UID: \"c55a4df1-0e5a-43aa-9de4-e216aeb40407\") " pod="openshift-marketplace/redhat-marketplace-pq9p9" Dec 02 18:20:38 crc kubenswrapper[4878]: I1202 18:20:38.745775 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c55a4df1-0e5a-43aa-9de4-e216aeb40407-utilities\") pod \"redhat-marketplace-pq9p9\" (UID: \"c55a4df1-0e5a-43aa-9de4-e216aeb40407\") " pod="openshift-marketplace/redhat-marketplace-pq9p9" Dec 02 18:20:38 crc kubenswrapper[4878]: I1202 18:20:38.745879 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c55a4df1-0e5a-43aa-9de4-e216aeb40407-catalog-content\") pod \"redhat-marketplace-pq9p9\" (UID: \"c55a4df1-0e5a-43aa-9de4-e216aeb40407\") " pod="openshift-marketplace/redhat-marketplace-pq9p9" Dec 02 18:20:38 crc kubenswrapper[4878]: I1202 18:20:38.745915 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trtvp\" (UniqueName: \"kubernetes.io/projected/c55a4df1-0e5a-43aa-9de4-e216aeb40407-kube-api-access-trtvp\") pod \"redhat-marketplace-pq9p9\" (UID: \"c55a4df1-0e5a-43aa-9de4-e216aeb40407\") " pod="openshift-marketplace/redhat-marketplace-pq9p9" Dec 02 18:20:38 crc kubenswrapper[4878]: I1202 18:20:38.746501 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c55a4df1-0e5a-43aa-9de4-e216aeb40407-utilities\") pod \"redhat-marketplace-pq9p9\" (UID: \"c55a4df1-0e5a-43aa-9de4-e216aeb40407\") " pod="openshift-marketplace/redhat-marketplace-pq9p9" Dec 02 18:20:38 crc kubenswrapper[4878]: I1202 18:20:38.746548 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c55a4df1-0e5a-43aa-9de4-e216aeb40407-catalog-content\") pod \"redhat-marketplace-pq9p9\" (UID: \"c55a4df1-0e5a-43aa-9de4-e216aeb40407\") " pod="openshift-marketplace/redhat-marketplace-pq9p9" Dec 02 18:20:38 crc kubenswrapper[4878]: I1202 18:20:38.771437 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trtvp\" (UniqueName: \"kubernetes.io/projected/c55a4df1-0e5a-43aa-9de4-e216aeb40407-kube-api-access-trtvp\") pod \"redhat-marketplace-pq9p9\" (UID: \"c55a4df1-0e5a-43aa-9de4-e216aeb40407\") " pod="openshift-marketplace/redhat-marketplace-pq9p9" Dec 02 18:20:38 crc kubenswrapper[4878]: I1202 18:20:38.877678 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pq9p9" Dec 02 18:20:39 crc kubenswrapper[4878]: I1202 18:20:39.039028 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4f2d" event={"ID":"651270f6-1566-4359-b11c-561ee744e88f","Type":"ContainerStarted","Data":"4b3248b683acd2bbd388fc477a1b53a274a0d48b558e80cf8f3876db30905aa3"} Dec 02 18:20:39 crc kubenswrapper[4878]: I1202 18:20:39.044123 4878 generic.go:334] "Generic (PLEG): container finished" podID="2b1d265e-ec12-4052-83ec-5db3bd68b034" containerID="d1cc2b94c48c8ef29919046dadb90514e1d5a8a4b212b4fc06c5db0f994058fa" exitCode=0 Dec 02 18:20:39 crc kubenswrapper[4878]: I1202 18:20:39.044194 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-46kr4" event={"ID":"2b1d265e-ec12-4052-83ec-5db3bd68b034","Type":"ContainerDied","Data":"d1cc2b94c48c8ef29919046dadb90514e1d5a8a4b212b4fc06c5db0f994058fa"} Dec 02 18:20:39 crc kubenswrapper[4878]: I1202 18:20:39.044221 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-46kr4" 
event={"ID":"2b1d265e-ec12-4052-83ec-5db3bd68b034","Type":"ContainerStarted","Data":"dc88c7f96d6c9cd0edc6f707913e2ad1cd7546b40e429329f14e1a23bd89a75c"} Dec 02 18:20:39 crc kubenswrapper[4878]: I1202 18:20:39.070118 4878 generic.go:334] "Generic (PLEG): container finished" podID="169b29b3-0fc5-499b-b05c-83469da6c269" containerID="71d11aaa10734391fb2edbb33a5d2e165f63ceb586131f78d05b01d142317295" exitCode=0 Dec 02 18:20:39 crc kubenswrapper[4878]: I1202 18:20:39.070184 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxh8k" event={"ID":"169b29b3-0fc5-499b-b05c-83469da6c269","Type":"ContainerDied","Data":"71d11aaa10734391fb2edbb33a5d2e165f63ceb586131f78d05b01d142317295"} Dec 02 18:20:39 crc kubenswrapper[4878]: I1202 18:20:39.300799 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pq9p9"] Dec 02 18:20:40 crc kubenswrapper[4878]: I1202 18:20:40.079785 4878 generic.go:334] "Generic (PLEG): container finished" podID="c55a4df1-0e5a-43aa-9de4-e216aeb40407" containerID="819080c3b1cf8d4809cf026d64c5f08921cb108ab5a8a6624d7e0a3bb78f3165" exitCode=0 Dec 02 18:20:40 crc kubenswrapper[4878]: I1202 18:20:40.079868 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pq9p9" event={"ID":"c55a4df1-0e5a-43aa-9de4-e216aeb40407","Type":"ContainerDied","Data":"819080c3b1cf8d4809cf026d64c5f08921cb108ab5a8a6624d7e0a3bb78f3165"} Dec 02 18:20:40 crc kubenswrapper[4878]: I1202 18:20:40.080535 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pq9p9" event={"ID":"c55a4df1-0e5a-43aa-9de4-e216aeb40407","Type":"ContainerStarted","Data":"992c9996a9b5a03f7a805af9d2f064cab35e93f45612ebff0a36c655c6574693"} Dec 02 18:20:40 crc kubenswrapper[4878]: I1202 18:20:40.083456 4878 generic.go:334] "Generic (PLEG): container finished" podID="651270f6-1566-4359-b11c-561ee744e88f" 
containerID="4b3248b683acd2bbd388fc477a1b53a274a0d48b558e80cf8f3876db30905aa3" exitCode=0 Dec 02 18:20:40 crc kubenswrapper[4878]: I1202 18:20:40.083519 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4f2d" event={"ID":"651270f6-1566-4359-b11c-561ee744e88f","Type":"ContainerDied","Data":"4b3248b683acd2bbd388fc477a1b53a274a0d48b558e80cf8f3876db30905aa3"} Dec 02 18:20:40 crc kubenswrapper[4878]: I1202 18:20:40.090219 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxh8k" event={"ID":"169b29b3-0fc5-499b-b05c-83469da6c269","Type":"ContainerStarted","Data":"9472490ea61a64b68a68fe5cf86531435d87a967edf53440f52cd93374a9310f"} Dec 02 18:20:40 crc kubenswrapper[4878]: I1202 18:20:40.093874 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-46kr4" event={"ID":"2b1d265e-ec12-4052-83ec-5db3bd68b034","Type":"ContainerStarted","Data":"fff5e745fbf2580013f18394bf64b1301dabbe70be800bd9403a8ccf07cb8051"} Dec 02 18:20:40 crc kubenswrapper[4878]: I1202 18:20:40.125342 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qxh8k" podStartSLOduration=2.557461047 podStartE2EDuration="5.125323723s" podCreationTimestamp="2025-12-02 18:20:35 +0000 UTC" firstStartedPulling="2025-12-02 18:20:36.88488818 +0000 UTC m=+346.574507061" lastFinishedPulling="2025-12-02 18:20:39.452750836 +0000 UTC m=+349.142369737" observedRunningTime="2025-12-02 18:20:40.122844402 +0000 UTC m=+349.812463303" watchObservedRunningTime="2025-12-02 18:20:40.125323723 +0000 UTC m=+349.814942604" Dec 02 18:20:41 crc kubenswrapper[4878]: I1202 18:20:41.101754 4878 generic.go:334] "Generic (PLEG): container finished" podID="c55a4df1-0e5a-43aa-9de4-e216aeb40407" containerID="0a1d433c32427b9dafc91fdafbe2dfc3456890676d0331bb2f096513b0ce2bb8" exitCode=0 Dec 02 18:20:41 crc kubenswrapper[4878]: I1202 
18:20:41.101868 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pq9p9" event={"ID":"c55a4df1-0e5a-43aa-9de4-e216aeb40407","Type":"ContainerDied","Data":"0a1d433c32427b9dafc91fdafbe2dfc3456890676d0331bb2f096513b0ce2bb8"} Dec 02 18:20:41 crc kubenswrapper[4878]: I1202 18:20:41.106370 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4f2d" event={"ID":"651270f6-1566-4359-b11c-561ee744e88f","Type":"ContainerStarted","Data":"cb9e55d7e79bd7eb57074e2f7d2c4708ccbfc21eb89f589be543325d23157b0e"} Dec 02 18:20:41 crc kubenswrapper[4878]: I1202 18:20:41.110476 4878 generic.go:334] "Generic (PLEG): container finished" podID="2b1d265e-ec12-4052-83ec-5db3bd68b034" containerID="fff5e745fbf2580013f18394bf64b1301dabbe70be800bd9403a8ccf07cb8051" exitCode=0 Dec 02 18:20:41 crc kubenswrapper[4878]: I1202 18:20:41.110596 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-46kr4" event={"ID":"2b1d265e-ec12-4052-83ec-5db3bd68b034","Type":"ContainerDied","Data":"fff5e745fbf2580013f18394bf64b1301dabbe70be800bd9403a8ccf07cb8051"} Dec 02 18:20:41 crc kubenswrapper[4878]: I1202 18:20:41.190633 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j4f2d" podStartSLOduration=2.768642796 podStartE2EDuration="5.190614102s" podCreationTimestamp="2025-12-02 18:20:36 +0000 UTC" firstStartedPulling="2025-12-02 18:20:38.020537497 +0000 UTC m=+347.710156378" lastFinishedPulling="2025-12-02 18:20:40.442508803 +0000 UTC m=+350.132127684" observedRunningTime="2025-12-02 18:20:41.187509633 +0000 UTC m=+350.877128514" watchObservedRunningTime="2025-12-02 18:20:41.190614102 +0000 UTC m=+350.880232993" Dec 02 18:20:42 crc kubenswrapper[4878]: I1202 18:20:42.121065 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-46kr4" 
event={"ID":"2b1d265e-ec12-4052-83ec-5db3bd68b034","Type":"ContainerStarted","Data":"7c2f00a26b293c215eea285396109bbcf31ef5c41cde71c270909b52cfdf61fa"} Dec 02 18:20:42 crc kubenswrapper[4878]: I1202 18:20:42.129400 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pq9p9" event={"ID":"c55a4df1-0e5a-43aa-9de4-e216aeb40407","Type":"ContainerStarted","Data":"a3e0e0886f48a3ccc53ad14842891d6aa8eb51c94fa7a941115e7b4823c800ec"} Dec 02 18:20:42 crc kubenswrapper[4878]: I1202 18:20:42.147576 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-46kr4" podStartSLOduration=2.396036042 podStartE2EDuration="5.147560278s" podCreationTimestamp="2025-12-02 18:20:37 +0000 UTC" firstStartedPulling="2025-12-02 18:20:39.053866163 +0000 UTC m=+348.743485044" lastFinishedPulling="2025-12-02 18:20:41.805390399 +0000 UTC m=+351.495009280" observedRunningTime="2025-12-02 18:20:42.145755009 +0000 UTC m=+351.835373890" watchObservedRunningTime="2025-12-02 18:20:42.147560278 +0000 UTC m=+351.837179159" Dec 02 18:20:42 crc kubenswrapper[4878]: I1202 18:20:42.169814 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pq9p9" podStartSLOduration=2.760869891 podStartE2EDuration="4.169790577s" podCreationTimestamp="2025-12-02 18:20:38 +0000 UTC" firstStartedPulling="2025-12-02 18:20:40.082508127 +0000 UTC m=+349.772127008" lastFinishedPulling="2025-12-02 18:20:41.491428813 +0000 UTC m=+351.181047694" observedRunningTime="2025-12-02 18:20:42.167360139 +0000 UTC m=+351.856979020" watchObservedRunningTime="2025-12-02 18:20:42.169790577 +0000 UTC m=+351.859409458" Dec 02 18:20:45 crc kubenswrapper[4878]: I1202 18:20:45.452111 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qxh8k" Dec 02 18:20:45 crc kubenswrapper[4878]: I1202 18:20:45.452526 4878 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qxh8k" Dec 02 18:20:45 crc kubenswrapper[4878]: I1202 18:20:45.510057 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qxh8k" Dec 02 18:20:46 crc kubenswrapper[4878]: I1202 18:20:46.200653 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qxh8k" Dec 02 18:20:46 crc kubenswrapper[4878]: I1202 18:20:46.458717 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j4f2d" Dec 02 18:20:46 crc kubenswrapper[4878]: I1202 18:20:46.459162 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j4f2d" Dec 02 18:20:46 crc kubenswrapper[4878]: I1202 18:20:46.505655 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j4f2d" Dec 02 18:20:47 crc kubenswrapper[4878]: I1202 18:20:47.203100 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j4f2d" Dec 02 18:20:47 crc kubenswrapper[4878]: I1202 18:20:47.845142 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-46kr4" Dec 02 18:20:47 crc kubenswrapper[4878]: I1202 18:20:47.845834 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-46kr4" Dec 02 18:20:47 crc kubenswrapper[4878]: I1202 18:20:47.910371 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-46kr4" Dec 02 18:20:48 crc kubenswrapper[4878]: I1202 18:20:48.220231 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-46kr4" Dec 02 18:20:48 crc kubenswrapper[4878]: I1202 18:20:48.878674 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pq9p9" Dec 02 18:20:48 crc kubenswrapper[4878]: I1202 18:20:48.878734 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pq9p9" Dec 02 18:20:48 crc kubenswrapper[4878]: I1202 18:20:48.932801 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pq9p9" Dec 02 18:20:49 crc kubenswrapper[4878]: I1202 18:20:49.218207 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pq9p9" Dec 02 18:20:53 crc kubenswrapper[4878]: I1202 18:20:53.742424 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:20:53 crc kubenswrapper[4878]: I1202 18:20:53.743272 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:20:55 crc kubenswrapper[4878]: I1202 18:20:55.003081 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-vmxwf" Dec 02 18:20:55 crc kubenswrapper[4878]: I1202 18:20:55.083736 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6hxvm"] Dec 02 18:21:04 crc kubenswrapper[4878]: I1202 
18:21:04.228766 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-wddqf"] Dec 02 18:21:04 crc kubenswrapper[4878]: I1202 18:21:04.232143 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wddqf" Dec 02 18:21:04 crc kubenswrapper[4878]: I1202 18:21:04.234654 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Dec 02 18:21:04 crc kubenswrapper[4878]: I1202 18:21:04.236018 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 02 18:21:04 crc kubenswrapper[4878]: I1202 18:21:04.236583 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 02 18:21:04 crc kubenswrapper[4878]: I1202 18:21:04.236800 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 02 18:21:04 crc kubenswrapper[4878]: I1202 18:21:04.236717 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 02 18:21:04 crc kubenswrapper[4878]: I1202 18:21:04.238946 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-wddqf"] Dec 02 18:21:04 crc kubenswrapper[4878]: I1202 18:21:04.351965 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4de395b-45e0-4f6f-bbce-8f14bbe50a09-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-wddqf\" (UID: \"d4de395b-45e0-4f6f-bbce-8f14bbe50a09\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wddqf" Dec 02 18:21:04 crc kubenswrapper[4878]: I1202 
18:21:04.352067 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z7zx\" (UniqueName: \"kubernetes.io/projected/d4de395b-45e0-4f6f-bbce-8f14bbe50a09-kube-api-access-5z7zx\") pod \"cluster-monitoring-operator-6d5b84845-wddqf\" (UID: \"d4de395b-45e0-4f6f-bbce-8f14bbe50a09\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wddqf" Dec 02 18:21:04 crc kubenswrapper[4878]: I1202 18:21:04.352224 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d4de395b-45e0-4f6f-bbce-8f14bbe50a09-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-wddqf\" (UID: \"d4de395b-45e0-4f6f-bbce-8f14bbe50a09\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wddqf" Dec 02 18:21:04 crc kubenswrapper[4878]: I1202 18:21:04.453647 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4de395b-45e0-4f6f-bbce-8f14bbe50a09-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-wddqf\" (UID: \"d4de395b-45e0-4f6f-bbce-8f14bbe50a09\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wddqf" Dec 02 18:21:04 crc kubenswrapper[4878]: I1202 18:21:04.453793 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z7zx\" (UniqueName: \"kubernetes.io/projected/d4de395b-45e0-4f6f-bbce-8f14bbe50a09-kube-api-access-5z7zx\") pod \"cluster-monitoring-operator-6d5b84845-wddqf\" (UID: \"d4de395b-45e0-4f6f-bbce-8f14bbe50a09\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wddqf" Dec 02 18:21:04 crc kubenswrapper[4878]: I1202 18:21:04.453863 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: 
\"kubernetes.io/configmap/d4de395b-45e0-4f6f-bbce-8f14bbe50a09-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-wddqf\" (UID: \"d4de395b-45e0-4f6f-bbce-8f14bbe50a09\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wddqf" Dec 02 18:21:04 crc kubenswrapper[4878]: I1202 18:21:04.456065 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d4de395b-45e0-4f6f-bbce-8f14bbe50a09-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-wddqf\" (UID: \"d4de395b-45e0-4f6f-bbce-8f14bbe50a09\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wddqf" Dec 02 18:21:04 crc kubenswrapper[4878]: I1202 18:21:04.470962 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4de395b-45e0-4f6f-bbce-8f14bbe50a09-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-wddqf\" (UID: \"d4de395b-45e0-4f6f-bbce-8f14bbe50a09\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wddqf" Dec 02 18:21:04 crc kubenswrapper[4878]: I1202 18:21:04.476753 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z7zx\" (UniqueName: \"kubernetes.io/projected/d4de395b-45e0-4f6f-bbce-8f14bbe50a09-kube-api-access-5z7zx\") pod \"cluster-monitoring-operator-6d5b84845-wddqf\" (UID: \"d4de395b-45e0-4f6f-bbce-8f14bbe50a09\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wddqf" Dec 02 18:21:04 crc kubenswrapper[4878]: I1202 18:21:04.550834 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wddqf" Dec 02 18:21:05 crc kubenswrapper[4878]: I1202 18:21:05.028061 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-wddqf"] Dec 02 18:21:05 crc kubenswrapper[4878]: W1202 18:21:05.029021 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4de395b_45e0_4f6f_bbce_8f14bbe50a09.slice/crio-5417e162d0220b663ee589035549b60a7039094c38623257ea72d4ace7dca76b WatchSource:0}: Error finding container 5417e162d0220b663ee589035549b60a7039094c38623257ea72d4ace7dca76b: Status 404 returned error can't find the container with id 5417e162d0220b663ee589035549b60a7039094c38623257ea72d4ace7dca76b Dec 02 18:21:05 crc kubenswrapper[4878]: I1202 18:21:05.265805 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wddqf" event={"ID":"d4de395b-45e0-4f6f-bbce-8f14bbe50a09","Type":"ContainerStarted","Data":"5417e162d0220b663ee589035549b60a7039094c38623257ea72d4ace7dca76b"} Dec 02 18:21:07 crc kubenswrapper[4878]: I1202 18:21:07.282104 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wddqf" event={"ID":"d4de395b-45e0-4f6f-bbce-8f14bbe50a09","Type":"ContainerStarted","Data":"49cdac673e75712fe80c4b812c98eeb4b4157086044245fd5aa2e9848ade7d73"} Dec 02 18:21:07 crc kubenswrapper[4878]: I1202 18:21:07.305506 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wddqf" podStartSLOduration=1.3493044699999999 podStartE2EDuration="3.305482418s" podCreationTimestamp="2025-12-02 18:21:04 +0000 UTC" firstStartedPulling="2025-12-02 18:21:05.033006708 +0000 UTC m=+374.722625629" lastFinishedPulling="2025-12-02 18:21:06.989184696 +0000 UTC m=+376.678803577" 
observedRunningTime="2025-12-02 18:21:07.303547616 +0000 UTC m=+376.993166517" watchObservedRunningTime="2025-12-02 18:21:07.305482418 +0000 UTC m=+376.995101309" Dec 02 18:21:07 crc kubenswrapper[4878]: I1202 18:21:07.661247 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tvm5g"] Dec 02 18:21:07 crc kubenswrapper[4878]: I1202 18:21:07.661937 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tvm5g" Dec 02 18:21:07 crc kubenswrapper[4878]: I1202 18:21:07.664424 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Dec 02 18:21:07 crc kubenswrapper[4878]: I1202 18:21:07.664961 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-bnmhk" Dec 02 18:21:07 crc kubenswrapper[4878]: I1202 18:21:07.672306 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tvm5g"] Dec 02 18:21:07 crc kubenswrapper[4878]: I1202 18:21:07.711792 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1ba69f55-9e22-4c32-bbec-a1caa78f5189-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-tvm5g\" (UID: \"1ba69f55-9e22-4c32-bbec-a1caa78f5189\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tvm5g" Dec 02 18:21:07 crc kubenswrapper[4878]: I1202 18:21:07.813221 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1ba69f55-9e22-4c32-bbec-a1caa78f5189-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-tvm5g\" (UID: 
\"1ba69f55-9e22-4c32-bbec-a1caa78f5189\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tvm5g" Dec 02 18:21:07 crc kubenswrapper[4878]: I1202 18:21:07.821160 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1ba69f55-9e22-4c32-bbec-a1caa78f5189-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-tvm5g\" (UID: \"1ba69f55-9e22-4c32-bbec-a1caa78f5189\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tvm5g" Dec 02 18:21:07 crc kubenswrapper[4878]: I1202 18:21:07.975730 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tvm5g" Dec 02 18:21:08 crc kubenswrapper[4878]: I1202 18:21:08.388886 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tvm5g"] Dec 02 18:21:09 crc kubenswrapper[4878]: I1202 18:21:09.295139 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tvm5g" event={"ID":"1ba69f55-9e22-4c32-bbec-a1caa78f5189","Type":"ContainerStarted","Data":"f10e31acbe47e88114a00081a4ea274acda7746cbe3d2235e65b853571488de1"} Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.313322 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tvm5g" event={"ID":"1ba69f55-9e22-4c32-bbec-a1caa78f5189","Type":"ContainerStarted","Data":"29f5aad6240d78cbc518373125f3806ac544bd2b777366b1f3246d883584066b"} Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.313953 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tvm5g" Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.320526 4878 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tvm5g"
Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.336168 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-tvm5g" podStartSLOduration=1.82522983 podStartE2EDuration="4.336132053s" podCreationTimestamp="2025-12-02 18:21:07 +0000 UTC" firstStartedPulling="2025-12-02 18:21:08.392914725 +0000 UTC m=+378.082533606" lastFinishedPulling="2025-12-02 18:21:10.903816938 +0000 UTC m=+380.593435829" observedRunningTime="2025-12-02 18:21:11.333636661 +0000 UTC m=+381.023255562" watchObservedRunningTime="2025-12-02 18:21:11.336132053 +0000 UTC m=+381.025750974"
Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.731412 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-gtzvb"]
Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.732365 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-gtzvb"
Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.736728 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-vpnkp"
Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.737097 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.737500 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.738965 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.748810 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-gtzvb"]
Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.807395 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4a08bb2f-5d7d-470f-b01a-3f6c521573d3-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-gtzvb\" (UID: \"4a08bb2f-5d7d-470f-b01a-3f6c521573d3\") " pod="openshift-monitoring/prometheus-operator-db54df47d-gtzvb"
Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.807448 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a08bb2f-5d7d-470f-b01a-3f6c521573d3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-gtzvb\" (UID: \"4a08bb2f-5d7d-470f-b01a-3f6c521573d3\") " pod="openshift-monitoring/prometheus-operator-db54df47d-gtzvb"
Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.807492 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a08bb2f-5d7d-470f-b01a-3f6c521573d3-metrics-client-ca\") pod \"prometheus-operator-db54df47d-gtzvb\" (UID: \"4a08bb2f-5d7d-470f-b01a-3f6c521573d3\") " pod="openshift-monitoring/prometheus-operator-db54df47d-gtzvb"
Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.807526 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2hsh\" (UniqueName: \"kubernetes.io/projected/4a08bb2f-5d7d-470f-b01a-3f6c521573d3-kube-api-access-s2hsh\") pod \"prometheus-operator-db54df47d-gtzvb\" (UID: \"4a08bb2f-5d7d-470f-b01a-3f6c521573d3\") " pod="openshift-monitoring/prometheus-operator-db54df47d-gtzvb"
Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.909672 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a08bb2f-5d7d-470f-b01a-3f6c521573d3-metrics-client-ca\") pod \"prometheus-operator-db54df47d-gtzvb\" (UID: \"4a08bb2f-5d7d-470f-b01a-3f6c521573d3\") " pod="openshift-monitoring/prometheus-operator-db54df47d-gtzvb"
Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.909758 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2hsh\" (UniqueName: \"kubernetes.io/projected/4a08bb2f-5d7d-470f-b01a-3f6c521573d3-kube-api-access-s2hsh\") pod \"prometheus-operator-db54df47d-gtzvb\" (UID: \"4a08bb2f-5d7d-470f-b01a-3f6c521573d3\") " pod="openshift-monitoring/prometheus-operator-db54df47d-gtzvb"
Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.909852 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4a08bb2f-5d7d-470f-b01a-3f6c521573d3-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-gtzvb\" (UID: \"4a08bb2f-5d7d-470f-b01a-3f6c521573d3\") " pod="openshift-monitoring/prometheus-operator-db54df47d-gtzvb"
Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.909890 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a08bb2f-5d7d-470f-b01a-3f6c521573d3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-gtzvb\" (UID: \"4a08bb2f-5d7d-470f-b01a-3f6c521573d3\") " pod="openshift-monitoring/prometheus-operator-db54df47d-gtzvb"
Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.911205 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a08bb2f-5d7d-470f-b01a-3f6c521573d3-metrics-client-ca\") pod \"prometheus-operator-db54df47d-gtzvb\" (UID: \"4a08bb2f-5d7d-470f-b01a-3f6c521573d3\") " pod="openshift-monitoring/prometheus-operator-db54df47d-gtzvb"
Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.918108 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4a08bb2f-5d7d-470f-b01a-3f6c521573d3-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-gtzvb\" (UID: \"4a08bb2f-5d7d-470f-b01a-3f6c521573d3\") " pod="openshift-monitoring/prometheus-operator-db54df47d-gtzvb"
Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.918225 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a08bb2f-5d7d-470f-b01a-3f6c521573d3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-gtzvb\" (UID: \"4a08bb2f-5d7d-470f-b01a-3f6c521573d3\") " pod="openshift-monitoring/prometheus-operator-db54df47d-gtzvb"
Dec 02 18:21:11 crc kubenswrapper[4878]: I1202 18:21:11.936064 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2hsh\" (UniqueName: \"kubernetes.io/projected/4a08bb2f-5d7d-470f-b01a-3f6c521573d3-kube-api-access-s2hsh\") pod \"prometheus-operator-db54df47d-gtzvb\" (UID: \"4a08bb2f-5d7d-470f-b01a-3f6c521573d3\") " pod="openshift-monitoring/prometheus-operator-db54df47d-gtzvb"
Dec 02 18:21:12 crc kubenswrapper[4878]: I1202 18:21:12.093692 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-gtzvb"
Dec 02 18:21:12 crc kubenswrapper[4878]: I1202 18:21:12.534444 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-gtzvb"]
Dec 02 18:21:12 crc kubenswrapper[4878]: W1202 18:21:12.543357 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a08bb2f_5d7d_470f_b01a_3f6c521573d3.slice/crio-e0eb632a79b8c2bb8cd71a190cb93ce88c95e3e33c3944fd82e543a51dc277bc WatchSource:0}: Error finding container e0eb632a79b8c2bb8cd71a190cb93ce88c95e3e33c3944fd82e543a51dc277bc: Status 404 returned error can't find the container with id e0eb632a79b8c2bb8cd71a190cb93ce88c95e3e33c3944fd82e543a51dc277bc
Dec 02 18:21:13 crc kubenswrapper[4878]: I1202 18:21:13.327987 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-gtzvb" event={"ID":"4a08bb2f-5d7d-470f-b01a-3f6c521573d3","Type":"ContainerStarted","Data":"e0eb632a79b8c2bb8cd71a190cb93ce88c95e3e33c3944fd82e543a51dc277bc"}
Dec 02 18:21:15 crc kubenswrapper[4878]: I1202 18:21:15.341975 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-gtzvb" event={"ID":"4a08bb2f-5d7d-470f-b01a-3f6c521573d3","Type":"ContainerStarted","Data":"f3bcc01c8a45f733e0571b53de54c977383abcde986025b9d7724d7d3cb67c64"}
Dec 02 18:21:17 crc kubenswrapper[4878]: I1202 18:21:17.359138 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-gtzvb" event={"ID":"4a08bb2f-5d7d-470f-b01a-3f6c521573d3","Type":"ContainerStarted","Data":"a6071b162785650561540e03969be0c99179ea18965ef130b3e7a80203dc49b2"}
Dec 02 18:21:17 crc kubenswrapper[4878]: I1202 18:21:17.393802 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-gtzvb" podStartSLOduration=4.091113479 podStartE2EDuration="6.393774015s" podCreationTimestamp="2025-12-02 18:21:11 +0000 UTC" firstStartedPulling="2025-12-02 18:21:12.54818878 +0000 UTC m=+382.237807661" lastFinishedPulling="2025-12-02 18:21:14.850849286 +0000 UTC m=+384.540468197" observedRunningTime="2025-12-02 18:21:17.387668148 +0000 UTC m=+387.077287079" watchObservedRunningTime="2025-12-02 18:21:17.393774015 +0000 UTC m=+387.083392906"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.112364 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-hqt75"]
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.114669 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-hqt75"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.117024 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.117218 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-dkvtw"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.118447 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.134010 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-hqt75"]
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.134789 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/568128a9-6c26-450b-acf5-260cf1f9ec57-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-hqt75\" (UID: \"568128a9-6c26-450b-acf5-260cf1f9ec57\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-hqt75"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.134969 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/568128a9-6c26-450b-acf5-260cf1f9ec57-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-hqt75\" (UID: \"568128a9-6c26-450b-acf5-260cf1f9ec57\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-hqt75"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.135099 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/568128a9-6c26-450b-acf5-260cf1f9ec57-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-hqt75\" (UID: \"568128a9-6c26-450b-acf5-260cf1f9ec57\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-hqt75"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.135262 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7hzb\" (UniqueName: \"kubernetes.io/projected/568128a9-6c26-450b-acf5-260cf1f9ec57-kube-api-access-t7hzb\") pod \"openshift-state-metrics-566fddb674-hqt75\" (UID: \"568128a9-6c26-450b-acf5-260cf1f9ec57\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-hqt75"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.137513 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"]
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.138696 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.142442 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-crh76"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.142955 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.143252 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.143274 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.151180 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-6zp7j"]
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.152544 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.156910 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-ndg2v"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.157081 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.157193 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.173946 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"]
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.236902 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/83601880-99db-414c-a8c9-08211fd9e54d-metrics-client-ca\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.236974 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/83601880-99db-414c-a8c9-08211fd9e54d-node-exporter-wtmp\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.237005 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1560ebec-22c0-4ea7-81f9-3fd7c9b9629d-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-xh2w4\" (UID: \"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.237036 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1560ebec-22c0-4ea7-81f9-3fd7c9b9629d-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-xh2w4\" (UID: \"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.237065 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83601880-99db-414c-a8c9-08211fd9e54d-sys\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.237082 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/83601880-99db-414c-a8c9-08211fd9e54d-node-exporter-tls\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.237164 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/568128a9-6c26-450b-acf5-260cf1f9ec57-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-hqt75\" (UID: \"568128a9-6c26-450b-acf5-260cf1f9ec57\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-hqt75"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.237185 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/568128a9-6c26-450b-acf5-260cf1f9ec57-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-hqt75\" (UID: \"568128a9-6c26-450b-acf5-260cf1f9ec57\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-hqt75"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.237209 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1560ebec-22c0-4ea7-81f9-3fd7c9b9629d-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-xh2w4\" (UID: \"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.237228 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6cw8\" (UniqueName: \"kubernetes.io/projected/1560ebec-22c0-4ea7-81f9-3fd7c9b9629d-kube-api-access-j6cw8\") pod \"kube-state-metrics-777cb5bd5d-xh2w4\" (UID: \"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.237266 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn9ct\" (UniqueName: \"kubernetes.io/projected/83601880-99db-414c-a8c9-08211fd9e54d-kube-api-access-sn9ct\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.237295 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/568128a9-6c26-450b-acf5-260cf1f9ec57-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-hqt75\" (UID: \"568128a9-6c26-450b-acf5-260cf1f9ec57\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-hqt75"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.237320 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/83601880-99db-414c-a8c9-08211fd9e54d-node-exporter-textfile\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.237344 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7hzb\" (UniqueName: \"kubernetes.io/projected/568128a9-6c26-450b-acf5-260cf1f9ec57-kube-api-access-t7hzb\") pod \"openshift-state-metrics-566fddb674-hqt75\" (UID: \"568128a9-6c26-450b-acf5-260cf1f9ec57\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-hqt75"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.237365 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/83601880-99db-414c-a8c9-08211fd9e54d-root\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.237386 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/83601880-99db-414c-a8c9-08211fd9e54d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.237410 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1560ebec-22c0-4ea7-81f9-3fd7c9b9629d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-xh2w4\" (UID: \"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.237435 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1560ebec-22c0-4ea7-81f9-3fd7c9b9629d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-xh2w4\" (UID: \"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.239218 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/568128a9-6c26-450b-acf5-260cf1f9ec57-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-hqt75\" (UID: \"568128a9-6c26-450b-acf5-260cf1f9ec57\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-hqt75"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.245833 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/568128a9-6c26-450b-acf5-260cf1f9ec57-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-hqt75\" (UID: \"568128a9-6c26-450b-acf5-260cf1f9ec57\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-hqt75"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.262363 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/568128a9-6c26-450b-acf5-260cf1f9ec57-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-hqt75\" (UID: \"568128a9-6c26-450b-acf5-260cf1f9ec57\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-hqt75"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.273967 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7hzb\" (UniqueName: \"kubernetes.io/projected/568128a9-6c26-450b-acf5-260cf1f9ec57-kube-api-access-t7hzb\") pod \"openshift-state-metrics-566fddb674-hqt75\" (UID: \"568128a9-6c26-450b-acf5-260cf1f9ec57\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-hqt75"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.338348 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/83601880-99db-414c-a8c9-08211fd9e54d-node-exporter-wtmp\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.338730 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1560ebec-22c0-4ea7-81f9-3fd7c9b9629d-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-xh2w4\" (UID: \"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.338853 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1560ebec-22c0-4ea7-81f9-3fd7c9b9629d-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-xh2w4\" (UID: \"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.339774 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83601880-99db-414c-a8c9-08211fd9e54d-sys\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.340033 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/83601880-99db-414c-a8c9-08211fd9e54d-node-exporter-tls\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.340939 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1560ebec-22c0-4ea7-81f9-3fd7c9b9629d-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-xh2w4\" (UID: \"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.341049 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6cw8\" (UniqueName: \"kubernetes.io/projected/1560ebec-22c0-4ea7-81f9-3fd7c9b9629d-kube-api-access-j6cw8\") pod \"kube-state-metrics-777cb5bd5d-xh2w4\" (UID: \"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.341134 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn9ct\" (UniqueName: \"kubernetes.io/projected/83601880-99db-414c-a8c9-08211fd9e54d-kube-api-access-sn9ct\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.341290 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/83601880-99db-414c-a8c9-08211fd9e54d-node-exporter-textfile\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.341429 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/83601880-99db-414c-a8c9-08211fd9e54d-root\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.341523 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/83601880-99db-414c-a8c9-08211fd9e54d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.341617 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1560ebec-22c0-4ea7-81f9-3fd7c9b9629d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-xh2w4\" (UID: \"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.341698 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1560ebec-22c0-4ea7-81f9-3fd7c9b9629d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-xh2w4\" (UID: \"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.341804 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/83601880-99db-414c-a8c9-08211fd9e54d-metrics-client-ca\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.342494 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/83601880-99db-414c-a8c9-08211fd9e54d-metrics-client-ca\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.339899 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83601880-99db-414c-a8c9-08211fd9e54d-sys\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.342936 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1560ebec-22c0-4ea7-81f9-3fd7c9b9629d-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-xh2w4\" (UID: \"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.343050 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/83601880-99db-414c-a8c9-08211fd9e54d-root\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.340153 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1560ebec-22c0-4ea7-81f9-3fd7c9b9629d-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-xh2w4\" (UID: \"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.338663 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/83601880-99db-414c-a8c9-08211fd9e54d-node-exporter-wtmp\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.343473 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/83601880-99db-414c-a8c9-08211fd9e54d-node-exporter-textfile\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.344270 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1560ebec-22c0-4ea7-81f9-3fd7c9b9629d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-xh2w4\" (UID: \"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.344709 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1560ebec-22c0-4ea7-81f9-3fd7c9b9629d-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-xh2w4\" (UID: \"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.354850 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/83601880-99db-414c-a8c9-08211fd9e54d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.354855 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1560ebec-22c0-4ea7-81f9-3fd7c9b9629d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-xh2w4\" (UID: \"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.354919 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/83601880-99db-414c-a8c9-08211fd9e54d-node-exporter-tls\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.378959 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn9ct\" (UniqueName: \"kubernetes.io/projected/83601880-99db-414c-a8c9-08211fd9e54d-kube-api-access-sn9ct\") pod \"node-exporter-6zp7j\" (UID: \"83601880-99db-414c-a8c9-08211fd9e54d\") " pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.378991 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6cw8\" (UniqueName: \"kubernetes.io/projected/1560ebec-22c0-4ea7-81f9-3fd7c9b9629d-kube-api-access-j6cw8\") pod \"kube-state-metrics-777cb5bd5d-xh2w4\" (UID: \"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.433137 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-hqt75"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.498571 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.505828 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6zp7j"
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.872130 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-hqt75"]
Dec 02 18:21:19 crc kubenswrapper[4878]: W1202 18:21:19.879025 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod568128a9_6c26_450b_acf5_260cf1f9ec57.slice/crio-84df49cfd44be0d06bd06aba7c658e6ab3beb2a77d3988d55247ff26f040ea29 WatchSource:0}: Error finding container 84df49cfd44be0d06bd06aba7c658e6ab3beb2a77d3988d55247ff26f040ea29: Status 404 returned error can't find the container with id 84df49cfd44be0d06bd06aba7c658e6ab3beb2a77d3988d55247ff26f040ea29
Dec 02 18:21:19 crc kubenswrapper[4878]: I1202 18:21:19.978716 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4"]
Dec 02 18:21:19 crc kubenswrapper[4878]: W1202 18:21:19.982784 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1560ebec_22c0_4ea7_81f9_3fd7c9b9629d.slice/crio-437d6385509d528e3418ed06f34d6e20647a96a4a06fdb2605a35f53cea93815 WatchSource:0}: Error finding container 437d6385509d528e3418ed06f34d6e20647a96a4a06fdb2605a35f53cea93815: Status 404 returned error can't find the container with id 437d6385509d528e3418ed06f34d6e20647a96a4a06fdb2605a35f53cea93815
Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.142762 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" podUID="763bb008-97f2-4e90-965c-5a7537ff0a57" containerName="registry" containerID="cri-o://0603c161c7fddd3c29dd6a0837fcba0d259686aec3939faf5f457423b61fcd84" gracePeriod=30
Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.199624 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.202102 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.206450 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric"
Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.206623 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web"
Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.206712 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated"
Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.207674 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0"
Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.208183 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-5p9t9"
Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.209267 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls"
Dec 02 18:21:20 crc kubenswrapper[4878]: I1202
18:21:20.209512 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.209622 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.215945 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.217488 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.324788 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxxsc\" (UniqueName: \"kubernetes.io/projected/87e44891-f65f-4630-bd14-a6d663c91d71-kube-api-access-qxxsc\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.324848 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/87e44891-f65f-4630-bd14-a6d663c91d71-config-volume\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.324873 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/87e44891-f65f-4630-bd14-a6d663c91d71-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.324899 4878 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/87e44891-f65f-4630-bd14-a6d663c91d71-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.324929 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/87e44891-f65f-4630-bd14-a6d663c91d71-tls-assets\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.324963 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/87e44891-f65f-4630-bd14-a6d663c91d71-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.324998 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/87e44891-f65f-4630-bd14-a6d663c91d71-config-out\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.325026 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e44891-f65f-4630-bd14-a6d663c91d71-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.325050 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/87e44891-f65f-4630-bd14-a6d663c91d71-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.325083 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/87e44891-f65f-4630-bd14-a6d663c91d71-web-config\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.325105 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87e44891-f65f-4630-bd14-a6d663c91d71-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.325138 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/87e44891-f65f-4630-bd14-a6d663c91d71-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.383803 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4" 
event={"ID":"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d","Type":"ContainerStarted","Data":"437d6385509d528e3418ed06f34d6e20647a96a4a06fdb2605a35f53cea93815"} Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.388418 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-hqt75" event={"ID":"568128a9-6c26-450b-acf5-260cf1f9ec57","Type":"ContainerStarted","Data":"84df49cfd44be0d06bd06aba7c658e6ab3beb2a77d3988d55247ff26f040ea29"} Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.390457 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6zp7j" event={"ID":"83601880-99db-414c-a8c9-08211fd9e54d","Type":"ContainerStarted","Data":"fc001ccaf6e1b5a349b3343c722e00d9e69defa1cd3e05b4d5f1ecc19fd2a81b"} Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.426505 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/87e44891-f65f-4630-bd14-a6d663c91d71-tls-assets\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.426586 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/87e44891-f65f-4630-bd14-a6d663c91d71-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.426619 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/87e44891-f65f-4630-bd14-a6d663c91d71-config-out\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 
18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.426642 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e44891-f65f-4630-bd14-a6d663c91d71-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.426662 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/87e44891-f65f-4630-bd14-a6d663c91d71-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.426684 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/87e44891-f65f-4630-bd14-a6d663c91d71-web-config\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.426719 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87e44891-f65f-4630-bd14-a6d663c91d71-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.426757 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/87e44891-f65f-4630-bd14-a6d663c91d71-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc 
kubenswrapper[4878]: I1202 18:21:20.426790 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxxsc\" (UniqueName: \"kubernetes.io/projected/87e44891-f65f-4630-bd14-a6d663c91d71-kube-api-access-qxxsc\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.426815 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/87e44891-f65f-4630-bd14-a6d663c91d71-config-volume\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.426835 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/87e44891-f65f-4630-bd14-a6d663c91d71-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.426863 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/87e44891-f65f-4630-bd14-a6d663c91d71-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.430425 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87e44891-f65f-4630-bd14-a6d663c91d71-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 
crc kubenswrapper[4878]: I1202 18:21:20.430759 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/87e44891-f65f-4630-bd14-a6d663c91d71-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.431670 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e44891-f65f-4630-bd14-a6d663c91d71-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.436766 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/87e44891-f65f-4630-bd14-a6d663c91d71-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.436782 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/87e44891-f65f-4630-bd14-a6d663c91d71-config-volume\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.437041 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/87e44891-f65f-4630-bd14-a6d663c91d71-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 
18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.441166 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/87e44891-f65f-4630-bd14-a6d663c91d71-config-out\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.441928 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/87e44891-f65f-4630-bd14-a6d663c91d71-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.444627 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/87e44891-f65f-4630-bd14-a6d663c91d71-tls-assets\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.447325 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxxsc\" (UniqueName: \"kubernetes.io/projected/87e44891-f65f-4630-bd14-a6d663c91d71-kube-api-access-qxxsc\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.447995 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/87e44891-f65f-4630-bd14-a6d663c91d71-web-config\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.449952 4878 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/87e44891-f65f-4630-bd14-a6d663c91d71-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"87e44891-f65f-4630-bd14-a6d663c91d71\") " pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.519885 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 02 18:21:20 crc kubenswrapper[4878]: I1202 18:21:20.983504 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.204108 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5d55987f4-pmp97"] Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.206605 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.209496 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.209795 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-8sk5n" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.209955 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.209981 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.210120 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-6pu8ktc17d19v" Dec 02 18:21:21 crc kubenswrapper[4878]: 
I1202 18:21:21.210260 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.210276 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.219494 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5d55987f4-pmp97"] Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.342126 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea96e843-9776-4b36-84c5-d4b187c87720-metrics-client-ca\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.342407 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ea96e843-9776-4b36-84c5-d4b187c87720-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.342550 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ea96e843-9776-4b36-84c5-d4b187c87720-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.342649 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ea96e843-9776-4b36-84c5-d4b187c87720-secret-thanos-querier-tls\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.342734 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t7jr\" (UniqueName: \"kubernetes.io/projected/ea96e843-9776-4b36-84c5-d4b187c87720-kube-api-access-7t7jr\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.342839 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ea96e843-9776-4b36-84c5-d4b187c87720-secret-grpc-tls\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.342964 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ea96e843-9776-4b36-84c5-d4b187c87720-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.343141 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/ea96e843-9776-4b36-84c5-d4b187c87720-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.398316 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-hqt75" event={"ID":"568128a9-6c26-450b-acf5-260cf1f9ec57","Type":"ContainerStarted","Data":"dae0698d6f29f8a2db6351a27152dd4cd8133cdef5f29b9816fa42da15784b84"} Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.399606 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"87e44891-f65f-4630-bd14-a6d663c91d71","Type":"ContainerStarted","Data":"340052e48e5147fc87732ceadad13ee1a777bbc59fd8c7c78fd8bbdb0c87be64"} Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.401940 4878 generic.go:334] "Generic (PLEG): container finished" podID="763bb008-97f2-4e90-965c-5a7537ff0a57" containerID="0603c161c7fddd3c29dd6a0837fcba0d259686aec3939faf5f457423b61fcd84" exitCode=0 Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.401995 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" event={"ID":"763bb008-97f2-4e90-965c-5a7537ff0a57","Type":"ContainerDied","Data":"0603c161c7fddd3c29dd6a0837fcba0d259686aec3939faf5f457423b61fcd84"} Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.443866 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea96e843-9776-4b36-84c5-d4b187c87720-metrics-client-ca\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.443928 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ea96e843-9776-4b36-84c5-d4b187c87720-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.443976 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ea96e843-9776-4b36-84c5-d4b187c87720-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.444003 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ea96e843-9776-4b36-84c5-d4b187c87720-secret-thanos-querier-tls\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.444023 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t7jr\" (UniqueName: \"kubernetes.io/projected/ea96e843-9776-4b36-84c5-d4b187c87720-kube-api-access-7t7jr\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.444052 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ea96e843-9776-4b36-84c5-d4b187c87720-secret-grpc-tls\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") 
" pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.444073 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ea96e843-9776-4b36-84c5-d4b187c87720-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.444096 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ea96e843-9776-4b36-84c5-d4b187c87720-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.445118 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea96e843-9776-4b36-84c5-d4b187c87720-metrics-client-ca\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.454166 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ea96e843-9776-4b36-84c5-d4b187c87720-secret-thanos-querier-tls\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.454167 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/ea96e843-9776-4b36-84c5-d4b187c87720-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.454929 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ea96e843-9776-4b36-84c5-d4b187c87720-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.455259 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ea96e843-9776-4b36-84c5-d4b187c87720-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.455585 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ea96e843-9776-4b36-84c5-d4b187c87720-secret-grpc-tls\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.458269 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ea96e843-9776-4b36-84c5-d4b187c87720-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 
18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.481341 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t7jr\" (UniqueName: \"kubernetes.io/projected/ea96e843-9776-4b36-84c5-d4b187c87720-kube-api-access-7t7jr\") pod \"thanos-querier-5d55987f4-pmp97\" (UID: \"ea96e843-9776-4b36-84c5-d4b187c87720\") " pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.527953 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:21 crc kubenswrapper[4878]: I1202 18:21:21.809588 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5d55987f4-pmp97"] Dec 02 18:21:21 crc kubenswrapper[4878]: W1202 18:21:21.819175 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea96e843_9776_4b36_84c5_d4b187c87720.slice/crio-c1503afb1cb0c905867ec76d1be72b0834267e409283ae4f707792261d865994 WatchSource:0}: Error finding container c1503afb1cb0c905867ec76d1be72b0834267e409283ae4f707792261d865994: Status 404 returned error can't find the container with id c1503afb1cb0c905867ec76d1be72b0834267e409283ae4f707792261d865994 Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.149965 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.256759 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/763bb008-97f2-4e90-965c-5a7537ff0a57-installation-pull-secrets\") pod \"763bb008-97f2-4e90-965c-5a7537ff0a57\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.256815 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgcqx\" (UniqueName: \"kubernetes.io/projected/763bb008-97f2-4e90-965c-5a7537ff0a57-kube-api-access-jgcqx\") pod \"763bb008-97f2-4e90-965c-5a7537ff0a57\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.256956 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"763bb008-97f2-4e90-965c-5a7537ff0a57\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.256991 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/763bb008-97f2-4e90-965c-5a7537ff0a57-trusted-ca\") pod \"763bb008-97f2-4e90-965c-5a7537ff0a57\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.257050 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/763bb008-97f2-4e90-965c-5a7537ff0a57-registry-certificates\") pod \"763bb008-97f2-4e90-965c-5a7537ff0a57\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.257087 4878 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/763bb008-97f2-4e90-965c-5a7537ff0a57-ca-trust-extracted\") pod \"763bb008-97f2-4e90-965c-5a7537ff0a57\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.257121 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/763bb008-97f2-4e90-965c-5a7537ff0a57-registry-tls\") pod \"763bb008-97f2-4e90-965c-5a7537ff0a57\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.257149 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/763bb008-97f2-4e90-965c-5a7537ff0a57-bound-sa-token\") pod \"763bb008-97f2-4e90-965c-5a7537ff0a57\" (UID: \"763bb008-97f2-4e90-965c-5a7537ff0a57\") " Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.258884 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/763bb008-97f2-4e90-965c-5a7537ff0a57-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "763bb008-97f2-4e90-965c-5a7537ff0a57" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.259018 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/763bb008-97f2-4e90-965c-5a7537ff0a57-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "763bb008-97f2-4e90-965c-5a7537ff0a57" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.263380 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/763bb008-97f2-4e90-965c-5a7537ff0a57-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "763bb008-97f2-4e90-965c-5a7537ff0a57" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.264464 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/763bb008-97f2-4e90-965c-5a7537ff0a57-kube-api-access-jgcqx" (OuterVolumeSpecName: "kube-api-access-jgcqx") pod "763bb008-97f2-4e90-965c-5a7537ff0a57" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57"). InnerVolumeSpecName "kube-api-access-jgcqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.273032 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/763bb008-97f2-4e90-965c-5a7537ff0a57-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "763bb008-97f2-4e90-965c-5a7537ff0a57" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.273206 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/763bb008-97f2-4e90-965c-5a7537ff0a57-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "763bb008-97f2-4e90-965c-5a7537ff0a57" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.293786 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/763bb008-97f2-4e90-965c-5a7537ff0a57-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "763bb008-97f2-4e90-965c-5a7537ff0a57" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.316871 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "763bb008-97f2-4e90-965c-5a7537ff0a57" (UID: "763bb008-97f2-4e90-965c-5a7537ff0a57"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.357706 4878 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/763bb008-97f2-4e90-965c-5a7537ff0a57-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.357753 4878 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/763bb008-97f2-4e90-965c-5a7537ff0a57-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.357762 4878 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/763bb008-97f2-4e90-965c-5a7537ff0a57-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.357772 4878 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/763bb008-97f2-4e90-965c-5a7537ff0a57-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.357784 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgcqx\" (UniqueName: \"kubernetes.io/projected/763bb008-97f2-4e90-965c-5a7537ff0a57-kube-api-access-jgcqx\") on node \"crc\" DevicePath \"\"" Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.357792 4878 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/763bb008-97f2-4e90-965c-5a7537ff0a57-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.357801 4878 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/763bb008-97f2-4e90-965c-5a7537ff0a57-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.411216 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-hqt75" event={"ID":"568128a9-6c26-450b-acf5-260cf1f9ec57","Type":"ContainerStarted","Data":"b8a6f143481b9a484be84f7b0b919114e50c5aae7468a9b4b2a585973277777f"} Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.413616 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" event={"ID":"763bb008-97f2-4e90-965c-5a7537ff0a57","Type":"ContainerDied","Data":"a85ac8421623cf1d1a1db5e7703967e79e571b64778d23258fe46e657d6e5913"} Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.413652 4878 scope.go:117] "RemoveContainer" containerID="0603c161c7fddd3c29dd6a0837fcba0d259686aec3939faf5f457423b61fcd84" Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.413708 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6hxvm" Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.414597 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" event={"ID":"ea96e843-9776-4b36-84c5-d4b187c87720","Type":"ContainerStarted","Data":"c1503afb1cb0c905867ec76d1be72b0834267e409283ae4f707792261d865994"} Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.455191 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6hxvm"] Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.459102 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6hxvm"] Dec 02 18:21:22 crc kubenswrapper[4878]: I1202 18:21:22.947273 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="763bb008-97f2-4e90-965c-5a7537ff0a57" path="/var/lib/kubelet/pods/763bb008-97f2-4e90-965c-5a7537ff0a57/volumes" Dec 02 18:21:23 crc kubenswrapper[4878]: I1202 18:21:23.742403 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:21:23 crc kubenswrapper[4878]: I1202 18:21:23.742515 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:21:23 crc kubenswrapper[4878]: I1202 18:21:23.924028 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-785bd4864d-2j5ml"] Dec 02 18:21:23 crc kubenswrapper[4878]: E1202 
18:21:23.924331 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763bb008-97f2-4e90-965c-5a7537ff0a57" containerName="registry" Dec 02 18:21:23 crc kubenswrapper[4878]: I1202 18:21:23.924348 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="763bb008-97f2-4e90-965c-5a7537ff0a57" containerName="registry" Dec 02 18:21:23 crc kubenswrapper[4878]: I1202 18:21:23.924465 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="763bb008-97f2-4e90-965c-5a7537ff0a57" containerName="registry" Dec 02 18:21:23 crc kubenswrapper[4878]: I1202 18:21:23.924941 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:23 crc kubenswrapper[4878]: I1202 18:21:23.956092 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-785bd4864d-2j5ml"] Dec 02 18:21:23 crc kubenswrapper[4878]: I1202 18:21:23.982443 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-trusted-ca-bundle\") pod \"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:23 crc kubenswrapper[4878]: I1202 18:21:23.982508 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-console-oauth-config\") pod \"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:23 crc kubenswrapper[4878]: I1202 18:21:23.982552 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-service-ca\") pod 
\"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:23 crc kubenswrapper[4878]: I1202 18:21:23.982597 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-console-config\") pod \"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:23 crc kubenswrapper[4878]: I1202 18:21:23.982619 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqklr\" (UniqueName: \"kubernetes.io/projected/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-kube-api-access-pqklr\") pod \"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:23 crc kubenswrapper[4878]: I1202 18:21:23.982645 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-console-serving-cert\") pod \"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:23 crc kubenswrapper[4878]: I1202 18:21:23.982673 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-oauth-serving-cert\") pod \"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.084399 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-trusted-ca-bundle\") pod \"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.084471 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-console-oauth-config\") pod \"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.084516 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-service-ca\") pod \"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.084556 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqklr\" (UniqueName: \"kubernetes.io/projected/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-kube-api-access-pqklr\") pod \"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.084582 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-console-config\") pod \"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.084615 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-console-serving-cert\") pod \"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.084645 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-oauth-serving-cert\") pod \"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.085561 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-service-ca\") pod \"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.085677 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-oauth-serving-cert\") pod \"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.085847 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-console-config\") pod \"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.086316 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-trusted-ca-bundle\") pod \"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.094858 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-console-oauth-config\") pod \"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.094858 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-console-serving-cert\") pod \"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.108069 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqklr\" (UniqueName: \"kubernetes.io/projected/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-kube-api-access-pqklr\") pod \"console-785bd4864d-2j5ml\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.253834 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.434366 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6b9b6b68f7-hzk42"] Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.435482 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.440538 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-tlcqc" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.440845 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.441030 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-4h55t2p9q2f66" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.441294 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.441456 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.441586 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.464644 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6b9b6b68f7-hzk42"] Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.495853 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/396a8ffe-50ce-46a4-8329-4dbb0e88093e-secret-metrics-client-certs\") pod \"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.496441 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" 
(UniqueName: \"kubernetes.io/secret/396a8ffe-50ce-46a4-8329-4dbb0e88093e-secret-metrics-server-tls\") pod \"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.496479 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/396a8ffe-50ce-46a4-8329-4dbb0e88093e-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.496529 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qbrl\" (UniqueName: \"kubernetes.io/projected/396a8ffe-50ce-46a4-8329-4dbb0e88093e-kube-api-access-5qbrl\") pod \"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.496566 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396a8ffe-50ce-46a4-8329-4dbb0e88093e-client-ca-bundle\") pod \"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.496596 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/396a8ffe-50ce-46a4-8329-4dbb0e88093e-metrics-server-audit-profiles\") pod \"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " 
pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.496622 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/396a8ffe-50ce-46a4-8329-4dbb0e88093e-audit-log\") pod \"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.598149 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qbrl\" (UniqueName: \"kubernetes.io/projected/396a8ffe-50ce-46a4-8329-4dbb0e88093e-kube-api-access-5qbrl\") pod \"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.598259 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396a8ffe-50ce-46a4-8329-4dbb0e88093e-client-ca-bundle\") pod \"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.598301 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/396a8ffe-50ce-46a4-8329-4dbb0e88093e-metrics-server-audit-profiles\") pod \"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.598337 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/396a8ffe-50ce-46a4-8329-4dbb0e88093e-audit-log\") pod 
\"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.598373 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/396a8ffe-50ce-46a4-8329-4dbb0e88093e-secret-metrics-client-certs\") pod \"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.598446 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/396a8ffe-50ce-46a4-8329-4dbb0e88093e-secret-metrics-server-tls\") pod \"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.598464 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/396a8ffe-50ce-46a4-8329-4dbb0e88093e-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.598982 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/396a8ffe-50ce-46a4-8329-4dbb0e88093e-audit-log\") pod \"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.599394 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/396a8ffe-50ce-46a4-8329-4dbb0e88093e-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.600202 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/396a8ffe-50ce-46a4-8329-4dbb0e88093e-metrics-server-audit-profiles\") pod \"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.603219 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/396a8ffe-50ce-46a4-8329-4dbb0e88093e-secret-metrics-server-tls\") pod \"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.603900 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396a8ffe-50ce-46a4-8329-4dbb0e88093e-client-ca-bundle\") pod \"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.604942 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/396a8ffe-50ce-46a4-8329-4dbb0e88093e-secret-metrics-client-certs\") pod \"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc 
kubenswrapper[4878]: I1202 18:21:24.622543 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qbrl\" (UniqueName: \"kubernetes.io/projected/396a8ffe-50ce-46a4-8329-4dbb0e88093e-kube-api-access-5qbrl\") pod \"metrics-server-6b9b6b68f7-hzk42\" (UID: \"396a8ffe-50ce-46a4-8329-4dbb0e88093e\") " pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.760827 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.887367 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7bdcc8ddd8-rlsrh"] Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.888354 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7bdcc8ddd8-rlsrh" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.895873 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.896774 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.899432 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7bdcc8ddd8-rlsrh"] Dec 02 18:21:24 crc kubenswrapper[4878]: I1202 18:21:24.903531 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b1f4076b-29da-4a52-ba24-135ccb4acf7c-monitoring-plugin-cert\") pod \"monitoring-plugin-7bdcc8ddd8-rlsrh\" (UID: \"b1f4076b-29da-4a52-ba24-135ccb4acf7c\") " pod="openshift-monitoring/monitoring-plugin-7bdcc8ddd8-rlsrh" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 
18:21:25.004589 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b1f4076b-29da-4a52-ba24-135ccb4acf7c-monitoring-plugin-cert\") pod \"monitoring-plugin-7bdcc8ddd8-rlsrh\" (UID: \"b1f4076b-29da-4a52-ba24-135ccb4acf7c\") " pod="openshift-monitoring/monitoring-plugin-7bdcc8ddd8-rlsrh" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.009443 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b1f4076b-29da-4a52-ba24-135ccb4acf7c-monitoring-plugin-cert\") pod \"monitoring-plugin-7bdcc8ddd8-rlsrh\" (UID: \"b1f4076b-29da-4a52-ba24-135ccb4acf7c\") " pod="openshift-monitoring/monitoring-plugin-7bdcc8ddd8-rlsrh" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.218961 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7bdcc8ddd8-rlsrh" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.491547 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.494491 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.498848 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.498879 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.508196 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.546189 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-3s8nlc7d0fnjt" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.546411 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.546489 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.546411 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-bqj6r" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.546628 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.546683 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.547181 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.547199 4878 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.547349 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.548948 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.549785 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.615697 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2226c71-b37b-4056-adef-f28b37d837ad-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.615762 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2226c71-b37b-4056-adef-f28b37d837ad-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.615790 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 
18:21:25.615809 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.615828 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.615844 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-config\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.615864 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.615880 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a2226c71-b37b-4056-adef-f28b37d837ad-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc 
kubenswrapper[4878]: I1202 18:21:25.615901 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2226c71-b37b-4056-adef-f28b37d837ad-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.615926 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a2226c71-b37b-4056-adef-f28b37d837ad-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.615946 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.615971 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.615999 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a2226c71-b37b-4056-adef-f28b37d837ad-config-out\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " 
pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.616024 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-web-config\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.616045 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.616072 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a2226c71-b37b-4056-adef-f28b37d837ad-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.616089 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2226c71-b37b-4056-adef-f28b37d837ad-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.616109 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpndn\" (UniqueName: \"kubernetes.io/projected/a2226c71-b37b-4056-adef-f28b37d837ad-kube-api-access-tpndn\") pod 
\"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.717417 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a2226c71-b37b-4056-adef-f28b37d837ad-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.717520 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2226c71-b37b-4056-adef-f28b37d837ad-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.717554 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a2226c71-b37b-4056-adef-f28b37d837ad-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.717574 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.717617 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.717650 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a2226c71-b37b-4056-adef-f28b37d837ad-config-out\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.717689 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-web-config\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.717710 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.717733 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a2226c71-b37b-4056-adef-f28b37d837ad-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.717753 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2226c71-b37b-4056-adef-f28b37d837ad-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " 
pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.717773 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpndn\" (UniqueName: \"kubernetes.io/projected/a2226c71-b37b-4056-adef-f28b37d837ad-kube-api-access-tpndn\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.717799 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2226c71-b37b-4056-adef-f28b37d837ad-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.717836 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2226c71-b37b-4056-adef-f28b37d837ad-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.717862 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.717920 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-secret-prometheus-k8s-kube-rbac-proxy-web\") pod 
\"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.717947 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.717968 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-config\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.717995 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.718831 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a2226c71-b37b-4056-adef-f28b37d837ad-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.719413 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2226c71-b37b-4056-adef-f28b37d837ad-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " 
pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.719704 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2226c71-b37b-4056-adef-f28b37d837ad-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.720521 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2226c71-b37b-4056-adef-f28b37d837ad-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.721863 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2226c71-b37b-4056-adef-f28b37d837ad-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.723175 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.723465 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " 
pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.723528 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.724521 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a2226c71-b37b-4056-adef-f28b37d837ad-config-out\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.724557 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-web-config\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.724755 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a2226c71-b37b-4056-adef-f28b37d837ad-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.724941 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 
18:21:25.725342 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.725731 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.726664 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-config\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.731405 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a2226c71-b37b-4056-adef-f28b37d837ad-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.733432 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a2226c71-b37b-4056-adef-f28b37d837ad-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.741657 4878 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tpndn\" (UniqueName: \"kubernetes.io/projected/a2226c71-b37b-4056-adef-f28b37d837ad-kube-api-access-tpndn\") pod \"prometheus-k8s-0\" (UID: \"a2226c71-b37b-4056-adef-f28b37d837ad\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:25 crc kubenswrapper[4878]: I1202 18:21:25.864810 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:27 crc kubenswrapper[4878]: I1202 18:21:27.095609 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-785bd4864d-2j5ml"] Dec 02 18:21:27 crc kubenswrapper[4878]: I1202 18:21:27.203580 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6b9b6b68f7-hzk42"] Dec 02 18:21:27 crc kubenswrapper[4878]: I1202 18:21:27.302804 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 02 18:21:27 crc kubenswrapper[4878]: I1202 18:21:27.306736 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7bdcc8ddd8-rlsrh"] Dec 02 18:21:27 crc kubenswrapper[4878]: I1202 18:21:27.478917 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" event={"ID":"396a8ffe-50ce-46a4-8329-4dbb0e88093e","Type":"ContainerStarted","Data":"c059965ac44de848bba3fd73fa5e7a8ba50fd950596211d942e07f2e550aee80"} Dec 02 18:21:27 crc kubenswrapper[4878]: I1202 18:21:27.481443 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4" event={"ID":"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d","Type":"ContainerStarted","Data":"7124ccc6955b3699d05279f3562098b2b6c91f1ad37bc266d90f503b80631625"} Dec 02 18:21:27 crc kubenswrapper[4878]: I1202 18:21:27.481505 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4" event={"ID":"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d","Type":"ContainerStarted","Data":"ff8f0accbecfa610d2bacad9cad2769d84b3d42a981bc5d10c4b122892cedaa8"} Dec 02 18:21:27 crc kubenswrapper[4878]: I1202 18:21:27.484255 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-785bd4864d-2j5ml" event={"ID":"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8","Type":"ContainerStarted","Data":"070e04ca7cb0f9c81cc9b91a7b85d1b3430cc3884ff73ffb28edc63d2046952d"} Dec 02 18:21:27 crc kubenswrapper[4878]: I1202 18:21:27.484333 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-785bd4864d-2j5ml" event={"ID":"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8","Type":"ContainerStarted","Data":"8e3de93f300b344c1499d3f48e771b12ea869fbe781a8b3826f25742be144c11"} Dec 02 18:21:27 crc kubenswrapper[4878]: I1202 18:21:27.485538 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7bdcc8ddd8-rlsrh" event={"ID":"b1f4076b-29da-4a52-ba24-135ccb4acf7c","Type":"ContainerStarted","Data":"c515031de62a9229ab61611e2bbfb7dfca13bca8643293fa855b1e670bf5385c"} Dec 02 18:21:27 crc kubenswrapper[4878]: I1202 18:21:27.487481 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a2226c71-b37b-4056-adef-f28b37d837ad","Type":"ContainerStarted","Data":"122c2d1169954dd1adc80fb0d82d1d0c75a72ad535f114876670e29bebe5cfcc"} Dec 02 18:21:27 crc kubenswrapper[4878]: I1202 18:21:27.489496 4878 generic.go:334] "Generic (PLEG): container finished" podID="87e44891-f65f-4630-bd14-a6d663c91d71" containerID="7264946cb36098d7fdbaa3632f1460742fbb1667dcf74e98763bbbe6b4a0ce05" exitCode=0 Dec 02 18:21:27 crc kubenswrapper[4878]: I1202 18:21:27.489575 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"87e44891-f65f-4630-bd14-a6d663c91d71","Type":"ContainerDied","Data":"7264946cb36098d7fdbaa3632f1460742fbb1667dcf74e98763bbbe6b4a0ce05"} Dec 02 18:21:27 crc kubenswrapper[4878]: I1202 18:21:27.496032 4878 generic.go:334] "Generic (PLEG): container finished" podID="83601880-99db-414c-a8c9-08211fd9e54d" containerID="aa0fca1f09396208ba3f9e90f1fe8d9c6fa95e89ae64ba5055691bc0ef9d831f" exitCode=0 Dec 02 18:21:27 crc kubenswrapper[4878]: I1202 18:21:27.496123 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6zp7j" event={"ID":"83601880-99db-414c-a8c9-08211fd9e54d","Type":"ContainerDied","Data":"aa0fca1f09396208ba3f9e90f1fe8d9c6fa95e89ae64ba5055691bc0ef9d831f"} Dec 02 18:21:27 crc kubenswrapper[4878]: I1202 18:21:27.499151 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" event={"ID":"ea96e843-9776-4b36-84c5-d4b187c87720","Type":"ContainerStarted","Data":"10692640229c7221e74b13b69ac8f5d2406ffafc3c782ca5058baf7ef069adf8"} Dec 02 18:21:27 crc kubenswrapper[4878]: I1202 18:21:27.499202 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" event={"ID":"ea96e843-9776-4b36-84c5-d4b187c87720","Type":"ContainerStarted","Data":"98aadd8d201a4a647d92440f42020ba2b8138d23bae7afd1dd9588815424995f"} Dec 02 18:21:27 crc kubenswrapper[4878]: I1202 18:21:27.501816 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-hqt75" event={"ID":"568128a9-6c26-450b-acf5-260cf1f9ec57","Type":"ContainerStarted","Data":"d2a09b9e0caff8599927baccc2e071a3a2e403d62a38e3d469c7a3e9e7023139"} Dec 02 18:21:27 crc kubenswrapper[4878]: I1202 18:21:27.562101 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-hqt75" podStartSLOduration=3.705254154 podStartE2EDuration="8.562077052s" 
podCreationTimestamp="2025-12-02 18:21:19 +0000 UTC" firstStartedPulling="2025-12-02 18:21:21.915122526 +0000 UTC m=+391.604741417" lastFinishedPulling="2025-12-02 18:21:26.771945434 +0000 UTC m=+396.461564315" observedRunningTime="2025-12-02 18:21:27.561848785 +0000 UTC m=+397.251467846" watchObservedRunningTime="2025-12-02 18:21:27.562077052 +0000 UTC m=+397.251695933" Dec 02 18:21:28 crc kubenswrapper[4878]: I1202 18:21:28.515146 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4" event={"ID":"1560ebec-22c0-4ea7-81f9-3fd7c9b9629d","Type":"ContainerStarted","Data":"cd7d817788cb125d8e88c1d644668370f6048c45eddd621bae25fcc90dfa59d9"} Dec 02 18:21:28 crc kubenswrapper[4878]: I1202 18:21:28.523982 4878 generic.go:334] "Generic (PLEG): container finished" podID="a2226c71-b37b-4056-adef-f28b37d837ad" containerID="fd3068ef30d88584638061ac11274b0cbb8840b51a395b4f0c81845cc13fb6be" exitCode=0 Dec 02 18:21:28 crc kubenswrapper[4878]: I1202 18:21:28.524071 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a2226c71-b37b-4056-adef-f28b37d837ad","Type":"ContainerDied","Data":"fd3068ef30d88584638061ac11274b0cbb8840b51a395b4f0c81845cc13fb6be"} Dec 02 18:21:28 crc kubenswrapper[4878]: I1202 18:21:28.529051 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" event={"ID":"ea96e843-9776-4b36-84c5-d4b187c87720","Type":"ContainerStarted","Data":"e448a55a971ed523b3f8068129db105cc41399b67824abca8f83655846b0c37d"} Dec 02 18:21:28 crc kubenswrapper[4878]: I1202 18:21:28.532871 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6zp7j" event={"ID":"83601880-99db-414c-a8c9-08211fd9e54d","Type":"ContainerStarted","Data":"2b3ab398069d74964106941a7dab7be2d9d16f8ec22205f173c5ee47ccec9154"} Dec 02 18:21:28 crc kubenswrapper[4878]: I1202 18:21:28.532919 4878 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6zp7j" event={"ID":"83601880-99db-414c-a8c9-08211fd9e54d","Type":"ContainerStarted","Data":"237c2859f6b2571f3ef135ee899471d795e65452b81b5054da08887db7b4772c"} Dec 02 18:21:28 crc kubenswrapper[4878]: I1202 18:21:28.547190 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xh2w4" podStartSLOduration=2.866775634 podStartE2EDuration="9.547135246s" podCreationTimestamp="2025-12-02 18:21:19 +0000 UTC" firstStartedPulling="2025-12-02 18:21:19.985418509 +0000 UTC m=+389.675037400" lastFinishedPulling="2025-12-02 18:21:26.665778091 +0000 UTC m=+396.355397012" observedRunningTime="2025-12-02 18:21:28.53609584 +0000 UTC m=+398.225714741" watchObservedRunningTime="2025-12-02 18:21:28.547135246 +0000 UTC m=+398.236754127" Dec 02 18:21:28 crc kubenswrapper[4878]: I1202 18:21:28.626004 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-785bd4864d-2j5ml" podStartSLOduration=5.625971818 podStartE2EDuration="5.625971818s" podCreationTimestamp="2025-12-02 18:21:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:21:28.623555289 +0000 UTC m=+398.313174180" watchObservedRunningTime="2025-12-02 18:21:28.625971818 +0000 UTC m=+398.315590709" Dec 02 18:21:28 crc kubenswrapper[4878]: I1202 18:21:28.626860 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-6zp7j" podStartSLOduration=2.525540123 podStartE2EDuration="9.626852235s" podCreationTimestamp="2025-12-02 18:21:19 +0000 UTC" firstStartedPulling="2025-12-02 18:21:19.53441338 +0000 UTC m=+389.224032261" lastFinishedPulling="2025-12-02 18:21:26.635725492 +0000 UTC m=+396.325344373" observedRunningTime="2025-12-02 18:21:28.6027915 +0000 UTC m=+398.292410381" 
watchObservedRunningTime="2025-12-02 18:21:28.626852235 +0000 UTC m=+398.316471116" Dec 02 18:21:31 crc kubenswrapper[4878]: I1202 18:21:31.559392 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7bdcc8ddd8-rlsrh" event={"ID":"b1f4076b-29da-4a52-ba24-135ccb4acf7c","Type":"ContainerStarted","Data":"8c36e1802f47ba64ceecd0d16df77a919c56b1d4eff0ce28bb016e0005ce9332"} Dec 02 18:21:31 crc kubenswrapper[4878]: I1202 18:21:31.560435 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-7bdcc8ddd8-rlsrh" Dec 02 18:21:31 crc kubenswrapper[4878]: I1202 18:21:31.567555 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" event={"ID":"ea96e843-9776-4b36-84c5-d4b187c87720","Type":"ContainerStarted","Data":"d066c23940977a83f464c32048a52dbb0fca267e6aeb5d1babc07835142ba978"} Dec 02 18:21:31 crc kubenswrapper[4878]: I1202 18:21:31.567617 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" event={"ID":"ea96e843-9776-4b36-84c5-d4b187c87720","Type":"ContainerStarted","Data":"6a5766c6c61bab44758f17dd954e6df5a8eef50955dafeed2d9b33eb20394b99"} Dec 02 18:21:31 crc kubenswrapper[4878]: I1202 18:21:31.567638 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" event={"ID":"ea96e843-9776-4b36-84c5-d4b187c87720","Type":"ContainerStarted","Data":"440aeb34c439556b166b0a006c57ffb18b094e08c11e9668daa7a333ce654f78"} Dec 02 18:21:31 crc kubenswrapper[4878]: I1202 18:21:31.567747 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:31 crc kubenswrapper[4878]: I1202 18:21:31.579704 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"87e44891-f65f-4630-bd14-a6d663c91d71","Type":"ContainerStarted","Data":"0f25e0b40880fdba7827bb02bd830d9451e1640241dd5e4ccd0917a74c5ef87d"} Dec 02 18:21:31 crc kubenswrapper[4878]: I1202 18:21:31.579989 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"87e44891-f65f-4630-bd14-a6d663c91d71","Type":"ContainerStarted","Data":"8841b3d7f475c88c2847a19921cadcafff543765d59c4f4eb6d054f069f4be55"} Dec 02 18:21:31 crc kubenswrapper[4878]: I1202 18:21:31.580021 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"87e44891-f65f-4630-bd14-a6d663c91d71","Type":"ContainerStarted","Data":"bb36ca176df49346e62d429df748c529d54659d7489b933e566c80679efc47fc"} Dec 02 18:21:31 crc kubenswrapper[4878]: I1202 18:21:31.580581 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7bdcc8ddd8-rlsrh" Dec 02 18:21:31 crc kubenswrapper[4878]: I1202 18:21:31.583082 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7bdcc8ddd8-rlsrh" podStartSLOduration=4.082583556 podStartE2EDuration="7.583014237s" podCreationTimestamp="2025-12-02 18:21:24 +0000 UTC" firstStartedPulling="2025-12-02 18:21:27.318374564 +0000 UTC m=+397.007993445" lastFinishedPulling="2025-12-02 18:21:30.818805235 +0000 UTC m=+400.508424126" observedRunningTime="2025-12-02 18:21:31.576924131 +0000 UTC m=+401.266543022" watchObservedRunningTime="2025-12-02 18:21:31.583014237 +0000 UTC m=+401.272633118" Dec 02 18:21:31 crc kubenswrapper[4878]: I1202 18:21:31.583565 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" event={"ID":"396a8ffe-50ce-46a4-8329-4dbb0e88093e","Type":"ContainerStarted","Data":"3684c81410b9b5d6d66a9f539ce77a4b2b23d5e49fa8bdb5b5c3ea73091b4175"} Dec 02 18:21:31 crc kubenswrapper[4878]: I1202 
18:21:31.609379 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" podStartSLOduration=1.620283004 podStartE2EDuration="10.609351657s" podCreationTimestamp="2025-12-02 18:21:21 +0000 UTC" firstStartedPulling="2025-12-02 18:21:21.822939164 +0000 UTC m=+391.512558045" lastFinishedPulling="2025-12-02 18:21:30.812007817 +0000 UTC m=+400.501626698" observedRunningTime="2025-12-02 18:21:31.60170558 +0000 UTC m=+401.291324461" watchObservedRunningTime="2025-12-02 18:21:31.609351657 +0000 UTC m=+401.298970528" Dec 02 18:21:31 crc kubenswrapper[4878]: I1202 18:21:31.623394 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" podStartSLOduration=4.079190297 podStartE2EDuration="7.623365039s" podCreationTimestamp="2025-12-02 18:21:24 +0000 UTC" firstStartedPulling="2025-12-02 18:21:27.289334397 +0000 UTC m=+396.978953278" lastFinishedPulling="2025-12-02 18:21:30.833509139 +0000 UTC m=+400.523128020" observedRunningTime="2025-12-02 18:21:31.620562108 +0000 UTC m=+401.310181019" watchObservedRunningTime="2025-12-02 18:21:31.623365039 +0000 UTC m=+401.312983920" Dec 02 18:21:32 crc kubenswrapper[4878]: I1202 18:21:32.608936 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" Dec 02 18:21:33 crc kubenswrapper[4878]: I1202 18:21:33.609783 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a2226c71-b37b-4056-adef-f28b37d837ad","Type":"ContainerStarted","Data":"449e5f98cd5fcb0378cbe3c36167ac1f3c4d72e69c5c9ded1babe34fcba079a1"} Dec 02 18:21:33 crc kubenswrapper[4878]: I1202 18:21:33.609831 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"a2226c71-b37b-4056-adef-f28b37d837ad","Type":"ContainerStarted","Data":"a12bcc7f930ef221b0b6cab69a8b98be60878ddd4281cece04a05b5abc84514e"} Dec 02 18:21:33 crc kubenswrapper[4878]: I1202 18:21:33.609841 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a2226c71-b37b-4056-adef-f28b37d837ad","Type":"ContainerStarted","Data":"561b8276e265aa87d7cdc8b2bb3bd28bbb7ca563452d01196dcea50c7bf53590"} Dec 02 18:21:33 crc kubenswrapper[4878]: I1202 18:21:33.609850 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a2226c71-b37b-4056-adef-f28b37d837ad","Type":"ContainerStarted","Data":"bc9d1da05601e22d8bc69d8fccc8e650bd9fd91684b63495d06b74ad04740fb9"} Dec 02 18:21:33 crc kubenswrapper[4878]: I1202 18:21:33.609858 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a2226c71-b37b-4056-adef-f28b37d837ad","Type":"ContainerStarted","Data":"84e81c1582d0cdd284be6518a3604b33add3240dc178d772911bf1e63fb658f8"} Dec 02 18:21:33 crc kubenswrapper[4878]: I1202 18:21:33.609868 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a2226c71-b37b-4056-adef-f28b37d837ad","Type":"ContainerStarted","Data":"524d38f8e79468952a3447c6c13d4dd7bcd910d7580519596851e36ca7cf550d"} Dec 02 18:21:33 crc kubenswrapper[4878]: I1202 18:21:33.617557 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"87e44891-f65f-4630-bd14-a6d663c91d71","Type":"ContainerStarted","Data":"f854ffabbf265d5432bcc23c9a5a19879693a8fce842b3bd21f0b3ab63ef7e5c"} Dec 02 18:21:33 crc kubenswrapper[4878]: I1202 18:21:33.617666 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"87e44891-f65f-4630-bd14-a6d663c91d71","Type":"ContainerStarted","Data":"d7d68e7ee7e5cc210c986cf247ebce03870a1e646d92510532dec19c4d5a1f61"} Dec 02 18:21:33 crc kubenswrapper[4878]: I1202 18:21:33.617691 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"87e44891-f65f-4630-bd14-a6d663c91d71","Type":"ContainerStarted","Data":"ea653b6337ac69ec2555c1bcb2479173c9463a5d748b15ba84ecd42744c0dc0b"} Dec 02 18:21:33 crc kubenswrapper[4878]: I1202 18:21:33.652964 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.6764656989999995 podStartE2EDuration="8.65293318s" podCreationTimestamp="2025-12-02 18:21:25 +0000 UTC" firstStartedPulling="2025-12-02 18:21:28.526371376 +0000 UTC m=+398.215990267" lastFinishedPulling="2025-12-02 18:21:32.502838867 +0000 UTC m=+402.192457748" observedRunningTime="2025-12-02 18:21:33.649869151 +0000 UTC m=+403.339488072" watchObservedRunningTime="2025-12-02 18:21:33.65293318 +0000 UTC m=+403.342552101" Dec 02 18:21:33 crc kubenswrapper[4878]: I1202 18:21:33.696298 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.873418 podStartE2EDuration="13.696277188s" podCreationTimestamp="2025-12-02 18:21:20 +0000 UTC" firstStartedPulling="2025-12-02 18:21:20.993062254 +0000 UTC m=+390.682681175" lastFinishedPulling="2025-12-02 18:21:30.815921482 +0000 UTC m=+400.505540363" observedRunningTime="2025-12-02 18:21:33.690172581 +0000 UTC m=+403.379791472" watchObservedRunningTime="2025-12-02 18:21:33.696277188 +0000 UTC m=+403.385896079" Dec 02 18:21:34 crc kubenswrapper[4878]: I1202 18:21:34.254632 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:34 crc kubenswrapper[4878]: I1202 18:21:34.255039 4878 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:34 crc kubenswrapper[4878]: I1202 18:21:34.262264 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:34 crc kubenswrapper[4878]: I1202 18:21:34.633716 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:21:34 crc kubenswrapper[4878]: I1202 18:21:34.714210 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-w6mf2"] Dec 02 18:21:35 crc kubenswrapper[4878]: I1202 18:21:35.864942 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:21:44 crc kubenswrapper[4878]: I1202 18:21:44.761900 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:44 crc kubenswrapper[4878]: I1202 18:21:44.762595 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:21:53 crc kubenswrapper[4878]: I1202 18:21:53.742165 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:21:53 crc kubenswrapper[4878]: I1202 18:21:53.742976 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:21:53 crc kubenswrapper[4878]: I1202 18:21:53.743047 
4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:21:53 crc kubenswrapper[4878]: I1202 18:21:53.744027 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a12b6a48d5fa299bcb38ae1b9a61925e1420e83b36412a67e4692078c7172bd"} pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 18:21:53 crc kubenswrapper[4878]: I1202 18:21:53.744113 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://4a12b6a48d5fa299bcb38ae1b9a61925e1420e83b36412a67e4692078c7172bd" gracePeriod=600 Dec 02 18:21:54 crc kubenswrapper[4878]: I1202 18:21:54.799783 4878 generic.go:334] "Generic (PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" containerID="4a12b6a48d5fa299bcb38ae1b9a61925e1420e83b36412a67e4692078c7172bd" exitCode=0 Dec 02 18:21:54 crc kubenswrapper[4878]: I1202 18:21:54.800388 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"4a12b6a48d5fa299bcb38ae1b9a61925e1420e83b36412a67e4692078c7172bd"} Dec 02 18:21:54 crc kubenswrapper[4878]: I1202 18:21:54.800432 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"3a28b486a2e75984b9b969c8ed5539eb39300fac74b0f7f88830322d2c2039ac"} Dec 02 18:21:54 crc kubenswrapper[4878]: I1202 18:21:54.800454 4878 scope.go:117] "RemoveContainer" 
containerID="b7709f4d969504362d28f3f061837bbc41e477c91ac9e2a7abacbce612a1aa73" Dec 02 18:21:59 crc kubenswrapper[4878]: I1202 18:21:59.772926 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-w6mf2" podUID="94d6b1b1-ad1b-45c3-9947-92345aa1e5a2" containerName="console" containerID="cri-o://811666e579d2973eb8cb95e421cc6da55db33ad015216805f6d512c49a73efd7" gracePeriod=15 Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.245787 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-w6mf2_94d6b1b1-ad1b-45c3-9947-92345aa1e5a2/console/0.log" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.245878 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.382523 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-oauth-serving-cert\") pod \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.382672 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-console-oauth-config\") pod \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.382753 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-console-serving-cert\") pod \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 
18:22:00.382804 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-447rt\" (UniqueName: \"kubernetes.io/projected/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-kube-api-access-447rt\") pod \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.382863 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-service-ca\") pod \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.383001 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-trusted-ca-bundle\") pod \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.383072 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-console-config\") pod \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\" (UID: \"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2\") " Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.383329 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "94d6b1b1-ad1b-45c3-9947-92345aa1e5a2" (UID: "94d6b1b1-ad1b-45c3-9947-92345aa1e5a2"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.384031 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-service-ca" (OuterVolumeSpecName: "service-ca") pod "94d6b1b1-ad1b-45c3-9947-92345aa1e5a2" (UID: "94d6b1b1-ad1b-45c3-9947-92345aa1e5a2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.384135 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "94d6b1b1-ad1b-45c3-9947-92345aa1e5a2" (UID: "94d6b1b1-ad1b-45c3-9947-92345aa1e5a2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.384308 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-console-config" (OuterVolumeSpecName: "console-config") pod "94d6b1b1-ad1b-45c3-9947-92345aa1e5a2" (UID: "94d6b1b1-ad1b-45c3-9947-92345aa1e5a2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.393957 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-kube-api-access-447rt" (OuterVolumeSpecName: "kube-api-access-447rt") pod "94d6b1b1-ad1b-45c3-9947-92345aa1e5a2" (UID: "94d6b1b1-ad1b-45c3-9947-92345aa1e5a2"). InnerVolumeSpecName "kube-api-access-447rt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.394614 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "94d6b1b1-ad1b-45c3-9947-92345aa1e5a2" (UID: "94d6b1b1-ad1b-45c3-9947-92345aa1e5a2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.395295 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "94d6b1b1-ad1b-45c3-9947-92345aa1e5a2" (UID: "94d6b1b1-ad1b-45c3-9947-92345aa1e5a2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.485340 4878 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.485385 4878 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-console-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.485400 4878 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.485416 4878 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.485429 4878 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.485442 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-447rt\" (UniqueName: \"kubernetes.io/projected/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-kube-api-access-447rt\") on node \"crc\" DevicePath \"\"" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.485461 4878 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.855616 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-w6mf2_94d6b1b1-ad1b-45c3-9947-92345aa1e5a2/console/0.log" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.855696 4878 generic.go:334] "Generic (PLEG): container finished" podID="94d6b1b1-ad1b-45c3-9947-92345aa1e5a2" containerID="811666e579d2973eb8cb95e421cc6da55db33ad015216805f6d512c49a73efd7" exitCode=2 Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.855749 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-w6mf2" event={"ID":"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2","Type":"ContainerDied","Data":"811666e579d2973eb8cb95e421cc6da55db33ad015216805f6d512c49a73efd7"} Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.855788 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-w6mf2" 
event={"ID":"94d6b1b1-ad1b-45c3-9947-92345aa1e5a2","Type":"ContainerDied","Data":"634a727576962265b23fd0c54a5939f89f3444158a1a3b6b8cf693546076f0bb"} Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.855820 4878 scope.go:117] "RemoveContainer" containerID="811666e579d2973eb8cb95e421cc6da55db33ad015216805f6d512c49a73efd7" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.855850 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-w6mf2" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.879955 4878 scope.go:117] "RemoveContainer" containerID="811666e579d2973eb8cb95e421cc6da55db33ad015216805f6d512c49a73efd7" Dec 02 18:22:00 crc kubenswrapper[4878]: E1202 18:22:00.880725 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"811666e579d2973eb8cb95e421cc6da55db33ad015216805f6d512c49a73efd7\": container with ID starting with 811666e579d2973eb8cb95e421cc6da55db33ad015216805f6d512c49a73efd7 not found: ID does not exist" containerID="811666e579d2973eb8cb95e421cc6da55db33ad015216805f6d512c49a73efd7" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.880768 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"811666e579d2973eb8cb95e421cc6da55db33ad015216805f6d512c49a73efd7"} err="failed to get container status \"811666e579d2973eb8cb95e421cc6da55db33ad015216805f6d512c49a73efd7\": rpc error: code = NotFound desc = could not find container \"811666e579d2973eb8cb95e421cc6da55db33ad015216805f6d512c49a73efd7\": container with ID starting with 811666e579d2973eb8cb95e421cc6da55db33ad015216805f6d512c49a73efd7 not found: ID does not exist" Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.908485 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-w6mf2"] Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.916383 4878 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-w6mf2"] Dec 02 18:22:00 crc kubenswrapper[4878]: I1202 18:22:00.946778 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94d6b1b1-ad1b-45c3-9947-92345aa1e5a2" path="/var/lib/kubelet/pods/94d6b1b1-ad1b-45c3-9947-92345aa1e5a2/volumes" Dec 02 18:22:04 crc kubenswrapper[4878]: I1202 18:22:04.770402 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:22:04 crc kubenswrapper[4878]: I1202 18:22:04.779439 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6b9b6b68f7-hzk42" Dec 02 18:22:25 crc kubenswrapper[4878]: I1202 18:22:25.865723 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:22:25 crc kubenswrapper[4878]: I1202 18:22:25.907051 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:22:26 crc kubenswrapper[4878]: I1202 18:22:26.090932 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.138590 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7bc54f4949-8wvcm"] Dec 02 18:23:31 crc kubenswrapper[4878]: E1202 18:23:31.139949 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d6b1b1-ad1b-45c3-9947-92345aa1e5a2" containerName="console" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.139977 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d6b1b1-ad1b-45c3-9947-92345aa1e5a2" containerName="console" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.140292 4878 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="94d6b1b1-ad1b-45c3-9947-92345aa1e5a2" containerName="console" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.140951 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.147402 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bc54f4949-8wvcm"] Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.264282 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c205c22-515e-4834-a53d-30d85e34596f-console-serving-cert\") pod \"console-7bc54f4949-8wvcm\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.264352 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-service-ca\") pod \"console-7bc54f4949-8wvcm\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.265064 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmmjm\" (UniqueName: \"kubernetes.io/projected/8c205c22-515e-4834-a53d-30d85e34596f-kube-api-access-hmmjm\") pod \"console-7bc54f4949-8wvcm\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.265470 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-trusted-ca-bundle\") pod \"console-7bc54f4949-8wvcm\" (UID: 
\"8c205c22-515e-4834-a53d-30d85e34596f\") " pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.265591 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c205c22-515e-4834-a53d-30d85e34596f-console-oauth-config\") pod \"console-7bc54f4949-8wvcm\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.265669 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-oauth-serving-cert\") pod \"console-7bc54f4949-8wvcm\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.265745 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-console-config\") pod \"console-7bc54f4949-8wvcm\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.367405 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmmjm\" (UniqueName: \"kubernetes.io/projected/8c205c22-515e-4834-a53d-30d85e34596f-kube-api-access-hmmjm\") pod \"console-7bc54f4949-8wvcm\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.367484 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-trusted-ca-bundle\") 
pod \"console-7bc54f4949-8wvcm\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.367531 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c205c22-515e-4834-a53d-30d85e34596f-console-oauth-config\") pod \"console-7bc54f4949-8wvcm\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.367571 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-oauth-serving-cert\") pod \"console-7bc54f4949-8wvcm\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.367609 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-console-config\") pod \"console-7bc54f4949-8wvcm\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.367672 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c205c22-515e-4834-a53d-30d85e34596f-console-serving-cert\") pod \"console-7bc54f4949-8wvcm\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.367706 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-service-ca\") pod \"console-7bc54f4949-8wvcm\" 
(UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.369116 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-console-config\") pod \"console-7bc54f4949-8wvcm\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.369844 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-trusted-ca-bundle\") pod \"console-7bc54f4949-8wvcm\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.370090 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-service-ca\") pod \"console-7bc54f4949-8wvcm\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.370791 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-oauth-serving-cert\") pod \"console-7bc54f4949-8wvcm\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.376834 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c205c22-515e-4834-a53d-30d85e34596f-console-serving-cert\") pod \"console-7bc54f4949-8wvcm\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " 
pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.377139 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c205c22-515e-4834-a53d-30d85e34596f-console-oauth-config\") pod \"console-7bc54f4949-8wvcm\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.391520 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmmjm\" (UniqueName: \"kubernetes.io/projected/8c205c22-515e-4834-a53d-30d85e34596f-kube-api-access-hmmjm\") pod \"console-7bc54f4949-8wvcm\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.482170 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:31 crc kubenswrapper[4878]: I1202 18:23:31.750852 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bc54f4949-8wvcm"] Dec 02 18:23:32 crc kubenswrapper[4878]: I1202 18:23:32.592572 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bc54f4949-8wvcm" event={"ID":"8c205c22-515e-4834-a53d-30d85e34596f","Type":"ContainerStarted","Data":"5d3869b300499d0b783fe6d602afb1809425186dbc1ef3d076fcc6d577bc8340"} Dec 02 18:23:32 crc kubenswrapper[4878]: I1202 18:23:32.592659 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bc54f4949-8wvcm" event={"ID":"8c205c22-515e-4834-a53d-30d85e34596f","Type":"ContainerStarted","Data":"e80bbf64fbf2f477b456d938d5240d83c2bf6cd330a3436abcaaed3406b13923"} Dec 02 18:23:32 crc kubenswrapper[4878]: I1202 18:23:32.626476 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-7bc54f4949-8wvcm" podStartSLOduration=1.626448092 podStartE2EDuration="1.626448092s" podCreationTimestamp="2025-12-02 18:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:23:32.62447123 +0000 UTC m=+522.314090131" watchObservedRunningTime="2025-12-02 18:23:32.626448092 +0000 UTC m=+522.316067013" Dec 02 18:23:41 crc kubenswrapper[4878]: I1202 18:23:41.482657 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:41 crc kubenswrapper[4878]: I1202 18:23:41.483362 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:41 crc kubenswrapper[4878]: I1202 18:23:41.490493 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:41 crc kubenswrapper[4878]: I1202 18:23:41.674352 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:23:41 crc kubenswrapper[4878]: I1202 18:23:41.755619 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-785bd4864d-2j5ml"] Dec 02 18:24:06 crc kubenswrapper[4878]: I1202 18:24:06.831228 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-785bd4864d-2j5ml" podUID="4d1e5f9e-aab9-4c42-8f60-af83c51a67d8" containerName="console" containerID="cri-o://070e04ca7cb0f9c81cc9b91a7b85d1b3430cc3884ff73ffb28edc63d2046952d" gracePeriod=15 Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.191996 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-785bd4864d-2j5ml_4d1e5f9e-aab9-4c42-8f60-af83c51a67d8/console/0.log" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.192618 4878 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.310114 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-console-serving-cert\") pod \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.310222 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-trusted-ca-bundle\") pod \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.310301 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-console-oauth-config\") pod \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.310340 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqklr\" (UniqueName: \"kubernetes.io/projected/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-kube-api-access-pqklr\") pod \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.310654 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-oauth-serving-cert\") pod \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 
18:24:07.311605 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-service-ca\") pod \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.311726 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-console-config\") pod \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\" (UID: \"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8\") " Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.311613 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4d1e5f9e-aab9-4c42-8f60-af83c51a67d8" (UID: "4d1e5f9e-aab9-4c42-8f60-af83c51a67d8"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.311881 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4d1e5f9e-aab9-4c42-8f60-af83c51a67d8" (UID: "4d1e5f9e-aab9-4c42-8f60-af83c51a67d8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.312171 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-service-ca" (OuterVolumeSpecName: "service-ca") pod "4d1e5f9e-aab9-4c42-8f60-af83c51a67d8" (UID: "4d1e5f9e-aab9-4c42-8f60-af83c51a67d8"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.312429 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-console-config" (OuterVolumeSpecName: "console-config") pod "4d1e5f9e-aab9-4c42-8f60-af83c51a67d8" (UID: "4d1e5f9e-aab9-4c42-8f60-af83c51a67d8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.312677 4878 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.312729 4878 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.312744 4878 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-console-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.312805 4878 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.317346 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4d1e5f9e-aab9-4c42-8f60-af83c51a67d8" (UID: "4d1e5f9e-aab9-4c42-8f60-af83c51a67d8"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.317780 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-kube-api-access-pqklr" (OuterVolumeSpecName: "kube-api-access-pqklr") pod "4d1e5f9e-aab9-4c42-8f60-af83c51a67d8" (UID: "4d1e5f9e-aab9-4c42-8f60-af83c51a67d8"). InnerVolumeSpecName "kube-api-access-pqklr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.319347 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4d1e5f9e-aab9-4c42-8f60-af83c51a67d8" (UID: "4d1e5f9e-aab9-4c42-8f60-af83c51a67d8"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.413361 4878 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.413398 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqklr\" (UniqueName: \"kubernetes.io/projected/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-kube-api-access-pqklr\") on node \"crc\" DevicePath \"\"" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.413412 4878 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.869493 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-785bd4864d-2j5ml_4d1e5f9e-aab9-4c42-8f60-af83c51a67d8/console/0.log" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.869601 4878 generic.go:334] "Generic (PLEG): container finished" podID="4d1e5f9e-aab9-4c42-8f60-af83c51a67d8" containerID="070e04ca7cb0f9c81cc9b91a7b85d1b3430cc3884ff73ffb28edc63d2046952d" exitCode=2 Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.869674 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-785bd4864d-2j5ml" event={"ID":"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8","Type":"ContainerDied","Data":"070e04ca7cb0f9c81cc9b91a7b85d1b3430cc3884ff73ffb28edc63d2046952d"} Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.869715 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-785bd4864d-2j5ml" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.869727 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-785bd4864d-2j5ml" event={"ID":"4d1e5f9e-aab9-4c42-8f60-af83c51a67d8","Type":"ContainerDied","Data":"8e3de93f300b344c1499d3f48e771b12ea869fbe781a8b3826f25742be144c11"} Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.869768 4878 scope.go:117] "RemoveContainer" containerID="070e04ca7cb0f9c81cc9b91a7b85d1b3430cc3884ff73ffb28edc63d2046952d" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.921734 4878 scope.go:117] "RemoveContainer" containerID="070e04ca7cb0f9c81cc9b91a7b85d1b3430cc3884ff73ffb28edc63d2046952d" Dec 02 18:24:07 crc kubenswrapper[4878]: E1202 18:24:07.922674 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"070e04ca7cb0f9c81cc9b91a7b85d1b3430cc3884ff73ffb28edc63d2046952d\": container with ID starting with 070e04ca7cb0f9c81cc9b91a7b85d1b3430cc3884ff73ffb28edc63d2046952d not found: ID does not exist" 
containerID="070e04ca7cb0f9c81cc9b91a7b85d1b3430cc3884ff73ffb28edc63d2046952d" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.922750 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"070e04ca7cb0f9c81cc9b91a7b85d1b3430cc3884ff73ffb28edc63d2046952d"} err="failed to get container status \"070e04ca7cb0f9c81cc9b91a7b85d1b3430cc3884ff73ffb28edc63d2046952d\": rpc error: code = NotFound desc = could not find container \"070e04ca7cb0f9c81cc9b91a7b85d1b3430cc3884ff73ffb28edc63d2046952d\": container with ID starting with 070e04ca7cb0f9c81cc9b91a7b85d1b3430cc3884ff73ffb28edc63d2046952d not found: ID does not exist" Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.930946 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-785bd4864d-2j5ml"] Dec 02 18:24:07 crc kubenswrapper[4878]: I1202 18:24:07.938422 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-785bd4864d-2j5ml"] Dec 02 18:24:08 crc kubenswrapper[4878]: I1202 18:24:08.962401 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d1e5f9e-aab9-4c42-8f60-af83c51a67d8" path="/var/lib/kubelet/pods/4d1e5f9e-aab9-4c42-8f60-af83c51a67d8/volumes" Dec 02 18:24:23 crc kubenswrapper[4878]: I1202 18:24:23.741877 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:24:23 crc kubenswrapper[4878]: I1202 18:24:23.742657 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 02 18:24:51 crc kubenswrapper[4878]: I1202 18:24:51.217145 4878 scope.go:117] "RemoveContainer" containerID="9aef6ac073ae85a01b8df88b56045fa3f3e922cb9992d6b704643e62bd733bc8" Dec 02 18:24:53 crc kubenswrapper[4878]: I1202 18:24:53.742965 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:24:53 crc kubenswrapper[4878]: I1202 18:24:53.743613 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:25:23 crc kubenswrapper[4878]: I1202 18:25:23.742602 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:25:23 crc kubenswrapper[4878]: I1202 18:25:23.743346 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:25:23 crc kubenswrapper[4878]: I1202 18:25:23.743396 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:25:23 crc kubenswrapper[4878]: I1202 18:25:23.744121 4878 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3a28b486a2e75984b9b969c8ed5539eb39300fac74b0f7f88830322d2c2039ac"} pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 18:25:23 crc kubenswrapper[4878]: I1202 18:25:23.744182 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://3a28b486a2e75984b9b969c8ed5539eb39300fac74b0f7f88830322d2c2039ac" gracePeriod=600 Dec 02 18:25:24 crc kubenswrapper[4878]: I1202 18:25:24.465402 4878 generic.go:334] "Generic (PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" containerID="3a28b486a2e75984b9b969c8ed5539eb39300fac74b0f7f88830322d2c2039ac" exitCode=0 Dec 02 18:25:24 crc kubenswrapper[4878]: I1202 18:25:24.465487 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"3a28b486a2e75984b9b969c8ed5539eb39300fac74b0f7f88830322d2c2039ac"} Dec 02 18:25:24 crc kubenswrapper[4878]: I1202 18:25:24.466260 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"b46b425128f8cb8574d53391fe3090841c533ef0911e243412874ecbe8a5c8b9"} Dec 02 18:25:24 crc kubenswrapper[4878]: I1202 18:25:24.466298 4878 scope.go:117] "RemoveContainer" containerID="4a12b6a48d5fa299bcb38ae1b9a61925e1420e83b36412a67e4692078c7172bd" Dec 02 18:25:51 crc kubenswrapper[4878]: I1202 18:25:51.875414 4878 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb"] Dec 02 18:25:51 crc kubenswrapper[4878]: E1202 18:25:51.876823 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1e5f9e-aab9-4c42-8f60-af83c51a67d8" containerName="console" Dec 02 18:25:51 crc kubenswrapper[4878]: I1202 18:25:51.876849 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1e5f9e-aab9-4c42-8f60-af83c51a67d8" containerName="console" Dec 02 18:25:51 crc kubenswrapper[4878]: I1202 18:25:51.877071 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1e5f9e-aab9-4c42-8f60-af83c51a67d8" containerName="console" Dec 02 18:25:51 crc kubenswrapper[4878]: I1202 18:25:51.878718 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb" Dec 02 18:25:51 crc kubenswrapper[4878]: I1202 18:25:51.885227 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb"] Dec 02 18:25:51 crc kubenswrapper[4878]: I1202 18:25:51.885375 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 18:25:51 crc kubenswrapper[4878]: I1202 18:25:51.929453 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28d45a6b-6f64-487d-bac3-60ce7a1321f7-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb\" (UID: \"28d45a6b-6f64-487d-bac3-60ce7a1321f7\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb" Dec 02 18:25:51 crc kubenswrapper[4878]: I1202 18:25:51.930024 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4z77\" (UniqueName: 
\"kubernetes.io/projected/28d45a6b-6f64-487d-bac3-60ce7a1321f7-kube-api-access-r4z77\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb\" (UID: \"28d45a6b-6f64-487d-bac3-60ce7a1321f7\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb" Dec 02 18:25:51 crc kubenswrapper[4878]: I1202 18:25:51.930080 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28d45a6b-6f64-487d-bac3-60ce7a1321f7-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb\" (UID: \"28d45a6b-6f64-487d-bac3-60ce7a1321f7\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb" Dec 02 18:25:52 crc kubenswrapper[4878]: I1202 18:25:52.031992 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4z77\" (UniqueName: \"kubernetes.io/projected/28d45a6b-6f64-487d-bac3-60ce7a1321f7-kube-api-access-r4z77\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb\" (UID: \"28d45a6b-6f64-487d-bac3-60ce7a1321f7\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb" Dec 02 18:25:52 crc kubenswrapper[4878]: I1202 18:25:52.032071 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28d45a6b-6f64-487d-bac3-60ce7a1321f7-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb\" (UID: \"28d45a6b-6f64-487d-bac3-60ce7a1321f7\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb" Dec 02 18:25:52 crc kubenswrapper[4878]: I1202 18:25:52.032160 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28d45a6b-6f64-487d-bac3-60ce7a1321f7-bundle\") pod 
\"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb\" (UID: \"28d45a6b-6f64-487d-bac3-60ce7a1321f7\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb" Dec 02 18:25:52 crc kubenswrapper[4878]: I1202 18:25:52.032853 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28d45a6b-6f64-487d-bac3-60ce7a1321f7-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb\" (UID: \"28d45a6b-6f64-487d-bac3-60ce7a1321f7\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb" Dec 02 18:25:52 crc kubenswrapper[4878]: I1202 18:25:52.033001 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28d45a6b-6f64-487d-bac3-60ce7a1321f7-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb\" (UID: \"28d45a6b-6f64-487d-bac3-60ce7a1321f7\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb" Dec 02 18:25:52 crc kubenswrapper[4878]: I1202 18:25:52.069215 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4z77\" (UniqueName: \"kubernetes.io/projected/28d45a6b-6f64-487d-bac3-60ce7a1321f7-kube-api-access-r4z77\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb\" (UID: \"28d45a6b-6f64-487d-bac3-60ce7a1321f7\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb" Dec 02 18:25:52 crc kubenswrapper[4878]: I1202 18:25:52.210900 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb" Dec 02 18:25:52 crc kubenswrapper[4878]: I1202 18:25:52.638388 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb"] Dec 02 18:25:52 crc kubenswrapper[4878]: I1202 18:25:52.714105 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb" event={"ID":"28d45a6b-6f64-487d-bac3-60ce7a1321f7","Type":"ContainerStarted","Data":"52d07ccd5eb64c32d7ab2586670571776a7789a2f6d362819c6122c011d0e9f7"} Dec 02 18:25:53 crc kubenswrapper[4878]: I1202 18:25:53.722755 4878 generic.go:334] "Generic (PLEG): container finished" podID="28d45a6b-6f64-487d-bac3-60ce7a1321f7" containerID="f68e4a44573d06861abe671333748adae22001d36f69d8bd9c8dfcb9d96e5201" exitCode=0 Dec 02 18:25:53 crc kubenswrapper[4878]: I1202 18:25:53.722824 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb" event={"ID":"28d45a6b-6f64-487d-bac3-60ce7a1321f7","Type":"ContainerDied","Data":"f68e4a44573d06861abe671333748adae22001d36f69d8bd9c8dfcb9d96e5201"} Dec 02 18:25:53 crc kubenswrapper[4878]: I1202 18:25:53.726113 4878 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 18:25:55 crc kubenswrapper[4878]: I1202 18:25:55.742369 4878 generic.go:334] "Generic (PLEG): container finished" podID="28d45a6b-6f64-487d-bac3-60ce7a1321f7" containerID="96142a4de8aac6d21bbbb9d611e47da9540438a2c3cff2abc13d361b5a084311" exitCode=0 Dec 02 18:25:55 crc kubenswrapper[4878]: I1202 18:25:55.742432 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb" 
event={"ID":"28d45a6b-6f64-487d-bac3-60ce7a1321f7","Type":"ContainerDied","Data":"96142a4de8aac6d21bbbb9d611e47da9540438a2c3cff2abc13d361b5a084311"} Dec 02 18:25:56 crc kubenswrapper[4878]: I1202 18:25:56.754169 4878 generic.go:334] "Generic (PLEG): container finished" podID="28d45a6b-6f64-487d-bac3-60ce7a1321f7" containerID="47767bb72b82c4e4f37da7a46c88a6d9189065090d0cb1ca93718a7dbc755e64" exitCode=0 Dec 02 18:25:56 crc kubenswrapper[4878]: I1202 18:25:56.754314 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb" event={"ID":"28d45a6b-6f64-487d-bac3-60ce7a1321f7","Type":"ContainerDied","Data":"47767bb72b82c4e4f37da7a46c88a6d9189065090d0cb1ca93718a7dbc755e64"} Dec 02 18:25:58 crc kubenswrapper[4878]: I1202 18:25:58.076057 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb" Dec 02 18:25:58 crc kubenswrapper[4878]: I1202 18:25:58.134918 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28d45a6b-6f64-487d-bac3-60ce7a1321f7-bundle\") pod \"28d45a6b-6f64-487d-bac3-60ce7a1321f7\" (UID: \"28d45a6b-6f64-487d-bac3-60ce7a1321f7\") " Dec 02 18:25:58 crc kubenswrapper[4878]: I1202 18:25:58.135562 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4z77\" (UniqueName: \"kubernetes.io/projected/28d45a6b-6f64-487d-bac3-60ce7a1321f7-kube-api-access-r4z77\") pod \"28d45a6b-6f64-487d-bac3-60ce7a1321f7\" (UID: \"28d45a6b-6f64-487d-bac3-60ce7a1321f7\") " Dec 02 18:25:58 crc kubenswrapper[4878]: I1202 18:25:58.136654 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28d45a6b-6f64-487d-bac3-60ce7a1321f7-util\") pod \"28d45a6b-6f64-487d-bac3-60ce7a1321f7\" (UID: 
\"28d45a6b-6f64-487d-bac3-60ce7a1321f7\") " Dec 02 18:25:58 crc kubenswrapper[4878]: I1202 18:25:58.137953 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28d45a6b-6f64-487d-bac3-60ce7a1321f7-bundle" (OuterVolumeSpecName: "bundle") pod "28d45a6b-6f64-487d-bac3-60ce7a1321f7" (UID: "28d45a6b-6f64-487d-bac3-60ce7a1321f7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:25:58 crc kubenswrapper[4878]: I1202 18:25:58.144165 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28d45a6b-6f64-487d-bac3-60ce7a1321f7-kube-api-access-r4z77" (OuterVolumeSpecName: "kube-api-access-r4z77") pod "28d45a6b-6f64-487d-bac3-60ce7a1321f7" (UID: "28d45a6b-6f64-487d-bac3-60ce7a1321f7"). InnerVolumeSpecName "kube-api-access-r4z77". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:25:58 crc kubenswrapper[4878]: I1202 18:25:58.238639 4878 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28d45a6b-6f64-487d-bac3-60ce7a1321f7-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:25:58 crc kubenswrapper[4878]: I1202 18:25:58.238676 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4z77\" (UniqueName: \"kubernetes.io/projected/28d45a6b-6f64-487d-bac3-60ce7a1321f7-kube-api-access-r4z77\") on node \"crc\" DevicePath \"\"" Dec 02 18:25:58 crc kubenswrapper[4878]: I1202 18:25:58.410536 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28d45a6b-6f64-487d-bac3-60ce7a1321f7-util" (OuterVolumeSpecName: "util") pod "28d45a6b-6f64-487d-bac3-60ce7a1321f7" (UID: "28d45a6b-6f64-487d-bac3-60ce7a1321f7"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:25:58 crc kubenswrapper[4878]: I1202 18:25:58.441712 4878 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28d45a6b-6f64-487d-bac3-60ce7a1321f7-util\") on node \"crc\" DevicePath \"\"" Dec 02 18:25:58 crc kubenswrapper[4878]: I1202 18:25:58.775178 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb" event={"ID":"28d45a6b-6f64-487d-bac3-60ce7a1321f7","Type":"ContainerDied","Data":"52d07ccd5eb64c32d7ab2586670571776a7789a2f6d362819c6122c011d0e9f7"} Dec 02 18:25:58 crc kubenswrapper[4878]: I1202 18:25:58.775569 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52d07ccd5eb64c32d7ab2586670571776a7789a2f6d362819c6122c011d0e9f7" Dec 02 18:25:58 crc kubenswrapper[4878]: I1202 18:25:58.775313 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb" Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.038422 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x5jzn"] Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.039461 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovn-controller" containerID="cri-o://eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024" gracePeriod=30 Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.039536 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="nbdb" containerID="cri-o://61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d" 
gracePeriod=30 Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.039638 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="northd" containerID="cri-o://90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076" gracePeriod=30 Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.039687 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e" gracePeriod=30 Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.039720 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="kube-rbac-proxy-node" containerID="cri-o://f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6" gracePeriod=30 Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.039754 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovn-acl-logging" containerID="cri-o://cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1" gracePeriod=30 Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.040035 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="sbdb" containerID="cri-o://c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67" gracePeriod=30 Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.100796 4878 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovnkube-controller" containerID="cri-o://551a445d977d16962b6d88d0bcf99fbf68903b4492ae7e4f5bc0ca2e1b27fb7c" gracePeriod=30 Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.810799 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovnkube-controller/3.log" Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.812997 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovn-acl-logging/0.log" Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.813460 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovn-controller/0.log" Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.813979 4878 generic.go:334] "Generic (PLEG): container finished" podID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerID="551a445d977d16962b6d88d0bcf99fbf68903b4492ae7e4f5bc0ca2e1b27fb7c" exitCode=0 Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.814007 4878 generic.go:334] "Generic (PLEG): container finished" podID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerID="c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67" exitCode=0 Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.814016 4878 generic.go:334] "Generic (PLEG): container finished" podID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerID="61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d" exitCode=0 Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.814023 4878 generic.go:334] "Generic (PLEG): container finished" podID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerID="90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076" exitCode=0 Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 
18:26:03.814031 4878 generic.go:334] "Generic (PLEG): container finished" podID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerID="cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1" exitCode=143 Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.814038 4878 generic.go:334] "Generic (PLEG): container finished" podID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerID="eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024" exitCode=143 Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.814087 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerDied","Data":"551a445d977d16962b6d88d0bcf99fbf68903b4492ae7e4f5bc0ca2e1b27fb7c"} Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.814198 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerDied","Data":"c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67"} Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.814210 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerDied","Data":"61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d"} Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.814219 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerDied","Data":"90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076"} Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.814246 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" 
event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerDied","Data":"cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1"} Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.814254 4878 scope.go:117] "RemoveContainer" containerID="91c9c1ddbfb74122a2685e40a0839a5f223260393bdb0c49fed0d027d54c7d01" Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.814257 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerDied","Data":"eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024"} Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.816365 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6cm9t_e79a8cec-20ba-4862-ba25-7de014466668/kube-multus/2.log" Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.816784 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6cm9t_e79a8cec-20ba-4862-ba25-7de014466668/kube-multus/1.log" Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.816811 4878 generic.go:334] "Generic (PLEG): container finished" podID="e79a8cec-20ba-4862-ba25-7de014466668" containerID="8d8e064c8177248bf254025158f61f6dfa81e6a00b21ef6624c736c7a6a8fdaf" exitCode=2 Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.816837 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6cm9t" event={"ID":"e79a8cec-20ba-4862-ba25-7de014466668","Type":"ContainerDied","Data":"8d8e064c8177248bf254025158f61f6dfa81e6a00b21ef6624c736c7a6a8fdaf"} Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.817300 4878 scope.go:117] "RemoveContainer" containerID="8d8e064c8177248bf254025158f61f6dfa81e6a00b21ef6624c736c7a6a8fdaf" Dec 02 18:26:03 crc kubenswrapper[4878]: E1202 18:26:03.817675 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with 
CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-6cm9t_openshift-multus(e79a8cec-20ba-4862-ba25-7de014466668)\"" pod="openshift-multus/multus-6cm9t" podUID="e79a8cec-20ba-4862-ba25-7de014466668" Dec 02 18:26:03 crc kubenswrapper[4878]: I1202 18:26:03.841114 4878 scope.go:117] "RemoveContainer" containerID="e529e02dd0b8f5c8da36f38bbb8a040fb48a28444da477652706b9d07793878c" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.740646 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovn-acl-logging/0.log" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.742912 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovn-controller/0.log" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.743544 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.826880 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6cm9t_e79a8cec-20ba-4862-ba25-7de014466668/kube-multus/2.log" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.831709 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovn-acl-logging/0.log" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.832369 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x5jzn_d160cfa4-9e2a-429d-b760-0cac6d467b9a/ovn-controller/0.log" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.832853 4878 generic.go:334] "Generic (PLEG): container finished" podID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerID="6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e" exitCode=0 Dec 
02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.832956 4878 generic.go:334] "Generic (PLEG): container finished" podID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerID="f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6" exitCode=0 Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.833039 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerDied","Data":"6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e"} Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.833145 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerDied","Data":"f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6"} Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.833251 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" event={"ID":"d160cfa4-9e2a-429d-b760-0cac6d467b9a","Type":"ContainerDied","Data":"9ea70ad1f9a7692edc4a16dfbffd8396cc7a1c689f7232e16d9834aa0675949a"} Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.833171 4878 scope.go:117] "RemoveContainer" containerID="551a445d977d16962b6d88d0bcf99fbf68903b4492ae7e4f5bc0ca2e1b27fb7c" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.833150 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x5jzn" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.855258 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-log-socket\") pod \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.855307 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-run-openvswitch\") pod \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.855355 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.855387 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-run-netns\") pod \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.855427 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d160cfa4-9e2a-429d-b760-0cac6d467b9a-ovn-node-metrics-cert\") pod \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.855410 4878 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-log-socket" (OuterVolumeSpecName: "log-socket") pod "d160cfa4-9e2a-429d-b760-0cac6d467b9a" (UID: "d160cfa4-9e2a-429d-b760-0cac6d467b9a"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.855483 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d160cfa4-9e2a-429d-b760-0cac6d467b9a" (UID: "d160cfa4-9e2a-429d-b760-0cac6d467b9a"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.855540 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d160cfa4-9e2a-429d-b760-0cac6d467b9a" (UID: "d160cfa4-9e2a-429d-b760-0cac6d467b9a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.855548 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d160cfa4-9e2a-429d-b760-0cac6d467b9a" (UID: "d160cfa4-9e2a-429d-b760-0cac6d467b9a"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.855512 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d160cfa4-9e2a-429d-b760-0cac6d467b9a" (UID: "d160cfa4-9e2a-429d-b760-0cac6d467b9a"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.855467 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-run-ovn-kubernetes\") pod \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.855679 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d160cfa4-9e2a-429d-b760-0cac6d467b9a-ovnkube-script-lib\") pod \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.855725 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d160cfa4-9e2a-429d-b760-0cac6d467b9a-ovnkube-config\") pod \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.855767 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzdfp\" (UniqueName: \"kubernetes.io/projected/d160cfa4-9e2a-429d-b760-0cac6d467b9a-kube-api-access-fzdfp\") pod \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " Dec 02 18:26:04 
crc kubenswrapper[4878]: I1202 18:26:04.855784 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d160cfa4-9e2a-429d-b760-0cac6d467b9a-env-overrides\") pod \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.855815 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-kubelet\") pod \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.855893 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-systemd-units\") pod \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.855913 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-node-log\") pod \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.855966 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-slash\") pod \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.856022 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-etc-openvswitch\") pod 
\"d160cfa4-9e2a-429d-b760-0cac6d467b9a\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.856042 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-var-lib-openvswitch\") pod \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.856059 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-run-ovn\") pod \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.856081 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-cni-bin\") pod \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.856097 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-run-systemd\") pod \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.856121 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-cni-netd\") pod \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\" (UID: \"d160cfa4-9e2a-429d-b760-0cac6d467b9a\") " Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.856388 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d160cfa4-9e2a-429d-b760-0cac6d467b9a" (UID: "d160cfa4-9e2a-429d-b760-0cac6d467b9a"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.856458 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-node-log" (OuterVolumeSpecName: "node-log") pod "d160cfa4-9e2a-429d-b760-0cac6d467b9a" (UID: "d160cfa4-9e2a-429d-b760-0cac6d467b9a"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.856412 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d160cfa4-9e2a-429d-b760-0cac6d467b9a" (UID: "d160cfa4-9e2a-429d-b760-0cac6d467b9a"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.856437 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d160cfa4-9e2a-429d-b760-0cac6d467b9a" (UID: "d160cfa4-9e2a-429d-b760-0cac6d467b9a"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.856492 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-slash" (OuterVolumeSpecName: "host-slash") pod "d160cfa4-9e2a-429d-b760-0cac6d467b9a" (UID: "d160cfa4-9e2a-429d-b760-0cac6d467b9a"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.856525 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d160cfa4-9e2a-429d-b760-0cac6d467b9a" (UID: "d160cfa4-9e2a-429d-b760-0cac6d467b9a"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.856548 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d160cfa4-9e2a-429d-b760-0cac6d467b9a" (UID: "d160cfa4-9e2a-429d-b760-0cac6d467b9a"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.856808 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d160cfa4-9e2a-429d-b760-0cac6d467b9a" (UID: "d160cfa4-9e2a-429d-b760-0cac6d467b9a"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.856820 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d160cfa4-9e2a-429d-b760-0cac6d467b9a" (UID: "d160cfa4-9e2a-429d-b760-0cac6d467b9a"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.856873 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d160cfa4-9e2a-429d-b760-0cac6d467b9a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d160cfa4-9e2a-429d-b760-0cac6d467b9a" (UID: "d160cfa4-9e2a-429d-b760-0cac6d467b9a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.856963 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d160cfa4-9e2a-429d-b760-0cac6d467b9a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d160cfa4-9e2a-429d-b760-0cac6d467b9a" (UID: "d160cfa4-9e2a-429d-b760-0cac6d467b9a"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.856978 4878 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-slash\") on node \"crc\" DevicePath \"\"" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.856954 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d160cfa4-9e2a-429d-b760-0cac6d467b9a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d160cfa4-9e2a-429d-b760-0cac6d467b9a" (UID: "d160cfa4-9e2a-429d-b760-0cac6d467b9a"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.857035 4878 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.857060 4878 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.857080 4878 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.857096 4878 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.857109 4878 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.857122 4878 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-log-socket\") on node \"crc\" DevicePath \"\"" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.857134 4878 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.857147 4878 
reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.857373 4878 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.857421 4878 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.857435 4878 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d160cfa4-9e2a-429d-b760-0cac6d467b9a-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.857448 4878 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.857461 4878 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.857473 4878 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-node-log\") on node \"crc\" DevicePath \"\"" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.858470 4878 scope.go:117] "RemoveContainer" 
containerID="c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.877502 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d160cfa4-9e2a-429d-b760-0cac6d467b9a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d160cfa4-9e2a-429d-b760-0cac6d467b9a" (UID: "d160cfa4-9e2a-429d-b760-0cac6d467b9a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.879982 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d160cfa4-9e2a-429d-b760-0cac6d467b9a-kube-api-access-fzdfp" (OuterVolumeSpecName: "kube-api-access-fzdfp") pod "d160cfa4-9e2a-429d-b760-0cac6d467b9a" (UID: "d160cfa4-9e2a-429d-b760-0cac6d467b9a"). InnerVolumeSpecName "kube-api-access-fzdfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.905571 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d160cfa4-9e2a-429d-b760-0cac6d467b9a" (UID: "d160cfa4-9e2a-429d-b760-0cac6d467b9a"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.923517 4878 scope.go:117] "RemoveContainer" containerID="61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.943495 4878 scope.go:117] "RemoveContainer" containerID="90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.959588 4878 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d160cfa4-9e2a-429d-b760-0cac6d467b9a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.959624 4878 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d160cfa4-9e2a-429d-b760-0cac6d467b9a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.959636 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzdfp\" (UniqueName: \"kubernetes.io/projected/d160cfa4-9e2a-429d-b760-0cac6d467b9a-kube-api-access-fzdfp\") on node \"crc\" DevicePath \"\"" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.959645 4878 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d160cfa4-9e2a-429d-b760-0cac6d467b9a-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.959658 4878 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d160cfa4-9e2a-429d-b760-0cac6d467b9a-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.965030 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dwtg4"] Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 
18:26:04.966432 4878 scope.go:117] "RemoveContainer" containerID="6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e" Dec 02 18:26:04 crc kubenswrapper[4878]: E1202 18:26:04.968533 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="kube-rbac-proxy-node" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.968610 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="kube-rbac-proxy-node" Dec 02 18:26:04 crc kubenswrapper[4878]: E1202 18:26:04.968826 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="nbdb" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.968892 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="nbdb" Dec 02 18:26:04 crc kubenswrapper[4878]: E1202 18:26:04.968946 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovnkube-controller" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.968996 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovnkube-controller" Dec 02 18:26:04 crc kubenswrapper[4878]: E1202 18:26:04.969049 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="northd" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.969107 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="northd" Dec 02 18:26:04 crc kubenswrapper[4878]: E1202 18:26:04.969182 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="sbdb" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.969314 4878 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="sbdb" Dec 02 18:26:04 crc kubenswrapper[4878]: E1202 18:26:04.969392 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.969478 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 18:26:04 crc kubenswrapper[4878]: E1202 18:26:04.969534 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovnkube-controller" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.969579 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovnkube-controller" Dec 02 18:26:04 crc kubenswrapper[4878]: E1202 18:26:04.969629 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d45a6b-6f64-487d-bac3-60ce7a1321f7" containerName="util" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.969681 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d45a6b-6f64-487d-bac3-60ce7a1321f7" containerName="util" Dec 02 18:26:04 crc kubenswrapper[4878]: E1202 18:26:04.969732 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovn-controller" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.969778 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovn-controller" Dec 02 18:26:04 crc kubenswrapper[4878]: E1202 18:26:04.969827 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d45a6b-6f64-487d-bac3-60ce7a1321f7" containerName="pull" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.969876 4878 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="28d45a6b-6f64-487d-bac3-60ce7a1321f7" containerName="pull" Dec 02 18:26:04 crc kubenswrapper[4878]: E1202 18:26:04.969923 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d45a6b-6f64-487d-bac3-60ce7a1321f7" containerName="extract" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.969974 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d45a6b-6f64-487d-bac3-60ce7a1321f7" containerName="extract" Dec 02 18:26:04 crc kubenswrapper[4878]: E1202 18:26:04.970028 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovnkube-controller" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.970078 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovnkube-controller" Dec 02 18:26:04 crc kubenswrapper[4878]: E1202 18:26:04.970125 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovn-acl-logging" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.970174 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovn-acl-logging" Dec 02 18:26:04 crc kubenswrapper[4878]: E1202 18:26:04.970227 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="kubecfg-setup" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.970323 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="kubecfg-setup" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.970514 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovn-controller" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.970996 4878 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="nbdb" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.971058 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="sbdb" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.971110 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="northd" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.971167 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovnkube-controller" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.971253 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovnkube-controller" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.971321 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovnkube-controller" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.971368 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d45a6b-6f64-487d-bac3-60ce7a1321f7" containerName="extract" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.971434 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovnkube-controller" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.971582 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.971639 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovn-acl-logging" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.971756 4878 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="kube-rbac-proxy-node" Dec 02 18:26:04 crc kubenswrapper[4878]: E1202 18:26:04.971957 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovnkube-controller" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.972019 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovnkube-controller" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.972197 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovnkube-controller" Dec 02 18:26:04 crc kubenswrapper[4878]: E1202 18:26:04.972385 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovnkube-controller" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.972453 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" containerName="ovnkube-controller" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.974475 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:04 crc kubenswrapper[4878]: I1202 18:26:04.993717 4878 scope.go:117] "RemoveContainer" containerID="f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.017452 4878 scope.go:117] "RemoveContainer" containerID="cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.053035 4878 scope.go:117] "RemoveContainer" containerID="eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.061289 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-env-overrides\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.061372 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-ovn-node-metrics-cert\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.061427 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-node-log\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.061452 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-cni-netd\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.061479 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-run-ovn-kubernetes\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.061507 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-run-netns\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.061712 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-cni-bin\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.061768 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-run-systemd\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.061872 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-ovnkube-script-lib\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.061912 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-kubelet\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.061934 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-run-ovn\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.061951 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-etc-openvswitch\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.062004 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-log-socket\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.062034 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.062062 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkfxm\" (UniqueName: \"kubernetes.io/projected/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-kube-api-access-gkfxm\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.062084 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-systemd-units\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.062105 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-ovnkube-config\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.062158 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-run-openvswitch\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.062176 4878 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-slash\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.083965 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-var-lib-openvswitch\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.107312 4878 scope.go:117] "RemoveContainer" containerID="25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.179496 4878 scope.go:117] "RemoveContainer" containerID="551a445d977d16962b6d88d0bcf99fbf68903b4492ae7e4f5bc0ca2e1b27fb7c" Dec 02 18:26:05 crc kubenswrapper[4878]: E1202 18:26:05.181331 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"551a445d977d16962b6d88d0bcf99fbf68903b4492ae7e4f5bc0ca2e1b27fb7c\": container with ID starting with 551a445d977d16962b6d88d0bcf99fbf68903b4492ae7e4f5bc0ca2e1b27fb7c not found: ID does not exist" containerID="551a445d977d16962b6d88d0bcf99fbf68903b4492ae7e4f5bc0ca2e1b27fb7c" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.181490 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"551a445d977d16962b6d88d0bcf99fbf68903b4492ae7e4f5bc0ca2e1b27fb7c"} err="failed to get container status \"551a445d977d16962b6d88d0bcf99fbf68903b4492ae7e4f5bc0ca2e1b27fb7c\": rpc error: code = NotFound desc = could not find container 
\"551a445d977d16962b6d88d0bcf99fbf68903b4492ae7e4f5bc0ca2e1b27fb7c\": container with ID starting with 551a445d977d16962b6d88d0bcf99fbf68903b4492ae7e4f5bc0ca2e1b27fb7c not found: ID does not exist" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.181632 4878 scope.go:117] "RemoveContainer" containerID="c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67" Dec 02 18:26:05 crc kubenswrapper[4878]: E1202 18:26:05.187736 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\": container with ID starting with c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67 not found: ID does not exist" containerID="c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.187807 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67"} err="failed to get container status \"c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\": rpc error: code = NotFound desc = could not find container \"c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\": container with ID starting with c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67 not found: ID does not exist" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.187851 4878 scope.go:117] "RemoveContainer" containerID="61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188179 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-log-socket\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 
18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188266 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188312 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkfxm\" (UniqueName: \"kubernetes.io/projected/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-kube-api-access-gkfxm\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188341 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-systemd-units\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188360 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-ovnkube-config\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188387 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-run-openvswitch\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc 
kubenswrapper[4878]: I1202 18:26:05.188410 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-slash\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188455 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-var-lib-openvswitch\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188484 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-env-overrides\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188511 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-ovn-node-metrics-cert\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188543 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-node-log\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188565 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-cni-netd\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188590 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-run-ovn-kubernetes\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188612 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-run-netns\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188646 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-cni-bin\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188670 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-run-systemd\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188703 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-ovnkube-script-lib\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188728 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-kubelet\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188746 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-etc-openvswitch\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188769 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-run-ovn\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188891 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-run-ovn\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188957 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-node-log\") pod \"ovnkube-node-dwtg4\" (UID: 
\"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.188996 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-cni-netd\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.189026 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-run-ovn-kubernetes\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.189054 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-run-netns\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.189086 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-cni-bin\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.189112 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-run-systemd\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc 
kubenswrapper[4878]: I1202 18:26:05.189409 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-etc-openvswitch\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.189422 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-kubelet\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.190330 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.190423 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-log-socket\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.190516 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-systemd-units\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.190593 4878 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-host-slash\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.190671 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-run-openvswitch\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.190744 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-var-lib-openvswitch\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: E1202 18:26:05.193631 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\": container with ID starting with 61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d not found: ID does not exist" containerID="61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.193700 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d"} err="failed to get container status \"61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\": rpc error: code = NotFound desc = could not find container \"61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\": container with ID starting with 
61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d not found: ID does not exist" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.193747 4878 scope.go:117] "RemoveContainer" containerID="90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076" Dec 02 18:26:05 crc kubenswrapper[4878]: E1202 18:26:05.197898 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\": container with ID starting with 90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076 not found: ID does not exist" containerID="90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.197961 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076"} err="failed to get container status \"90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\": rpc error: code = NotFound desc = could not find container \"90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\": container with ID starting with 90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076 not found: ID does not exist" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.198005 4878 scope.go:117] "RemoveContainer" containerID="6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e" Dec 02 18:26:05 crc kubenswrapper[4878]: E1202 18:26:05.199000 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\": container with ID starting with 6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e not found: ID does not exist" containerID="6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e" Dec 02 18:26:05 crc 
kubenswrapper[4878]: I1202 18:26:05.199033 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e"} err="failed to get container status \"6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\": rpc error: code = NotFound desc = could not find container \"6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\": container with ID starting with 6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e not found: ID does not exist" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.199049 4878 scope.go:117] "RemoveContainer" containerID="f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6" Dec 02 18:26:05 crc kubenswrapper[4878]: E1202 18:26:05.199665 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\": container with ID starting with f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6 not found: ID does not exist" containerID="f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.199724 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6"} err="failed to get container status \"f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\": rpc error: code = NotFound desc = could not find container \"f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\": container with ID starting with f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6 not found: ID does not exist" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.199777 4878 scope.go:117] "RemoveContainer" containerID="cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1" Dec 02 
18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.199840 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-ovnkube-script-lib\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: E1202 18:26:05.200434 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\": container with ID starting with cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1 not found: ID does not exist" containerID="cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.200479 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1"} err="failed to get container status \"cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\": rpc error: code = NotFound desc = could not find container \"cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\": container with ID starting with cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1 not found: ID does not exist" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.200512 4878 scope.go:117] "RemoveContainer" containerID="eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.200560 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-env-overrides\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 
18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.200651 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-ovnkube-config\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.200844 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-ovn-node-metrics-cert\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: E1202 18:26:05.200935 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\": container with ID starting with eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024 not found: ID does not exist" containerID="eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.201034 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024"} err="failed to get container status \"eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\": rpc error: code = NotFound desc = could not find container \"eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\": container with ID starting with eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024 not found: ID does not exist" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.201079 4878 scope.go:117] "RemoveContainer" containerID="25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5" Dec 02 
18:26:05 crc kubenswrapper[4878]: E1202 18:26:05.207925 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\": container with ID starting with 25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5 not found: ID does not exist" containerID="25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.208084 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5"} err="failed to get container status \"25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\": rpc error: code = NotFound desc = could not find container \"25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\": container with ID starting with 25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5 not found: ID does not exist" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.208211 4878 scope.go:117] "RemoveContainer" containerID="551a445d977d16962b6d88d0bcf99fbf68903b4492ae7e4f5bc0ca2e1b27fb7c" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.208815 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"551a445d977d16962b6d88d0bcf99fbf68903b4492ae7e4f5bc0ca2e1b27fb7c"} err="failed to get container status \"551a445d977d16962b6d88d0bcf99fbf68903b4492ae7e4f5bc0ca2e1b27fb7c\": rpc error: code = NotFound desc = could not find container \"551a445d977d16962b6d88d0bcf99fbf68903b4492ae7e4f5bc0ca2e1b27fb7c\": container with ID starting with 551a445d977d16962b6d88d0bcf99fbf68903b4492ae7e4f5bc0ca2e1b27fb7c not found: ID does not exist" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.208907 4878 scope.go:117] "RemoveContainer" 
containerID="c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.209228 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67"} err="failed to get container status \"c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\": rpc error: code = NotFound desc = could not find container \"c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67\": container with ID starting with c2c781a93979cf1113fbd6aaf7b8d7635e6b7624b4141d9d257a1ce9fabf4e67 not found: ID does not exist" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.209401 4878 scope.go:117] "RemoveContainer" containerID="61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.209963 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d"} err="failed to get container status \"61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\": rpc error: code = NotFound desc = could not find container \"61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d\": container with ID starting with 61c15236893ec2464b862b138c805aec0caac886f744182b7c90c994be60646d not found: ID does not exist" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.210056 4878 scope.go:117] "RemoveContainer" containerID="90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.210423 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076"} err="failed to get container status \"90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\": rpc error: code = NotFound desc = could 
not find container \"90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076\": container with ID starting with 90c6c9e1dcc2bbb14aed90b995dbc27ceeda911f54329fd6407480c1bddca076 not found: ID does not exist" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.210528 4878 scope.go:117] "RemoveContainer" containerID="6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.210883 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e"} err="failed to get container status \"6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\": rpc error: code = NotFound desc = could not find container \"6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e\": container with ID starting with 6a2d03fbb5c9c7c357c8ad92b8ff3a3f2ee06ac6ea1bd68489060b5164551e7e not found: ID does not exist" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.210987 4878 scope.go:117] "RemoveContainer" containerID="f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.211269 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6"} err="failed to get container status \"f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\": rpc error: code = NotFound desc = could not find container \"f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6\": container with ID starting with f94e5b566c6a1afde8e1c3873f95ea6a4ce7bce26925bbc629ddc00a411739c6 not found: ID does not exist" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.211404 4878 scope.go:117] "RemoveContainer" containerID="cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 
18:26:05.211849 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1"} err="failed to get container status \"cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\": rpc error: code = NotFound desc = could not find container \"cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1\": container with ID starting with cf0b6fcb5ebc00d1f71c4f43ea105cffdf0b6d3c676f9bf3b85d7496dd88eca1 not found: ID does not exist" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.211933 4878 scope.go:117] "RemoveContainer" containerID="eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.212229 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024"} err="failed to get container status \"eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\": rpc error: code = NotFound desc = could not find container \"eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024\": container with ID starting with eaceed70c3af1af7fd4afb7660715a56e6c8a7ea90bb160ae324ebf0b672d024 not found: ID does not exist" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.212274 4878 scope.go:117] "RemoveContainer" containerID="25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.212519 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5"} err="failed to get container status \"25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\": rpc error: code = NotFound desc = could not find container \"25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5\": container with ID starting with 
25643a731df4bc3f0184749a9e898a6e7ebb98b1cd30721b365e1a4b879dceb5 not found: ID does not exist" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.272965 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkfxm\" (UniqueName: \"kubernetes.io/projected/c4d3a7ce-356b-45b3-8d92-72b71b0f1362-kube-api-access-gkfxm\") pod \"ovnkube-node-dwtg4\" (UID: \"c4d3a7ce-356b-45b3-8d92-72b71b0f1362\") " pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.292466 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x5jzn"] Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.303790 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.308326 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x5jzn"] Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.840607 4878 generic.go:334] "Generic (PLEG): container finished" podID="c4d3a7ce-356b-45b3-8d92-72b71b0f1362" containerID="0882c359bed4f221b21e1905b0842fe9b3ae69afdd63e49df6664ebfef5bd20b" exitCode=0 Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.840706 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" event={"ID":"c4d3a7ce-356b-45b3-8d92-72b71b0f1362","Type":"ContainerDied","Data":"0882c359bed4f221b21e1905b0842fe9b3ae69afdd63e49df6664ebfef5bd20b"} Dec 02 18:26:05 crc kubenswrapper[4878]: I1202 18:26:05.841513 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" event={"ID":"c4d3a7ce-356b-45b3-8d92-72b71b0f1362","Type":"ContainerStarted","Data":"10232818e082cfb1d84d0046990684d4f69b067d6b8c7738aa98691c6a5fa1a8"} Dec 02 18:26:06 crc kubenswrapper[4878]: I1202 18:26:06.853023 4878 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" event={"ID":"c4d3a7ce-356b-45b3-8d92-72b71b0f1362","Type":"ContainerStarted","Data":"b480b518ab984c4725dc74b19bec3a020c309d71cbe23b9533cec7b86664b4c1"} Dec 02 18:26:06 crc kubenswrapper[4878]: I1202 18:26:06.853834 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" event={"ID":"c4d3a7ce-356b-45b3-8d92-72b71b0f1362","Type":"ContainerStarted","Data":"88ab497e0922f1d3334301650c319423ec5c592d665cb7d5c66b6d29e46fcb6b"} Dec 02 18:26:06 crc kubenswrapper[4878]: I1202 18:26:06.853846 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" event={"ID":"c4d3a7ce-356b-45b3-8d92-72b71b0f1362","Type":"ContainerStarted","Data":"532326837b2082299946e181c5cd4283b5155c25228175b43af08fa2029d5831"} Dec 02 18:26:06 crc kubenswrapper[4878]: I1202 18:26:06.853856 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" event={"ID":"c4d3a7ce-356b-45b3-8d92-72b71b0f1362","Type":"ContainerStarted","Data":"3614ce9530a16368af7d17004ffce434b9d2b3b592264d6d6ee162c65a805a25"} Dec 02 18:26:06 crc kubenswrapper[4878]: I1202 18:26:06.853864 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" event={"ID":"c4d3a7ce-356b-45b3-8d92-72b71b0f1362","Type":"ContainerStarted","Data":"e7e51c9d8f46c6dbed4ed0a76c7dd01ea6155ca4d8e3f5789e8220ce45c68d7a"} Dec 02 18:26:06 crc kubenswrapper[4878]: I1202 18:26:06.946767 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d160cfa4-9e2a-429d-b760-0cac6d467b9a" path="/var/lib/kubelet/pods/d160cfa4-9e2a-429d-b760-0cac6d467b9a/volumes" Dec 02 18:26:07 crc kubenswrapper[4878]: I1202 18:26:07.873669 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" 
event={"ID":"c4d3a7ce-356b-45b3-8d92-72b71b0f1362","Type":"ContainerStarted","Data":"959eeea03f50c90836e954cf4346f2247453ee766d9dcd307917b46d23312d79"} Dec 02 18:26:09 crc kubenswrapper[4878]: I1202 18:26:09.900730 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" event={"ID":"c4d3a7ce-356b-45b3-8d92-72b71b0f1362","Type":"ContainerStarted","Data":"14826aada3ef764b211d1c3779aaab75990a0a8002ae74abdbeb5f44801b5eb8"} Dec 02 18:26:09 crc kubenswrapper[4878]: I1202 18:26:09.913036 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7"] Dec 02 18:26:09 crc kubenswrapper[4878]: I1202 18:26:09.914113 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" Dec 02 18:26:09 crc kubenswrapper[4878]: I1202 18:26:09.915842 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-94k7t" Dec 02 18:26:09 crc kubenswrapper[4878]: I1202 18:26:09.918063 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 02 18:26:09 crc kubenswrapper[4878]: I1202 18:26:09.919725 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 02 18:26:09 crc kubenswrapper[4878]: I1202 18:26:09.970097 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6gln\" (UniqueName: \"kubernetes.io/projected/302b5d81-5163-4052-a986-6fbdda49e9cf-kube-api-access-b6gln\") pod \"obo-prometheus-operator-668cf9dfbb-ffbv7\" (UID: \"302b5d81-5163-4052-a986-6fbdda49e9cf\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" Dec 02 18:26:09 crc kubenswrapper[4878]: I1202 18:26:09.974746 4878 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl"] Dec 02 18:26:09 crc kubenswrapper[4878]: I1202 18:26:09.975565 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" Dec 02 18:26:09 crc kubenswrapper[4878]: I1202 18:26:09.978403 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 02 18:26:09 crc kubenswrapper[4878]: I1202 18:26:09.979588 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-28plx" Dec 02 18:26:09 crc kubenswrapper[4878]: I1202 18:26:09.996802 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx"] Dec 02 18:26:09 crc kubenswrapper[4878]: I1202 18:26:09.997878 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.071516 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/baa1d367-077f-4aa3-8dca-5a56cff08838-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-55bb47f485-749dx\" (UID: \"baa1d367-077f-4aa3-8dca-5a56cff08838\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.072126 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95fa4f57-a446-402b-9de4-5ff0d8109802-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl\" (UID: \"95fa4f57-a446-402b-9de4-5ff0d8109802\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.072185 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6gln\" (UniqueName: \"kubernetes.io/projected/302b5d81-5163-4052-a986-6fbdda49e9cf-kube-api-access-b6gln\") pod \"obo-prometheus-operator-668cf9dfbb-ffbv7\" (UID: \"302b5d81-5163-4052-a986-6fbdda49e9cf\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.072231 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/baa1d367-077f-4aa3-8dca-5a56cff08838-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-55bb47f485-749dx\" (UID: \"baa1d367-077f-4aa3-8dca-5a56cff08838\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" Dec 02 18:26:10 crc 
kubenswrapper[4878]: I1202 18:26:10.072273 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95fa4f57-a446-402b-9de4-5ff0d8109802-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl\" (UID: \"95fa4f57-a446-402b-9de4-5ff0d8109802\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.115866 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6gln\" (UniqueName: \"kubernetes.io/projected/302b5d81-5163-4052-a986-6fbdda49e9cf-kube-api-access-b6gln\") pod \"obo-prometheus-operator-668cf9dfbb-ffbv7\" (UID: \"302b5d81-5163-4052-a986-6fbdda49e9cf\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.165133 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-ctrvg"] Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.167189 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.169947 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.170159 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-mhnvn" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.177291 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95fa4f57-a446-402b-9de4-5ff0d8109802-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl\" (UID: \"95fa4f57-a446-402b-9de4-5ff0d8109802\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.177362 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/baa1d367-077f-4aa3-8dca-5a56cff08838-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-55bb47f485-749dx\" (UID: \"baa1d367-077f-4aa3-8dca-5a56cff08838\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.177400 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95fa4f57-a446-402b-9de4-5ff0d8109802-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl\" (UID: \"95fa4f57-a446-402b-9de4-5ff0d8109802\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.177479 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/baa1d367-077f-4aa3-8dca-5a56cff08838-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-55bb47f485-749dx\" (UID: \"baa1d367-077f-4aa3-8dca-5a56cff08838\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.184871 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95fa4f57-a446-402b-9de4-5ff0d8109802-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl\" (UID: \"95fa4f57-a446-402b-9de4-5ff0d8109802\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.184915 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/baa1d367-077f-4aa3-8dca-5a56cff08838-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-55bb47f485-749dx\" (UID: \"baa1d367-077f-4aa3-8dca-5a56cff08838\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.186432 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/baa1d367-077f-4aa3-8dca-5a56cff08838-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-55bb47f485-749dx\" (UID: \"baa1d367-077f-4aa3-8dca-5a56cff08838\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.193803 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95fa4f57-a446-402b-9de4-5ff0d8109802-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl\" (UID: \"95fa4f57-a446-402b-9de4-5ff0d8109802\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.232128 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" Dec 02 18:26:10 crc kubenswrapper[4878]: E1202 18:26:10.267126 4878 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-ffbv7_openshift-operators_302b5d81-5163-4052-a986-6fbdda49e9cf_0(ed000410b16cdbc189b2b0b5fb16fc1790b53e2a203905ab1e1df2ac1129e3e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 18:26:10 crc kubenswrapper[4878]: E1202 18:26:10.267203 4878 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-ffbv7_openshift-operators_302b5d81-5163-4052-a986-6fbdda49e9cf_0(ed000410b16cdbc189b2b0b5fb16fc1790b53e2a203905ab1e1df2ac1129e3e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" Dec 02 18:26:10 crc kubenswrapper[4878]: E1202 18:26:10.267254 4878 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-ffbv7_openshift-operators_302b5d81-5163-4052-a986-6fbdda49e9cf_0(ed000410b16cdbc189b2b0b5fb16fc1790b53e2a203905ab1e1df2ac1129e3e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" Dec 02 18:26:10 crc kubenswrapper[4878]: E1202 18:26:10.267306 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-ffbv7_openshift-operators(302b5d81-5163-4052-a986-6fbdda49e9cf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-ffbv7_openshift-operators(302b5d81-5163-4052-a986-6fbdda49e9cf)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-ffbv7_openshift-operators_302b5d81-5163-4052-a986-6fbdda49e9cf_0(ed000410b16cdbc189b2b0b5fb16fc1790b53e2a203905ab1e1df2ac1129e3e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" podUID="302b5d81-5163-4052-a986-6fbdda49e9cf" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.278678 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96bjg\" (UniqueName: \"kubernetes.io/projected/fee86c2f-ee2c-49c4-a96c-f59e7ef28524-kube-api-access-96bjg\") pod \"observability-operator-d8bb48f5d-ctrvg\" (UID: \"fee86c2f-ee2c-49c4-a96c-f59e7ef28524\") " pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.278739 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/fee86c2f-ee2c-49c4-a96c-f59e7ef28524-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-ctrvg\" (UID: \"fee86c2f-ee2c-49c4-a96c-f59e7ef28524\") " pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.293319 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.322493 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" Dec 02 18:26:10 crc kubenswrapper[4878]: E1202 18:26:10.323754 4878 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl_openshift-operators_95fa4f57-a446-402b-9de4-5ff0d8109802_0(c538eb967e5fc51ad1af9eb49811ebfec0110e4cc8b31bc601b40640bfac81be): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 18:26:10 crc kubenswrapper[4878]: E1202 18:26:10.323830 4878 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl_openshift-operators_95fa4f57-a446-402b-9de4-5ff0d8109802_0(c538eb967e5fc51ad1af9eb49811ebfec0110e4cc8b31bc601b40640bfac81be): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" Dec 02 18:26:10 crc kubenswrapper[4878]: E1202 18:26:10.323862 4878 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl_openshift-operators_95fa4f57-a446-402b-9de4-5ff0d8109802_0(c538eb967e5fc51ad1af9eb49811ebfec0110e4cc8b31bc601b40640bfac81be): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" Dec 02 18:26:10 crc kubenswrapper[4878]: E1202 18:26:10.323924 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl_openshift-operators(95fa4f57-a446-402b-9de4-5ff0d8109802)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl_openshift-operators(95fa4f57-a446-402b-9de4-5ff0d8109802)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl_openshift-operators_95fa4f57-a446-402b-9de4-5ff0d8109802_0(c538eb967e5fc51ad1af9eb49811ebfec0110e4cc8b31bc601b40640bfac81be): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" podUID="95fa4f57-a446-402b-9de4-5ff0d8109802" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.341747 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-s7xc9"] Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.342952 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.345076 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-mjtw6" Dec 02 18:26:10 crc kubenswrapper[4878]: E1202 18:26:10.353135 4878 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-749dx_openshift-operators_baa1d367-077f-4aa3-8dca-5a56cff08838_0(4f6ba43cf8c623aaec695261a17b6c7d92ee0f21666bdba03a7748859e1b7367): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 18:26:10 crc kubenswrapper[4878]: E1202 18:26:10.353347 4878 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-749dx_openshift-operators_baa1d367-077f-4aa3-8dca-5a56cff08838_0(4f6ba43cf8c623aaec695261a17b6c7d92ee0f21666bdba03a7748859e1b7367): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" Dec 02 18:26:10 crc kubenswrapper[4878]: E1202 18:26:10.353468 4878 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-749dx_openshift-operators_baa1d367-077f-4aa3-8dca-5a56cff08838_0(4f6ba43cf8c623aaec695261a17b6c7d92ee0f21666bdba03a7748859e1b7367): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" Dec 02 18:26:10 crc kubenswrapper[4878]: E1202 18:26:10.353610 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-55bb47f485-749dx_openshift-operators(baa1d367-077f-4aa3-8dca-5a56cff08838)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-55bb47f485-749dx_openshift-operators(baa1d367-077f-4aa3-8dca-5a56cff08838)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-749dx_openshift-operators_baa1d367-077f-4aa3-8dca-5a56cff08838_0(4f6ba43cf8c623aaec695261a17b6c7d92ee0f21666bdba03a7748859e1b7367): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" podUID="baa1d367-077f-4aa3-8dca-5a56cff08838" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.380394 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96bjg\" (UniqueName: \"kubernetes.io/projected/fee86c2f-ee2c-49c4-a96c-f59e7ef28524-kube-api-access-96bjg\") pod \"observability-operator-d8bb48f5d-ctrvg\" (UID: \"fee86c2f-ee2c-49c4-a96c-f59e7ef28524\") " pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.380462 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/fee86c2f-ee2c-49c4-a96c-f59e7ef28524-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-ctrvg\" (UID: \"fee86c2f-ee2c-49c4-a96c-f59e7ef28524\") " pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.385356 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/fee86c2f-ee2c-49c4-a96c-f59e7ef28524-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-ctrvg\" (UID: \"fee86c2f-ee2c-49c4-a96c-f59e7ef28524\") " pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.416123 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96bjg\" (UniqueName: \"kubernetes.io/projected/fee86c2f-ee2c-49c4-a96c-f59e7ef28524-kube-api-access-96bjg\") pod \"observability-operator-d8bb48f5d-ctrvg\" (UID: \"fee86c2f-ee2c-49c4-a96c-f59e7ef28524\") " pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.481978 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/161358a3-71af-4def-b6a0-0ba9b5f2a7b3-openshift-service-ca\") pod \"perses-operator-5446b9c989-s7xc9\" (UID: \"161358a3-71af-4def-b6a0-0ba9b5f2a7b3\") " pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.482107 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8fv5\" (UniqueName: \"kubernetes.io/projected/161358a3-71af-4def-b6a0-0ba9b5f2a7b3-kube-api-access-c8fv5\") pod \"perses-operator-5446b9c989-s7xc9\" (UID: \"161358a3-71af-4def-b6a0-0ba9b5f2a7b3\") " pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.536764 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:10 crc kubenswrapper[4878]: E1202 18:26:10.565875 4878 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-ctrvg_openshift-operators_fee86c2f-ee2c-49c4-a96c-f59e7ef28524_0(3f334e62bdb04dba8f8b5b10ec3909c1ff17fde81373e549f2df5fda67693c64): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 18:26:10 crc kubenswrapper[4878]: E1202 18:26:10.565967 4878 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-ctrvg_openshift-operators_fee86c2f-ee2c-49c4-a96c-f59e7ef28524_0(3f334e62bdb04dba8f8b5b10ec3909c1ff17fde81373e549f2df5fda67693c64): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:10 crc kubenswrapper[4878]: E1202 18:26:10.565998 4878 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-ctrvg_openshift-operators_fee86c2f-ee2c-49c4-a96c-f59e7ef28524_0(3f334e62bdb04dba8f8b5b10ec3909c1ff17fde81373e549f2df5fda67693c64): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:10 crc kubenswrapper[4878]: E1202 18:26:10.566058 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-ctrvg_openshift-operators(fee86c2f-ee2c-49c4-a96c-f59e7ef28524)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-ctrvg_openshift-operators(fee86c2f-ee2c-49c4-a96c-f59e7ef28524)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-ctrvg_openshift-operators_fee86c2f-ee2c-49c4-a96c-f59e7ef28524_0(3f334e62bdb04dba8f8b5b10ec3909c1ff17fde81373e549f2df5fda67693c64): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" podUID="fee86c2f-ee2c-49c4-a96c-f59e7ef28524" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.583204 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8fv5\" (UniqueName: \"kubernetes.io/projected/161358a3-71af-4def-b6a0-0ba9b5f2a7b3-kube-api-access-c8fv5\") pod \"perses-operator-5446b9c989-s7xc9\" (UID: \"161358a3-71af-4def-b6a0-0ba9b5f2a7b3\") " pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.583611 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/161358a3-71af-4def-b6a0-0ba9b5f2a7b3-openshift-service-ca\") pod \"perses-operator-5446b9c989-s7xc9\" (UID: \"161358a3-71af-4def-b6a0-0ba9b5f2a7b3\") " pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.584888 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/161358a3-71af-4def-b6a0-0ba9b5f2a7b3-openshift-service-ca\") pod \"perses-operator-5446b9c989-s7xc9\" (UID: \"161358a3-71af-4def-b6a0-0ba9b5f2a7b3\") " pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.607924 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8fv5\" (UniqueName: \"kubernetes.io/projected/161358a3-71af-4def-b6a0-0ba9b5f2a7b3-kube-api-access-c8fv5\") pod \"perses-operator-5446b9c989-s7xc9\" (UID: \"161358a3-71af-4def-b6a0-0ba9b5f2a7b3\") " pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:10 crc kubenswrapper[4878]: I1202 18:26:10.664664 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:10 crc kubenswrapper[4878]: E1202 18:26:10.689659 4878 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s7xc9_openshift-operators_161358a3-71af-4def-b6a0-0ba9b5f2a7b3_0(692a3dd1b97310d44a1ebf58bcae193c28e7ff29488d3f70b8c893c4e707825d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 18:26:10 crc kubenswrapper[4878]: E1202 18:26:10.689733 4878 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s7xc9_openshift-operators_161358a3-71af-4def-b6a0-0ba9b5f2a7b3_0(692a3dd1b97310d44a1ebf58bcae193c28e7ff29488d3f70b8c893c4e707825d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:10 crc kubenswrapper[4878]: E1202 18:26:10.689763 4878 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s7xc9_openshift-operators_161358a3-71af-4def-b6a0-0ba9b5f2a7b3_0(692a3dd1b97310d44a1ebf58bcae193c28e7ff29488d3f70b8c893c4e707825d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:10 crc kubenswrapper[4878]: E1202 18:26:10.689822 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-s7xc9_openshift-operators(161358a3-71af-4def-b6a0-0ba9b5f2a7b3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-s7xc9_openshift-operators(161358a3-71af-4def-b6a0-0ba9b5f2a7b3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s7xc9_openshift-operators_161358a3-71af-4def-b6a0-0ba9b5f2a7b3_0(692a3dd1b97310d44a1ebf58bcae193c28e7ff29488d3f70b8c893c4e707825d): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-s7xc9" podUID="161358a3-71af-4def-b6a0-0ba9b5f2a7b3" Dec 02 18:26:11 crc kubenswrapper[4878]: I1202 18:26:11.925656 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" event={"ID":"c4d3a7ce-356b-45b3-8d92-72b71b0f1362","Type":"ContainerStarted","Data":"ba3663efd42fbcf467a5c170d01652f181ba0baee76ff47f7e208312c00a5062"} Dec 02 18:26:11 crc kubenswrapper[4878]: I1202 18:26:11.926364 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:11 crc kubenswrapper[4878]: I1202 18:26:11.926383 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:11 crc kubenswrapper[4878]: I1202 18:26:11.926395 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:12 crc kubenswrapper[4878]: I1202 18:26:12.004703 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:12 crc kubenswrapper[4878]: I1202 18:26:12.031701 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:12 crc kubenswrapper[4878]: I1202 18:26:12.113772 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" podStartSLOduration=8.113753553 podStartE2EDuration="8.113753553s" podCreationTimestamp="2025-12-02 18:26:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:26:12.019699759 +0000 UTC m=+681.709318640" watchObservedRunningTime="2025-12-02 18:26:12.113753553 +0000 UTC m=+681.803372434" Dec 02 18:26:12 crc kubenswrapper[4878]: 
I1202 18:26:12.579306 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-ctrvg"] Dec 02 18:26:12 crc kubenswrapper[4878]: I1202 18:26:12.579503 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:12 crc kubenswrapper[4878]: I1202 18:26:12.580228 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:12 crc kubenswrapper[4878]: I1202 18:26:12.584774 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7"] Dec 02 18:26:12 crc kubenswrapper[4878]: I1202 18:26:12.585024 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" Dec 02 18:26:12 crc kubenswrapper[4878]: I1202 18:26:12.585768 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" Dec 02 18:26:12 crc kubenswrapper[4878]: I1202 18:26:12.590566 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx"] Dec 02 18:26:12 crc kubenswrapper[4878]: I1202 18:26:12.590743 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" Dec 02 18:26:12 crc kubenswrapper[4878]: I1202 18:26:12.591450 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" Dec 02 18:26:12 crc kubenswrapper[4878]: I1202 18:26:12.609204 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-s7xc9"] Dec 02 18:26:12 crc kubenswrapper[4878]: I1202 18:26:12.609614 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:12 crc kubenswrapper[4878]: I1202 18:26:12.610467 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:12 crc kubenswrapper[4878]: I1202 18:26:12.621874 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl"] Dec 02 18:26:12 crc kubenswrapper[4878]: I1202 18:26:12.622046 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" Dec 02 18:26:12 crc kubenswrapper[4878]: I1202 18:26:12.622655 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" Dec 02 18:26:12 crc kubenswrapper[4878]: E1202 18:26:12.650476 4878 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-ctrvg_openshift-operators_fee86c2f-ee2c-49c4-a96c-f59e7ef28524_0(6e69b996ea7434a26d7673b07ff3e79ec404bffaee648595f404078e03e1fbff): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 02 18:26:12 crc kubenswrapper[4878]: E1202 18:26:12.650567 4878 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-ctrvg_openshift-operators_fee86c2f-ee2c-49c4-a96c-f59e7ef28524_0(6e69b996ea7434a26d7673b07ff3e79ec404bffaee648595f404078e03e1fbff): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:12 crc kubenswrapper[4878]: E1202 18:26:12.650605 4878 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-ctrvg_openshift-operators_fee86c2f-ee2c-49c4-a96c-f59e7ef28524_0(6e69b996ea7434a26d7673b07ff3e79ec404bffaee648595f404078e03e1fbff): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:12 crc kubenswrapper[4878]: E1202 18:26:12.650675 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-ctrvg_openshift-operators(fee86c2f-ee2c-49c4-a96c-f59e7ef28524)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-ctrvg_openshift-operators(fee86c2f-ee2c-49c4-a96c-f59e7ef28524)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-ctrvg_openshift-operators_fee86c2f-ee2c-49c4-a96c-f59e7ef28524_0(6e69b996ea7434a26d7673b07ff3e79ec404bffaee648595f404078e03e1fbff): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" podUID="fee86c2f-ee2c-49c4-a96c-f59e7ef28524" Dec 02 18:26:12 crc kubenswrapper[4878]: E1202 18:26:12.675671 4878 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-ffbv7_openshift-operators_302b5d81-5163-4052-a986-6fbdda49e9cf_0(7736bc3e6dc1e1a9ce2c5ad50b93649ddaf0ef87ce5ad3c1276f3b4afb11d9e0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 18:26:12 crc kubenswrapper[4878]: E1202 18:26:12.675738 4878 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-ffbv7_openshift-operators_302b5d81-5163-4052-a986-6fbdda49e9cf_0(7736bc3e6dc1e1a9ce2c5ad50b93649ddaf0ef87ce5ad3c1276f3b4afb11d9e0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" Dec 02 18:26:12 crc kubenswrapper[4878]: E1202 18:26:12.675761 4878 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-ffbv7_openshift-operators_302b5d81-5163-4052-a986-6fbdda49e9cf_0(7736bc3e6dc1e1a9ce2c5ad50b93649ddaf0ef87ce5ad3c1276f3b4afb11d9e0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" Dec 02 18:26:12 crc kubenswrapper[4878]: E1202 18:26:12.675804 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-ffbv7_openshift-operators(302b5d81-5163-4052-a986-6fbdda49e9cf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-ffbv7_openshift-operators(302b5d81-5163-4052-a986-6fbdda49e9cf)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-ffbv7_openshift-operators_302b5d81-5163-4052-a986-6fbdda49e9cf_0(7736bc3e6dc1e1a9ce2c5ad50b93649ddaf0ef87ce5ad3c1276f3b4afb11d9e0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" podUID="302b5d81-5163-4052-a986-6fbdda49e9cf" Dec 02 18:26:12 crc kubenswrapper[4878]: E1202 18:26:12.690330 4878 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-749dx_openshift-operators_baa1d367-077f-4aa3-8dca-5a56cff08838_0(796c5c2e1e96ce14ce388c82a50a179f4147031b4351e80899e51371c83e7662): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 18:26:12 crc kubenswrapper[4878]: E1202 18:26:12.690440 4878 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-749dx_openshift-operators_baa1d367-077f-4aa3-8dca-5a56cff08838_0(796c5c2e1e96ce14ce388c82a50a179f4147031b4351e80899e51371c83e7662): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" Dec 02 18:26:12 crc kubenswrapper[4878]: E1202 18:26:12.690473 4878 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-749dx_openshift-operators_baa1d367-077f-4aa3-8dca-5a56cff08838_0(796c5c2e1e96ce14ce388c82a50a179f4147031b4351e80899e51371c83e7662): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" Dec 02 18:26:12 crc kubenswrapper[4878]: E1202 18:26:12.690546 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-55bb47f485-749dx_openshift-operators(baa1d367-077f-4aa3-8dca-5a56cff08838)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-55bb47f485-749dx_openshift-operators(baa1d367-077f-4aa3-8dca-5a56cff08838)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-749dx_openshift-operators_baa1d367-077f-4aa3-8dca-5a56cff08838_0(796c5c2e1e96ce14ce388c82a50a179f4147031b4351e80899e51371c83e7662): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" podUID="baa1d367-077f-4aa3-8dca-5a56cff08838" Dec 02 18:26:12 crc kubenswrapper[4878]: E1202 18:26:12.698509 4878 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl_openshift-operators_95fa4f57-a446-402b-9de4-5ff0d8109802_0(448e47e9fd3d32998132cbf191195494b7562ce4e33014060a01b115bb75488a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 18:26:12 crc kubenswrapper[4878]: E1202 18:26:12.698594 4878 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl_openshift-operators_95fa4f57-a446-402b-9de4-5ff0d8109802_0(448e47e9fd3d32998132cbf191195494b7562ce4e33014060a01b115bb75488a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" Dec 02 18:26:12 crc kubenswrapper[4878]: E1202 18:26:12.698620 4878 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl_openshift-operators_95fa4f57-a446-402b-9de4-5ff0d8109802_0(448e47e9fd3d32998132cbf191195494b7562ce4e33014060a01b115bb75488a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" Dec 02 18:26:12 crc kubenswrapper[4878]: E1202 18:26:12.698682 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl_openshift-operators(95fa4f57-a446-402b-9de4-5ff0d8109802)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl_openshift-operators(95fa4f57-a446-402b-9de4-5ff0d8109802)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl_openshift-operators_95fa4f57-a446-402b-9de4-5ff0d8109802_0(448e47e9fd3d32998132cbf191195494b7562ce4e33014060a01b115bb75488a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" podUID="95fa4f57-a446-402b-9de4-5ff0d8109802" Dec 02 18:26:12 crc kubenswrapper[4878]: E1202 18:26:12.702814 4878 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s7xc9_openshift-operators_161358a3-71af-4def-b6a0-0ba9b5f2a7b3_0(f03985280f301b97d4894f1f2cdf15720a2dab6a0b9fd61e08d382a747277922): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 18:26:12 crc kubenswrapper[4878]: E1202 18:26:12.702855 4878 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s7xc9_openshift-operators_161358a3-71af-4def-b6a0-0ba9b5f2a7b3_0(f03985280f301b97d4894f1f2cdf15720a2dab6a0b9fd61e08d382a747277922): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:12 crc kubenswrapper[4878]: E1202 18:26:12.702873 4878 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s7xc9_openshift-operators_161358a3-71af-4def-b6a0-0ba9b5f2a7b3_0(f03985280f301b97d4894f1f2cdf15720a2dab6a0b9fd61e08d382a747277922): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:12 crc kubenswrapper[4878]: E1202 18:26:12.702907 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-s7xc9_openshift-operators(161358a3-71af-4def-b6a0-0ba9b5f2a7b3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-s7xc9_openshift-operators(161358a3-71af-4def-b6a0-0ba9b5f2a7b3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s7xc9_openshift-operators_161358a3-71af-4def-b6a0-0ba9b5f2a7b3_0(f03985280f301b97d4894f1f2cdf15720a2dab6a0b9fd61e08d382a747277922): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-s7xc9" podUID="161358a3-71af-4def-b6a0-0ba9b5f2a7b3" Dec 02 18:26:16 crc kubenswrapper[4878]: I1202 18:26:16.938081 4878 scope.go:117] "RemoveContainer" containerID="8d8e064c8177248bf254025158f61f6dfa81e6a00b21ef6624c736c7a6a8fdaf" Dec 02 18:26:16 crc kubenswrapper[4878]: E1202 18:26:16.939179 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-6cm9t_openshift-multus(e79a8cec-20ba-4862-ba25-7de014466668)\"" pod="openshift-multus/multus-6cm9t" podUID="e79a8cec-20ba-4862-ba25-7de014466668" Dec 02 18:26:23 crc kubenswrapper[4878]: I1202 18:26:23.937821 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" Dec 02 18:26:23 crc kubenswrapper[4878]: I1202 18:26:23.937931 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" Dec 02 18:26:23 crc kubenswrapper[4878]: I1202 18:26:23.939450 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" Dec 02 18:26:23 crc kubenswrapper[4878]: I1202 18:26:23.939465 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" Dec 02 18:26:23 crc kubenswrapper[4878]: E1202 18:26:23.989889 4878 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-749dx_openshift-operators_baa1d367-077f-4aa3-8dca-5a56cff08838_0(66edb4144a183b847375c10f967c2293b25e185478522cb674981cd65ca8f557): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 18:26:23 crc kubenswrapper[4878]: E1202 18:26:23.989962 4878 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-749dx_openshift-operators_baa1d367-077f-4aa3-8dca-5a56cff08838_0(66edb4144a183b847375c10f967c2293b25e185478522cb674981cd65ca8f557): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" Dec 02 18:26:23 crc kubenswrapper[4878]: E1202 18:26:23.990003 4878 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-749dx_openshift-operators_baa1d367-077f-4aa3-8dca-5a56cff08838_0(66edb4144a183b847375c10f967c2293b25e185478522cb674981cd65ca8f557): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" Dec 02 18:26:23 crc kubenswrapper[4878]: E1202 18:26:23.990060 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-55bb47f485-749dx_openshift-operators(baa1d367-077f-4aa3-8dca-5a56cff08838)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-55bb47f485-749dx_openshift-operators(baa1d367-077f-4aa3-8dca-5a56cff08838)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-749dx_openshift-operators_baa1d367-077f-4aa3-8dca-5a56cff08838_0(66edb4144a183b847375c10f967c2293b25e185478522cb674981cd65ca8f557): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" podUID="baa1d367-077f-4aa3-8dca-5a56cff08838" Dec 02 18:26:24 crc kubenswrapper[4878]: E1202 18:26:24.008041 4878 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl_openshift-operators_95fa4f57-a446-402b-9de4-5ff0d8109802_0(13191b9c3b02188d6fb5ba6737c29be24f84b64bc0615471df38d7bc48b1eacb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 18:26:24 crc kubenswrapper[4878]: E1202 18:26:24.008122 4878 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl_openshift-operators_95fa4f57-a446-402b-9de4-5ff0d8109802_0(13191b9c3b02188d6fb5ba6737c29be24f84b64bc0615471df38d7bc48b1eacb): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" Dec 02 18:26:24 crc kubenswrapper[4878]: E1202 18:26:24.008149 4878 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl_openshift-operators_95fa4f57-a446-402b-9de4-5ff0d8109802_0(13191b9c3b02188d6fb5ba6737c29be24f84b64bc0615471df38d7bc48b1eacb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" Dec 02 18:26:24 crc kubenswrapper[4878]: E1202 18:26:24.008227 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl_openshift-operators(95fa4f57-a446-402b-9de4-5ff0d8109802)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl_openshift-operators(95fa4f57-a446-402b-9de4-5ff0d8109802)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl_openshift-operators_95fa4f57-a446-402b-9de4-5ff0d8109802_0(13191b9c3b02188d6fb5ba6737c29be24f84b64bc0615471df38d7bc48b1eacb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" podUID="95fa4f57-a446-402b-9de4-5ff0d8109802" Dec 02 18:26:25 crc kubenswrapper[4878]: I1202 18:26:25.937276 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:25 crc kubenswrapper[4878]: I1202 18:26:25.938621 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:25 crc kubenswrapper[4878]: E1202 18:26:25.968218 4878 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s7xc9_openshift-operators_161358a3-71af-4def-b6a0-0ba9b5f2a7b3_0(2e7453c9d599b38d587f1a80e00ee7ada5025393086fbb5d07bee6450729383d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 18:26:25 crc kubenswrapper[4878]: E1202 18:26:25.968348 4878 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s7xc9_openshift-operators_161358a3-71af-4def-b6a0-0ba9b5f2a7b3_0(2e7453c9d599b38d587f1a80e00ee7ada5025393086fbb5d07bee6450729383d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:25 crc kubenswrapper[4878]: E1202 18:26:25.968377 4878 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s7xc9_openshift-operators_161358a3-71af-4def-b6a0-0ba9b5f2a7b3_0(2e7453c9d599b38d587f1a80e00ee7ada5025393086fbb5d07bee6450729383d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:25 crc kubenswrapper[4878]: E1202 18:26:25.968446 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-s7xc9_openshift-operators(161358a3-71af-4def-b6a0-0ba9b5f2a7b3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-s7xc9_openshift-operators(161358a3-71af-4def-b6a0-0ba9b5f2a7b3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-s7xc9_openshift-operators_161358a3-71af-4def-b6a0-0ba9b5f2a7b3_0(2e7453c9d599b38d587f1a80e00ee7ada5025393086fbb5d07bee6450729383d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-s7xc9" podUID="161358a3-71af-4def-b6a0-0ba9b5f2a7b3" Dec 02 18:26:26 crc kubenswrapper[4878]: I1202 18:26:26.937814 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" Dec 02 18:26:26 crc kubenswrapper[4878]: I1202 18:26:26.938576 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" Dec 02 18:26:26 crc kubenswrapper[4878]: E1202 18:26:26.973112 4878 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-ffbv7_openshift-operators_302b5d81-5163-4052-a986-6fbdda49e9cf_0(db7c752248567bf662d5a84dd1375643bd3d2731c176717f35f1a713bfe93115): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 02 18:26:26 crc kubenswrapper[4878]: E1202 18:26:26.973209 4878 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-ffbv7_openshift-operators_302b5d81-5163-4052-a986-6fbdda49e9cf_0(db7c752248567bf662d5a84dd1375643bd3d2731c176717f35f1a713bfe93115): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" Dec 02 18:26:26 crc kubenswrapper[4878]: E1202 18:26:26.973254 4878 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-ffbv7_openshift-operators_302b5d81-5163-4052-a986-6fbdda49e9cf_0(db7c752248567bf662d5a84dd1375643bd3d2731c176717f35f1a713bfe93115): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" Dec 02 18:26:26 crc kubenswrapper[4878]: E1202 18:26:26.973325 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-ffbv7_openshift-operators(302b5d81-5163-4052-a986-6fbdda49e9cf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-ffbv7_openshift-operators(302b5d81-5163-4052-a986-6fbdda49e9cf)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-ffbv7_openshift-operators_302b5d81-5163-4052-a986-6fbdda49e9cf_0(db7c752248567bf662d5a84dd1375643bd3d2731c176717f35f1a713bfe93115): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" podUID="302b5d81-5163-4052-a986-6fbdda49e9cf" Dec 02 18:26:27 crc kubenswrapper[4878]: I1202 18:26:27.937281 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:27 crc kubenswrapper[4878]: I1202 18:26:27.938877 4878 scope.go:117] "RemoveContainer" containerID="8d8e064c8177248bf254025158f61f6dfa81e6a00b21ef6624c736c7a6a8fdaf" Dec 02 18:26:27 crc kubenswrapper[4878]: I1202 18:26:27.939184 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:27 crc kubenswrapper[4878]: E1202 18:26:27.977545 4878 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-ctrvg_openshift-operators_fee86c2f-ee2c-49c4-a96c-f59e7ef28524_0(eb82051a4e2676419e65a1e957d4b43e17448e1690cfbbd91458bae810954b5f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 18:26:27 crc kubenswrapper[4878]: E1202 18:26:27.977661 4878 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-ctrvg_openshift-operators_fee86c2f-ee2c-49c4-a96c-f59e7ef28524_0(eb82051a4e2676419e65a1e957d4b43e17448e1690cfbbd91458bae810954b5f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:27 crc kubenswrapper[4878]: E1202 18:26:27.977693 4878 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-ctrvg_openshift-operators_fee86c2f-ee2c-49c4-a96c-f59e7ef28524_0(eb82051a4e2676419e65a1e957d4b43e17448e1690cfbbd91458bae810954b5f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:27 crc kubenswrapper[4878]: E1202 18:26:27.977759 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-ctrvg_openshift-operators(fee86c2f-ee2c-49c4-a96c-f59e7ef28524)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-ctrvg_openshift-operators(fee86c2f-ee2c-49c4-a96c-f59e7ef28524)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-ctrvg_openshift-operators_fee86c2f-ee2c-49c4-a96c-f59e7ef28524_0(eb82051a4e2676419e65a1e957d4b43e17448e1690cfbbd91458bae810954b5f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" podUID="fee86c2f-ee2c-49c4-a96c-f59e7ef28524" Dec 02 18:26:29 crc kubenswrapper[4878]: I1202 18:26:29.041908 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6cm9t_e79a8cec-20ba-4862-ba25-7de014466668/kube-multus/2.log" Dec 02 18:26:29 crc kubenswrapper[4878]: I1202 18:26:29.042383 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6cm9t" event={"ID":"e79a8cec-20ba-4862-ba25-7de014466668","Type":"ContainerStarted","Data":"2ce687151051bca1741488b43623868370b4016692c57f60408cba5e9879ea41"} Dec 02 18:26:34 crc kubenswrapper[4878]: I1202 18:26:34.937332 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" Dec 02 18:26:34 crc kubenswrapper[4878]: I1202 18:26:34.938492 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" Dec 02 18:26:35 crc kubenswrapper[4878]: I1202 18:26:35.181804 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl"] Dec 02 18:26:35 crc kubenswrapper[4878]: I1202 18:26:35.341626 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dwtg4" Dec 02 18:26:36 crc kubenswrapper[4878]: I1202 18:26:36.098387 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" event={"ID":"95fa4f57-a446-402b-9de4-5ff0d8109802","Type":"ContainerStarted","Data":"24950ed7e7df250d5c5ae7fffe1ab2072d645fea2f8084c102517acc98b9d11a"} Dec 02 18:26:36 crc kubenswrapper[4878]: I1202 18:26:36.937098 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" Dec 02 18:26:36 crc kubenswrapper[4878]: I1202 18:26:36.938469 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" Dec 02 18:26:37 crc kubenswrapper[4878]: I1202 18:26:37.347289 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx"] Dec 02 18:26:37 crc kubenswrapper[4878]: W1202 18:26:37.361044 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaa1d367_077f_4aa3_8dca_5a56cff08838.slice/crio-0a25834ef3b72e6fe4ae56fd70f69db9cb9da605b038d7cd3fc5e50765c2c4e8 WatchSource:0}: Error finding container 0a25834ef3b72e6fe4ae56fd70f69db9cb9da605b038d7cd3fc5e50765c2c4e8: Status 404 returned error can't find the container with id 0a25834ef3b72e6fe4ae56fd70f69db9cb9da605b038d7cd3fc5e50765c2c4e8 Dec 02 18:26:37 crc kubenswrapper[4878]: I1202 18:26:37.936850 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:37 crc kubenswrapper[4878]: I1202 18:26:37.937469 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:38 crc kubenswrapper[4878]: I1202 18:26:38.122515 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" event={"ID":"baa1d367-077f-4aa3-8dca-5a56cff08838","Type":"ContainerStarted","Data":"0a25834ef3b72e6fe4ae56fd70f69db9cb9da605b038d7cd3fc5e50765c2c4e8"} Dec 02 18:26:38 crc kubenswrapper[4878]: I1202 18:26:38.165007 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-s7xc9"] Dec 02 18:26:38 crc kubenswrapper[4878]: W1202 18:26:38.167372 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod161358a3_71af_4def_b6a0_0ba9b5f2a7b3.slice/crio-bdf021b531ab71e535844b1e8b47d2e4908bc32694230552f8b2ce785caa12d9 WatchSource:0}: Error finding container bdf021b531ab71e535844b1e8b47d2e4908bc32694230552f8b2ce785caa12d9: Status 404 returned error can't find the container with id bdf021b531ab71e535844b1e8b47d2e4908bc32694230552f8b2ce785caa12d9 Dec 02 18:26:38 crc kubenswrapper[4878]: I1202 18:26:38.942686 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:38 crc kubenswrapper[4878]: I1202 18:26:38.943176 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:39 crc kubenswrapper[4878]: I1202 18:26:39.141571 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-s7xc9" event={"ID":"161358a3-71af-4def-b6a0-0ba9b5f2a7b3","Type":"ContainerStarted","Data":"bdf021b531ab71e535844b1e8b47d2e4908bc32694230552f8b2ce785caa12d9"} Dec 02 18:26:39 crc kubenswrapper[4878]: I1202 18:26:39.195936 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-ctrvg"] Dec 02 18:26:41 crc kubenswrapper[4878]: I1202 18:26:41.936936 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" Dec 02 18:26:41 crc kubenswrapper[4878]: I1202 18:26:41.938124 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" Dec 02 18:26:42 crc kubenswrapper[4878]: W1202 18:26:42.047700 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfee86c2f_ee2c_49c4_a96c_f59e7ef28524.slice/crio-4c1f35a87845231d1f19b64b2de5178ad1e57fa398b4732912d169e809dc76bb WatchSource:0}: Error finding container 4c1f35a87845231d1f19b64b2de5178ad1e57fa398b4732912d169e809dc76bb: Status 404 returned error can't find the container with id 4c1f35a87845231d1f19b64b2de5178ad1e57fa398b4732912d169e809dc76bb Dec 02 18:26:42 crc kubenswrapper[4878]: I1202 18:26:42.167367 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" event={"ID":"fee86c2f-ee2c-49c4-a96c-f59e7ef28524","Type":"ContainerStarted","Data":"4c1f35a87845231d1f19b64b2de5178ad1e57fa398b4732912d169e809dc76bb"} Dec 02 18:26:43 crc kubenswrapper[4878]: I1202 18:26:43.082451 4878 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7"] Dec 02 18:26:43 crc kubenswrapper[4878]: I1202 18:26:43.176316 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" event={"ID":"95fa4f57-a446-402b-9de4-5ff0d8109802","Type":"ContainerStarted","Data":"6c39eaeaebb9b467133a913b48313c7d3691683c8e5b6eb532b5e548ff20b84a"} Dec 02 18:26:43 crc kubenswrapper[4878]: I1202 18:26:43.179500 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" event={"ID":"baa1d367-077f-4aa3-8dca-5a56cff08838","Type":"ContainerStarted","Data":"fbb95ffff62b9ce7c78c0d91aa485ff6cb560eee97b4d00e0fdc7dc18e743b15"} Dec 02 18:26:43 crc kubenswrapper[4878]: I1202 18:26:43.207924 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl" podStartSLOduration=26.676014177 podStartE2EDuration="34.207893994s" podCreationTimestamp="2025-12-02 18:26:09 +0000 UTC" firstStartedPulling="2025-12-02 18:26:35.199588936 +0000 UTC m=+704.889207817" lastFinishedPulling="2025-12-02 18:26:42.731468753 +0000 UTC m=+712.421087634" observedRunningTime="2025-12-02 18:26:43.195996652 +0000 UTC m=+712.885615543" watchObservedRunningTime="2025-12-02 18:26:43.207893994 +0000 UTC m=+712.897512875" Dec 02 18:26:43 crc kubenswrapper[4878]: I1202 18:26:43.224943 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-55bb47f485-749dx" podStartSLOduration=28.831414318 podStartE2EDuration="34.224925387s" podCreationTimestamp="2025-12-02 18:26:09 +0000 UTC" firstStartedPulling="2025-12-02 18:26:37.366310261 +0000 UTC m=+707.055929142" lastFinishedPulling="2025-12-02 18:26:42.75982133 +0000 UTC m=+712.449440211" observedRunningTime="2025-12-02 
18:26:43.224088941 +0000 UTC m=+712.913707822" watchObservedRunningTime="2025-12-02 18:26:43.224925387 +0000 UTC m=+712.914544268" Dec 02 18:26:43 crc kubenswrapper[4878]: W1202 18:26:43.767831 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod302b5d81_5163_4052_a986_6fbdda49e9cf.slice/crio-8a578fe55dca296401dd21bb64053b1c5a237c4643f6359068f9aa1bce3fe2d6 WatchSource:0}: Error finding container 8a578fe55dca296401dd21bb64053b1c5a237c4643f6359068f9aa1bce3fe2d6: Status 404 returned error can't find the container with id 8a578fe55dca296401dd21bb64053b1c5a237c4643f6359068f9aa1bce3fe2d6 Dec 02 18:26:44 crc kubenswrapper[4878]: I1202 18:26:44.205525 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-s7xc9" event={"ID":"161358a3-71af-4def-b6a0-0ba9b5f2a7b3","Type":"ContainerStarted","Data":"15ae1159296375811df24e24137b46fd865901affb8e11996376f08ca300962f"} Dec 02 18:26:44 crc kubenswrapper[4878]: I1202 18:26:44.205999 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:44 crc kubenswrapper[4878]: I1202 18:26:44.208269 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" event={"ID":"302b5d81-5163-4052-a986-6fbdda49e9cf","Type":"ContainerStarted","Data":"8a578fe55dca296401dd21bb64053b1c5a237c4643f6359068f9aa1bce3fe2d6"} Dec 02 18:26:44 crc kubenswrapper[4878]: I1202 18:26:44.233902 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-s7xc9" podStartSLOduration=28.564504173 podStartE2EDuration="34.233862456s" podCreationTimestamp="2025-12-02 18:26:10 +0000 UTC" firstStartedPulling="2025-12-02 18:26:38.172017018 +0000 UTC m=+707.861635899" lastFinishedPulling="2025-12-02 18:26:43.841375301 +0000 UTC 
m=+713.530994182" observedRunningTime="2025-12-02 18:26:44.223660277 +0000 UTC m=+713.913279198" watchObservedRunningTime="2025-12-02 18:26:44.233862456 +0000 UTC m=+713.923481337" Dec 02 18:26:48 crc kubenswrapper[4878]: I1202 18:26:48.237630 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" event={"ID":"302b5d81-5163-4052-a986-6fbdda49e9cf","Type":"ContainerStarted","Data":"79e04f9b57d525d833137fa3ca4db374bc5a5c5cd1574f1531e53f31481675a0"} Dec 02 18:26:48 crc kubenswrapper[4878]: I1202 18:26:48.241541 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" event={"ID":"fee86c2f-ee2c-49c4-a96c-f59e7ef28524","Type":"ContainerStarted","Data":"f3ab1fe6d910c0b42c78ebfd082bf5912da2036db4b60028f3deb6a04baca07e"} Dec 02 18:26:48 crc kubenswrapper[4878]: I1202 18:26:48.241856 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:48 crc kubenswrapper[4878]: I1202 18:26:48.243392 4878 patch_prober.go:28] interesting pod/observability-operator-d8bb48f5d-ctrvg container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.7:8081/healthz\": dial tcp 10.217.0.7:8081: connect: connection refused" start-of-body= Dec 02 18:26:48 crc kubenswrapper[4878]: I1202 18:26:48.243452 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" podUID="fee86c2f-ee2c-49c4-a96c-f59e7ef28524" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.7:8081/healthz\": dial tcp 10.217.0.7:8081: connect: connection refused" Dec 02 18:26:48 crc kubenswrapper[4878]: I1202 18:26:48.271294 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-ffbv7" 
podStartSLOduration=35.201889865 podStartE2EDuration="39.271263001s" podCreationTimestamp="2025-12-02 18:26:09 +0000 UTC" firstStartedPulling="2025-12-02 18:26:43.772109713 +0000 UTC m=+713.461728594" lastFinishedPulling="2025-12-02 18:26:47.841482849 +0000 UTC m=+717.531101730" observedRunningTime="2025-12-02 18:26:48.261279378 +0000 UTC m=+717.950898279" watchObservedRunningTime="2025-12-02 18:26:48.271263001 +0000 UTC m=+717.960881882" Dec 02 18:26:48 crc kubenswrapper[4878]: I1202 18:26:48.290101 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" podStartSLOduration=32.479683812 podStartE2EDuration="38.290071359s" podCreationTimestamp="2025-12-02 18:26:10 +0000 UTC" firstStartedPulling="2025-12-02 18:26:42.052461251 +0000 UTC m=+711.742080142" lastFinishedPulling="2025-12-02 18:26:47.862848808 +0000 UTC m=+717.552467689" observedRunningTime="2025-12-02 18:26:48.284856796 +0000 UTC m=+717.974475677" watchObservedRunningTime="2025-12-02 18:26:48.290071359 +0000 UTC m=+717.979690250" Dec 02 18:26:49 crc kubenswrapper[4878]: I1202 18:26:49.266737 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-ctrvg" Dec 02 18:26:50 crc kubenswrapper[4878]: I1202 18:26:50.667738 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-s7xc9" Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.298096 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-xhjz5"] Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.300084 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-xhjz5" Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.302166 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.302909 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.303017 4878 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-gl6w6" Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.335843 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-xhjz5"] Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.367934 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-7mgrc"] Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.369447 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-7mgrc" Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.375361 4878 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-fpsqv" Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.376226 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fk5s\" (UniqueName: \"kubernetes.io/projected/c5196963-9d92-4f0a-ab8c-47f4b86a685f-kube-api-access-7fk5s\") pod \"cert-manager-cainjector-7f985d654d-xhjz5\" (UID: \"c5196963-9d92-4f0a-ab8c-47f4b86a685f\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-xhjz5" Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.386966 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-7mgrc"] Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.418503 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-qfhjc"] Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.419767 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-qfhjc" Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.423172 4878 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-k46mj" Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.433353 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-qfhjc"] Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.477803 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fk5s\" (UniqueName: \"kubernetes.io/projected/c5196963-9d92-4f0a-ab8c-47f4b86a685f-kube-api-access-7fk5s\") pod \"cert-manager-cainjector-7f985d654d-xhjz5\" (UID: \"c5196963-9d92-4f0a-ab8c-47f4b86a685f\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-xhjz5" Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.477860 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqrbf\" (UniqueName: \"kubernetes.io/projected/bb76342f-0435-4589-826f-3a7cee8cc419-kube-api-access-dqrbf\") pod \"cert-manager-webhook-5655c58dd6-qfhjc\" (UID: \"bb76342f-0435-4589-826f-3a7cee8cc419\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-qfhjc" Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.477901 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47csc\" (UniqueName: \"kubernetes.io/projected/4311fcfc-1cf4-4bab-b946-40efef5b8c10-kube-api-access-47csc\") pod \"cert-manager-5b446d88c5-7mgrc\" (UID: \"4311fcfc-1cf4-4bab-b946-40efef5b8c10\") " pod="cert-manager/cert-manager-5b446d88c5-7mgrc" Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.519248 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fk5s\" (UniqueName: 
\"kubernetes.io/projected/c5196963-9d92-4f0a-ab8c-47f4b86a685f-kube-api-access-7fk5s\") pod \"cert-manager-cainjector-7f985d654d-xhjz5\" (UID: \"c5196963-9d92-4f0a-ab8c-47f4b86a685f\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-xhjz5" Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.579783 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47csc\" (UniqueName: \"kubernetes.io/projected/4311fcfc-1cf4-4bab-b946-40efef5b8c10-kube-api-access-47csc\") pod \"cert-manager-5b446d88c5-7mgrc\" (UID: \"4311fcfc-1cf4-4bab-b946-40efef5b8c10\") " pod="cert-manager/cert-manager-5b446d88c5-7mgrc" Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.580339 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqrbf\" (UniqueName: \"kubernetes.io/projected/bb76342f-0435-4589-826f-3a7cee8cc419-kube-api-access-dqrbf\") pod \"cert-manager-webhook-5655c58dd6-qfhjc\" (UID: \"bb76342f-0435-4589-826f-3a7cee8cc419\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-qfhjc" Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.608827 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47csc\" (UniqueName: \"kubernetes.io/projected/4311fcfc-1cf4-4bab-b946-40efef5b8c10-kube-api-access-47csc\") pod \"cert-manager-5b446d88c5-7mgrc\" (UID: \"4311fcfc-1cf4-4bab-b946-40efef5b8c10\") " pod="cert-manager/cert-manager-5b446d88c5-7mgrc" Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.615848 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqrbf\" (UniqueName: \"kubernetes.io/projected/bb76342f-0435-4589-826f-3a7cee8cc419-kube-api-access-dqrbf\") pod \"cert-manager-webhook-5655c58dd6-qfhjc\" (UID: \"bb76342f-0435-4589-826f-3a7cee8cc419\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-qfhjc" Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.621454 4878 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-xhjz5" Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.707958 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-7mgrc" Dec 02 18:26:55 crc kubenswrapper[4878]: I1202 18:26:55.741775 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-qfhjc" Dec 02 18:26:56 crc kubenswrapper[4878]: I1202 18:26:56.008886 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-xhjz5"] Dec 02 18:26:56 crc kubenswrapper[4878]: I1202 18:26:56.285728 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-7mgrc"] Dec 02 18:26:56 crc kubenswrapper[4878]: W1202 18:26:56.292906 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4311fcfc_1cf4_4bab_b946_40efef5b8c10.slice/crio-69ffabbab72306609831c9cf2a696aec0c70c1eed281798b31323523bf01f453 WatchSource:0}: Error finding container 69ffabbab72306609831c9cf2a696aec0c70c1eed281798b31323523bf01f453: Status 404 returned error can't find the container with id 69ffabbab72306609831c9cf2a696aec0c70c1eed281798b31323523bf01f453 Dec 02 18:26:56 crc kubenswrapper[4878]: I1202 18:26:56.305733 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-qfhjc"] Dec 02 18:26:56 crc kubenswrapper[4878]: W1202 18:26:56.310454 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb76342f_0435_4589_826f_3a7cee8cc419.slice/crio-ccb13232202aa808532d1d316ec8e359abbf99cf0e5348096e3f89be765e4bee WatchSource:0}: Error finding container ccb13232202aa808532d1d316ec8e359abbf99cf0e5348096e3f89be765e4bee: Status 404 returned error can't find 
the container with id ccb13232202aa808532d1d316ec8e359abbf99cf0e5348096e3f89be765e4bee Dec 02 18:26:56 crc kubenswrapper[4878]: I1202 18:26:56.341026 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-xhjz5" event={"ID":"c5196963-9d92-4f0a-ab8c-47f4b86a685f","Type":"ContainerStarted","Data":"74f27b31d1823f14645e5a19d1c5b3d1cdf1e6b2a1c6aac7232943cff0c1c574"} Dec 02 18:26:56 crc kubenswrapper[4878]: I1202 18:26:56.342986 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-7mgrc" event={"ID":"4311fcfc-1cf4-4bab-b946-40efef5b8c10","Type":"ContainerStarted","Data":"69ffabbab72306609831c9cf2a696aec0c70c1eed281798b31323523bf01f453"} Dec 02 18:26:56 crc kubenswrapper[4878]: I1202 18:26:56.344226 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-qfhjc" event={"ID":"bb76342f-0435-4589-826f-3a7cee8cc419","Type":"ContainerStarted","Data":"ccb13232202aa808532d1d316ec8e359abbf99cf0e5348096e3f89be765e4bee"} Dec 02 18:27:00 crc kubenswrapper[4878]: I1202 18:27:00.393684 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-xhjz5" event={"ID":"c5196963-9d92-4f0a-ab8c-47f4b86a685f","Type":"ContainerStarted","Data":"db111c7b02ed6578263dbb598ce084959dd8166790fa4c5aa26f502788c09e0a"} Dec 02 18:27:00 crc kubenswrapper[4878]: I1202 18:27:00.400253 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-qfhjc" event={"ID":"bb76342f-0435-4589-826f-3a7cee8cc419","Type":"ContainerStarted","Data":"b40e8df58025685c20ccdde12f2edbd0d90a411a737557de7f1cdf16edf003bd"} Dec 02 18:27:00 crc kubenswrapper[4878]: I1202 18:27:00.401261 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-qfhjc" Dec 02 18:27:00 crc kubenswrapper[4878]: I1202 18:27:00.414155 4878 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-xhjz5" podStartSLOduration=2.120655965 podStartE2EDuration="5.414131065s" podCreationTimestamp="2025-12-02 18:26:55 +0000 UTC" firstStartedPulling="2025-12-02 18:26:56.020881603 +0000 UTC m=+725.710500484" lastFinishedPulling="2025-12-02 18:26:59.314356683 +0000 UTC m=+729.003975584" observedRunningTime="2025-12-02 18:27:00.410279555 +0000 UTC m=+730.099898456" watchObservedRunningTime="2025-12-02 18:27:00.414131065 +0000 UTC m=+730.103749956" Dec 02 18:27:00 crc kubenswrapper[4878]: I1202 18:27:00.435522 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-qfhjc" podStartSLOduration=2.485592045 podStartE2EDuration="5.435494514s" podCreationTimestamp="2025-12-02 18:26:55 +0000 UTC" firstStartedPulling="2025-12-02 18:26:56.312997555 +0000 UTC m=+726.002616436" lastFinishedPulling="2025-12-02 18:26:59.262900024 +0000 UTC m=+728.952518905" observedRunningTime="2025-12-02 18:27:00.432973065 +0000 UTC m=+730.122591946" watchObservedRunningTime="2025-12-02 18:27:00.435494514 +0000 UTC m=+730.125113385" Dec 02 18:27:01 crc kubenswrapper[4878]: I1202 18:27:01.412604 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-7mgrc" event={"ID":"4311fcfc-1cf4-4bab-b946-40efef5b8c10","Type":"ContainerStarted","Data":"76c98d104f6f1ba4ca103b92077af67fedea44ec21e30f051f3b8d3aceaf1a1c"} Dec 02 18:27:01 crc kubenswrapper[4878]: I1202 18:27:01.433318 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-7mgrc" podStartSLOduration=2.089349315 podStartE2EDuration="6.433289253s" podCreationTimestamp="2025-12-02 18:26:55 +0000 UTC" firstStartedPulling="2025-12-02 18:26:56.297941345 +0000 UTC m=+725.987560226" lastFinishedPulling="2025-12-02 18:27:00.641881283 +0000 UTC m=+730.331500164" 
observedRunningTime="2025-12-02 18:27:01.432320963 +0000 UTC m=+731.121939884" watchObservedRunningTime="2025-12-02 18:27:01.433289253 +0000 UTC m=+731.122908174" Dec 02 18:27:05 crc kubenswrapper[4878]: I1202 18:27:05.745063 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-qfhjc" Dec 02 18:27:28 crc kubenswrapper[4878]: I1202 18:27:28.724223 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r"] Dec 02 18:27:28 crc kubenswrapper[4878]: I1202 18:27:28.728586 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r" Dec 02 18:27:28 crc kubenswrapper[4878]: I1202 18:27:28.731185 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 18:27:28 crc kubenswrapper[4878]: I1202 18:27:28.741657 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r"] Dec 02 18:27:28 crc kubenswrapper[4878]: I1202 18:27:28.787428 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slngw\" (UniqueName: \"kubernetes.io/projected/abd35f96-61cf-48b3-b66d-c3d54414bb74-kube-api-access-slngw\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r\" (UID: \"abd35f96-61cf-48b3-b66d-c3d54414bb74\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r" Dec 02 18:27:28 crc kubenswrapper[4878]: I1202 18:27:28.787858 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/abd35f96-61cf-48b3-b66d-c3d54414bb74-bundle\") pod 
\"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r\" (UID: \"abd35f96-61cf-48b3-b66d-c3d54414bb74\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r" Dec 02 18:27:28 crc kubenswrapper[4878]: I1202 18:27:28.788073 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/abd35f96-61cf-48b3-b66d-c3d54414bb74-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r\" (UID: \"abd35f96-61cf-48b3-b66d-c3d54414bb74\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r" Dec 02 18:27:28 crc kubenswrapper[4878]: I1202 18:27:28.890225 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/abd35f96-61cf-48b3-b66d-c3d54414bb74-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r\" (UID: \"abd35f96-61cf-48b3-b66d-c3d54414bb74\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r" Dec 02 18:27:28 crc kubenswrapper[4878]: I1202 18:27:28.890330 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slngw\" (UniqueName: \"kubernetes.io/projected/abd35f96-61cf-48b3-b66d-c3d54414bb74-kube-api-access-slngw\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r\" (UID: \"abd35f96-61cf-48b3-b66d-c3d54414bb74\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r" Dec 02 18:27:28 crc kubenswrapper[4878]: I1202 18:27:28.890358 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/abd35f96-61cf-48b3-b66d-c3d54414bb74-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r\" (UID: \"abd35f96-61cf-48b3-b66d-c3d54414bb74\") " 
pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r" Dec 02 18:27:28 crc kubenswrapper[4878]: I1202 18:27:28.891115 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/abd35f96-61cf-48b3-b66d-c3d54414bb74-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r\" (UID: \"abd35f96-61cf-48b3-b66d-c3d54414bb74\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r" Dec 02 18:27:28 crc kubenswrapper[4878]: I1202 18:27:28.891338 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/abd35f96-61cf-48b3-b66d-c3d54414bb74-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r\" (UID: \"abd35f96-61cf-48b3-b66d-c3d54414bb74\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r" Dec 02 18:27:28 crc kubenswrapper[4878]: I1202 18:27:28.911907 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn"] Dec 02 18:27:28 crc kubenswrapper[4878]: I1202 18:27:28.913406 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn" Dec 02 18:27:28 crc kubenswrapper[4878]: I1202 18:27:28.942036 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slngw\" (UniqueName: \"kubernetes.io/projected/abd35f96-61cf-48b3-b66d-c3d54414bb74-kube-api-access-slngw\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r\" (UID: \"abd35f96-61cf-48b3-b66d-c3d54414bb74\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r" Dec 02 18:27:28 crc kubenswrapper[4878]: I1202 18:27:28.955745 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn"] Dec 02 18:27:28 crc kubenswrapper[4878]: I1202 18:27:28.991705 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf6b244f-0cc7-4f5d-9522-9fe4e65897a6-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn\" (UID: \"bf6b244f-0cc7-4f5d-9522-9fe4e65897a6\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn" Dec 02 18:27:28 crc kubenswrapper[4878]: I1202 18:27:28.991832 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf6b244f-0cc7-4f5d-9522-9fe4e65897a6-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn\" (UID: \"bf6b244f-0cc7-4f5d-9522-9fe4e65897a6\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn" Dec 02 18:27:28 crc kubenswrapper[4878]: I1202 18:27:28.991878 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4vr8\" (UniqueName: 
\"kubernetes.io/projected/bf6b244f-0cc7-4f5d-9522-9fe4e65897a6-kube-api-access-c4vr8\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn\" (UID: \"bf6b244f-0cc7-4f5d-9522-9fe4e65897a6\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn" Dec 02 18:27:29 crc kubenswrapper[4878]: I1202 18:27:29.053902 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r" Dec 02 18:27:29 crc kubenswrapper[4878]: I1202 18:27:29.094134 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf6b244f-0cc7-4f5d-9522-9fe4e65897a6-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn\" (UID: \"bf6b244f-0cc7-4f5d-9522-9fe4e65897a6\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn" Dec 02 18:27:29 crc kubenswrapper[4878]: I1202 18:27:29.094633 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf6b244f-0cc7-4f5d-9522-9fe4e65897a6-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn\" (UID: \"bf6b244f-0cc7-4f5d-9522-9fe4e65897a6\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn" Dec 02 18:27:29 crc kubenswrapper[4878]: I1202 18:27:29.094670 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4vr8\" (UniqueName: \"kubernetes.io/projected/bf6b244f-0cc7-4f5d-9522-9fe4e65897a6-kube-api-access-c4vr8\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn\" (UID: \"bf6b244f-0cc7-4f5d-9522-9fe4e65897a6\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn" Dec 02 18:27:29 crc kubenswrapper[4878]: I1202 18:27:29.094864 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf6b244f-0cc7-4f5d-9522-9fe4e65897a6-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn\" (UID: \"bf6b244f-0cc7-4f5d-9522-9fe4e65897a6\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn" Dec 02 18:27:29 crc kubenswrapper[4878]: I1202 18:27:29.095268 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf6b244f-0cc7-4f5d-9522-9fe4e65897a6-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn\" (UID: \"bf6b244f-0cc7-4f5d-9522-9fe4e65897a6\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn" Dec 02 18:27:29 crc kubenswrapper[4878]: I1202 18:27:29.123197 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4vr8\" (UniqueName: \"kubernetes.io/projected/bf6b244f-0cc7-4f5d-9522-9fe4e65897a6-kube-api-access-c4vr8\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn\" (UID: \"bf6b244f-0cc7-4f5d-9522-9fe4e65897a6\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn" Dec 02 18:27:29 crc kubenswrapper[4878]: I1202 18:27:29.275336 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn" Dec 02 18:27:29 crc kubenswrapper[4878]: I1202 18:27:29.327135 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r"] Dec 02 18:27:29 crc kubenswrapper[4878]: I1202 18:27:29.633060 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r" event={"ID":"abd35f96-61cf-48b3-b66d-c3d54414bb74","Type":"ContainerStarted","Data":"c515050db2380e14a726d5dac87513e9a0538569a53cb441625e7b90b61c6e59"} Dec 02 18:27:29 crc kubenswrapper[4878]: I1202 18:27:29.649108 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn"] Dec 02 18:27:29 crc kubenswrapper[4878]: W1202 18:27:29.652889 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf6b244f_0cc7_4f5d_9522_9fe4e65897a6.slice/crio-b41ae87a771718f5bdb3bb2ac944a4dfc8a79190b962fd1aed4a99071af003e4 WatchSource:0}: Error finding container b41ae87a771718f5bdb3bb2ac944a4dfc8a79190b962fd1aed4a99071af003e4: Status 404 returned error can't find the container with id b41ae87a771718f5bdb3bb2ac944a4dfc8a79190b962fd1aed4a99071af003e4 Dec 02 18:27:30 crc kubenswrapper[4878]: I1202 18:27:30.643122 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn" event={"ID":"bf6b244f-0cc7-4f5d-9522-9fe4e65897a6","Type":"ContainerStarted","Data":"b41ae87a771718f5bdb3bb2ac944a4dfc8a79190b962fd1aed4a99071af003e4"} Dec 02 18:27:31 crc kubenswrapper[4878]: I1202 18:27:31.724253 4878 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 
02 18:27:32 crc kubenswrapper[4878]: I1202 18:27:32.470660 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bc5gf"]
Dec 02 18:27:32 crc kubenswrapper[4878]: I1202 18:27:32.472835 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bc5gf"
Dec 02 18:27:32 crc kubenswrapper[4878]: I1202 18:27:32.486755 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bc5gf"]
Dec 02 18:27:32 crc kubenswrapper[4878]: I1202 18:27:32.553963 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20d9a3f7-be6d-4355-ae23-38ff91ff1fe4-utilities\") pod \"redhat-operators-bc5gf\" (UID: \"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4\") " pod="openshift-marketplace/redhat-operators-bc5gf"
Dec 02 18:27:32 crc kubenswrapper[4878]: I1202 18:27:32.554186 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j894\" (UniqueName: \"kubernetes.io/projected/20d9a3f7-be6d-4355-ae23-38ff91ff1fe4-kube-api-access-6j894\") pod \"redhat-operators-bc5gf\" (UID: \"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4\") " pod="openshift-marketplace/redhat-operators-bc5gf"
Dec 02 18:27:32 crc kubenswrapper[4878]: I1202 18:27:32.554382 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20d9a3f7-be6d-4355-ae23-38ff91ff1fe4-catalog-content\") pod \"redhat-operators-bc5gf\" (UID: \"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4\") " pod="openshift-marketplace/redhat-operators-bc5gf"
Dec 02 18:27:32 crc kubenswrapper[4878]: I1202 18:27:32.655047 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20d9a3f7-be6d-4355-ae23-38ff91ff1fe4-utilities\") pod \"redhat-operators-bc5gf\" (UID: \"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4\") " pod="openshift-marketplace/redhat-operators-bc5gf"
Dec 02 18:27:32 crc kubenswrapper[4878]: I1202 18:27:32.655141 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j894\" (UniqueName: \"kubernetes.io/projected/20d9a3f7-be6d-4355-ae23-38ff91ff1fe4-kube-api-access-6j894\") pod \"redhat-operators-bc5gf\" (UID: \"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4\") " pod="openshift-marketplace/redhat-operators-bc5gf"
Dec 02 18:27:32 crc kubenswrapper[4878]: I1202 18:27:32.655180 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20d9a3f7-be6d-4355-ae23-38ff91ff1fe4-catalog-content\") pod \"redhat-operators-bc5gf\" (UID: \"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4\") " pod="openshift-marketplace/redhat-operators-bc5gf"
Dec 02 18:27:32 crc kubenswrapper[4878]: I1202 18:27:32.655970 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20d9a3f7-be6d-4355-ae23-38ff91ff1fe4-catalog-content\") pod \"redhat-operators-bc5gf\" (UID: \"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4\") " pod="openshift-marketplace/redhat-operators-bc5gf"
Dec 02 18:27:32 crc kubenswrapper[4878]: I1202 18:27:32.655966 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20d9a3f7-be6d-4355-ae23-38ff91ff1fe4-utilities\") pod \"redhat-operators-bc5gf\" (UID: \"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4\") " pod="openshift-marketplace/redhat-operators-bc5gf"
Dec 02 18:27:32 crc kubenswrapper[4878]: I1202 18:27:32.658435 4878 generic.go:334] "Generic (PLEG): container finished" podID="bf6b244f-0cc7-4f5d-9522-9fe4e65897a6" containerID="9a816ccfdc68fd4dec53eff5ed976120c11c885ee1d3fd419d919e9975ec9f36" exitCode=0
Dec 02 18:27:32 crc kubenswrapper[4878]: I1202 18:27:32.658674 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn" event={"ID":"bf6b244f-0cc7-4f5d-9522-9fe4e65897a6","Type":"ContainerDied","Data":"9a816ccfdc68fd4dec53eff5ed976120c11c885ee1d3fd419d919e9975ec9f36"}
Dec 02 18:27:32 crc kubenswrapper[4878]: I1202 18:27:32.660742 4878 generic.go:334] "Generic (PLEG): container finished" podID="abd35f96-61cf-48b3-b66d-c3d54414bb74" containerID="c5056d891f675ac5920eb2fc2d69add4ead01158cd19d3de4e2ae075a744a749" exitCode=0
Dec 02 18:27:32 crc kubenswrapper[4878]: I1202 18:27:32.660943 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r" event={"ID":"abd35f96-61cf-48b3-b66d-c3d54414bb74","Type":"ContainerDied","Data":"c5056d891f675ac5920eb2fc2d69add4ead01158cd19d3de4e2ae075a744a749"}
Dec 02 18:27:32 crc kubenswrapper[4878]: I1202 18:27:32.707250 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j894\" (UniqueName: \"kubernetes.io/projected/20d9a3f7-be6d-4355-ae23-38ff91ff1fe4-kube-api-access-6j894\") pod \"redhat-operators-bc5gf\" (UID: \"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4\") " pod="openshift-marketplace/redhat-operators-bc5gf"
Dec 02 18:27:32 crc kubenswrapper[4878]: I1202 18:27:32.792105 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bc5gf"
Dec 02 18:27:33 crc kubenswrapper[4878]: I1202 18:27:33.061776 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bc5gf"]
Dec 02 18:27:33 crc kubenswrapper[4878]: I1202 18:27:33.669391 4878 generic.go:334] "Generic (PLEG): container finished" podID="20d9a3f7-be6d-4355-ae23-38ff91ff1fe4" containerID="2202862d50c907b70c29f08e232c4f0ff49f41681db57da8cef79322e958d63c" exitCode=0
Dec 02 18:27:33 crc kubenswrapper[4878]: I1202 18:27:33.669453 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bc5gf" event={"ID":"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4","Type":"ContainerDied","Data":"2202862d50c907b70c29f08e232c4f0ff49f41681db57da8cef79322e958d63c"}
Dec 02 18:27:33 crc kubenswrapper[4878]: I1202 18:27:33.669495 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bc5gf" event={"ID":"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4","Type":"ContainerStarted","Data":"b5ff08a714819aa1db9fc5b2ccb124f3e4b92768af849a833d2c30086f099d48"}
Dec 02 18:27:36 crc kubenswrapper[4878]: I1202 18:27:36.692476 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bc5gf" event={"ID":"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4","Type":"ContainerStarted","Data":"0d64a0358f392806020da31537ee25c71c9f30757896fa77487188a87d22acfd"}
Dec 02 18:27:36 crc kubenswrapper[4878]: I1202 18:27:36.694109 4878 generic.go:334] "Generic (PLEG): container finished" podID="bf6b244f-0cc7-4f5d-9522-9fe4e65897a6" containerID="a39585a213da462df4b378b4732be4d77bd5c542831f06124592e5fdd8d1c3bc" exitCode=0
Dec 02 18:27:36 crc kubenswrapper[4878]: I1202 18:27:36.694168 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn" event={"ID":"bf6b244f-0cc7-4f5d-9522-9fe4e65897a6","Type":"ContainerDied","Data":"a39585a213da462df4b378b4732be4d77bd5c542831f06124592e5fdd8d1c3bc"}
Dec 02 18:27:36 crc kubenswrapper[4878]: I1202 18:27:36.695377 4878 generic.go:334] "Generic (PLEG): container finished" podID="abd35f96-61cf-48b3-b66d-c3d54414bb74" containerID="e169f8f09b394affc973e179abf9e77e345c4838d84f5becb3990a57b71a8f07" exitCode=0
Dec 02 18:27:36 crc kubenswrapper[4878]: I1202 18:27:36.695409 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r" event={"ID":"abd35f96-61cf-48b3-b66d-c3d54414bb74","Type":"ContainerDied","Data":"e169f8f09b394affc973e179abf9e77e345c4838d84f5becb3990a57b71a8f07"}
Dec 02 18:27:37 crc kubenswrapper[4878]: I1202 18:27:37.708144 4878 generic.go:334] "Generic (PLEG): container finished" podID="20d9a3f7-be6d-4355-ae23-38ff91ff1fe4" containerID="0d64a0358f392806020da31537ee25c71c9f30757896fa77487188a87d22acfd" exitCode=0
Dec 02 18:27:37 crc kubenswrapper[4878]: I1202 18:27:37.708308 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bc5gf" event={"ID":"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4","Type":"ContainerDied","Data":"0d64a0358f392806020da31537ee25c71c9f30757896fa77487188a87d22acfd"}
Dec 02 18:27:38 crc kubenswrapper[4878]: I1202 18:27:38.892700 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p7jtp"]
Dec 02 18:27:38 crc kubenswrapper[4878]: I1202 18:27:38.895468 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p7jtp"
Dec 02 18:27:38 crc kubenswrapper[4878]: I1202 18:27:38.896041 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p7jtp"]
Dec 02 18:27:39 crc kubenswrapper[4878]: I1202 18:27:39.083762 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/798768bf-91b5-42ea-a0c4-20feb23686a6-utilities\") pod \"certified-operators-p7jtp\" (UID: \"798768bf-91b5-42ea-a0c4-20feb23686a6\") " pod="openshift-marketplace/certified-operators-p7jtp"
Dec 02 18:27:39 crc kubenswrapper[4878]: I1202 18:27:39.083875 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcffl\" (UniqueName: \"kubernetes.io/projected/798768bf-91b5-42ea-a0c4-20feb23686a6-kube-api-access-kcffl\") pod \"certified-operators-p7jtp\" (UID: \"798768bf-91b5-42ea-a0c4-20feb23686a6\") " pod="openshift-marketplace/certified-operators-p7jtp"
Dec 02 18:27:39 crc kubenswrapper[4878]: I1202 18:27:39.083966 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/798768bf-91b5-42ea-a0c4-20feb23686a6-catalog-content\") pod \"certified-operators-p7jtp\" (UID: \"798768bf-91b5-42ea-a0c4-20feb23686a6\") " pod="openshift-marketplace/certified-operators-p7jtp"
Dec 02 18:27:39 crc kubenswrapper[4878]: I1202 18:27:39.185578 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/798768bf-91b5-42ea-a0c4-20feb23686a6-catalog-content\") pod \"certified-operators-p7jtp\" (UID: \"798768bf-91b5-42ea-a0c4-20feb23686a6\") " pod="openshift-marketplace/certified-operators-p7jtp"
Dec 02 18:27:39 crc kubenswrapper[4878]: I1202 18:27:39.185690 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/798768bf-91b5-42ea-a0c4-20feb23686a6-utilities\") pod \"certified-operators-p7jtp\" (UID: \"798768bf-91b5-42ea-a0c4-20feb23686a6\") " pod="openshift-marketplace/certified-operators-p7jtp"
Dec 02 18:27:39 crc kubenswrapper[4878]: I1202 18:27:39.185771 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcffl\" (UniqueName: \"kubernetes.io/projected/798768bf-91b5-42ea-a0c4-20feb23686a6-kube-api-access-kcffl\") pod \"certified-operators-p7jtp\" (UID: \"798768bf-91b5-42ea-a0c4-20feb23686a6\") " pod="openshift-marketplace/certified-operators-p7jtp"
Dec 02 18:27:39 crc kubenswrapper[4878]: I1202 18:27:39.186215 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/798768bf-91b5-42ea-a0c4-20feb23686a6-utilities\") pod \"certified-operators-p7jtp\" (UID: \"798768bf-91b5-42ea-a0c4-20feb23686a6\") " pod="openshift-marketplace/certified-operators-p7jtp"
Dec 02 18:27:39 crc kubenswrapper[4878]: I1202 18:27:39.186614 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/798768bf-91b5-42ea-a0c4-20feb23686a6-catalog-content\") pod \"certified-operators-p7jtp\" (UID: \"798768bf-91b5-42ea-a0c4-20feb23686a6\") " pod="openshift-marketplace/certified-operators-p7jtp"
Dec 02 18:27:39 crc kubenswrapper[4878]: I1202 18:27:39.215057 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcffl\" (UniqueName: \"kubernetes.io/projected/798768bf-91b5-42ea-a0c4-20feb23686a6-kube-api-access-kcffl\") pod \"certified-operators-p7jtp\" (UID: \"798768bf-91b5-42ea-a0c4-20feb23686a6\") " pod="openshift-marketplace/certified-operators-p7jtp"
Dec 02 18:27:39 crc kubenswrapper[4878]: I1202 18:27:39.232282 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p7jtp"
Dec 02 18:27:39 crc kubenswrapper[4878]: I1202 18:27:39.734099 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bc5gf" event={"ID":"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4","Type":"ContainerStarted","Data":"a149e4fc002cc3c09bb325a334945846e812eb7be65ce92cceaf0c4feb17eb51"}
Dec 02 18:27:39 crc kubenswrapper[4878]: I1202 18:27:39.737081 4878 generic.go:334] "Generic (PLEG): container finished" podID="bf6b244f-0cc7-4f5d-9522-9fe4e65897a6" containerID="c896878a499b2121dcc4f5e287deb1bf80c2253500fb83f5905f30b692d061b0" exitCode=0
Dec 02 18:27:39 crc kubenswrapper[4878]: I1202 18:27:39.737178 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn" event={"ID":"bf6b244f-0cc7-4f5d-9522-9fe4e65897a6","Type":"ContainerDied","Data":"c896878a499b2121dcc4f5e287deb1bf80c2253500fb83f5905f30b692d061b0"}
Dec 02 18:27:39 crc kubenswrapper[4878]: I1202 18:27:39.739975 4878 generic.go:334] "Generic (PLEG): container finished" podID="abd35f96-61cf-48b3-b66d-c3d54414bb74" containerID="c1bc27839b181f4dcdfb63678f472f638615e77353a5ed05bb97ef26144b45af" exitCode=0
Dec 02 18:27:39 crc kubenswrapper[4878]: I1202 18:27:39.740016 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r" event={"ID":"abd35f96-61cf-48b3-b66d-c3d54414bb74","Type":"ContainerDied","Data":"c1bc27839b181f4dcdfb63678f472f638615e77353a5ed05bb97ef26144b45af"}
Dec 02 18:27:39 crc kubenswrapper[4878]: I1202 18:27:39.764629 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bc5gf" podStartSLOduration=2.3755250549999998 podStartE2EDuration="7.764603825s" podCreationTimestamp="2025-12-02 18:27:32 +0000 UTC" firstStartedPulling="2025-12-02 18:27:33.671905812 +0000 UTC m=+763.361524693" lastFinishedPulling="2025-12-02 18:27:39.060984582 +0000 UTC m=+768.750603463" observedRunningTime="2025-12-02 18:27:39.758990259 +0000 UTC m=+769.448609160" watchObservedRunningTime="2025-12-02 18:27:39.764603825 +0000 UTC m=+769.454222706"
Dec 02 18:27:39 crc kubenswrapper[4878]: I1202 18:27:39.783151 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p7jtp"]
Dec 02 18:27:40 crc kubenswrapper[4878]: I1202 18:27:40.751206 4878 generic.go:334] "Generic (PLEG): container finished" podID="798768bf-91b5-42ea-a0c4-20feb23686a6" containerID="31e053d300808405f923ae4c9b6544d6beabc18d38f66617e0eadea5a05d54e4" exitCode=0
Dec 02 18:27:40 crc kubenswrapper[4878]: I1202 18:27:40.751290 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7jtp" event={"ID":"798768bf-91b5-42ea-a0c4-20feb23686a6","Type":"ContainerDied","Data":"31e053d300808405f923ae4c9b6544d6beabc18d38f66617e0eadea5a05d54e4"}
Dec 02 18:27:40 crc kubenswrapper[4878]: I1202 18:27:40.751954 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7jtp" event={"ID":"798768bf-91b5-42ea-a0c4-20feb23686a6","Type":"ContainerStarted","Data":"6237712bf4d76906c67d1d0784c872dad199db30007bf20f63e5fd80d42c073a"}
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.037683 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r"
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.122671 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn"
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.218219 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slngw\" (UniqueName: \"kubernetes.io/projected/abd35f96-61cf-48b3-b66d-c3d54414bb74-kube-api-access-slngw\") pod \"abd35f96-61cf-48b3-b66d-c3d54414bb74\" (UID: \"abd35f96-61cf-48b3-b66d-c3d54414bb74\") "
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.218833 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/abd35f96-61cf-48b3-b66d-c3d54414bb74-util\") pod \"abd35f96-61cf-48b3-b66d-c3d54414bb74\" (UID: \"abd35f96-61cf-48b3-b66d-c3d54414bb74\") "
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.219005 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf6b244f-0cc7-4f5d-9522-9fe4e65897a6-util\") pod \"bf6b244f-0cc7-4f5d-9522-9fe4e65897a6\" (UID: \"bf6b244f-0cc7-4f5d-9522-9fe4e65897a6\") "
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.219201 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/abd35f96-61cf-48b3-b66d-c3d54414bb74-bundle\") pod \"abd35f96-61cf-48b3-b66d-c3d54414bb74\" (UID: \"abd35f96-61cf-48b3-b66d-c3d54414bb74\") "
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.219958 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abd35f96-61cf-48b3-b66d-c3d54414bb74-bundle" (OuterVolumeSpecName: "bundle") pod "abd35f96-61cf-48b3-b66d-c3d54414bb74" (UID: "abd35f96-61cf-48b3-b66d-c3d54414bb74"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.224269 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abd35f96-61cf-48b3-b66d-c3d54414bb74-kube-api-access-slngw" (OuterVolumeSpecName: "kube-api-access-slngw") pod "abd35f96-61cf-48b3-b66d-c3d54414bb74" (UID: "abd35f96-61cf-48b3-b66d-c3d54414bb74"). InnerVolumeSpecName "kube-api-access-slngw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.232275 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf6b244f-0cc7-4f5d-9522-9fe4e65897a6-util" (OuterVolumeSpecName: "util") pod "bf6b244f-0cc7-4f5d-9522-9fe4e65897a6" (UID: "bf6b244f-0cc7-4f5d-9522-9fe4e65897a6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.236248 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abd35f96-61cf-48b3-b66d-c3d54414bb74-util" (OuterVolumeSpecName: "util") pod "abd35f96-61cf-48b3-b66d-c3d54414bb74" (UID: "abd35f96-61cf-48b3-b66d-c3d54414bb74"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.321770 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf6b244f-0cc7-4f5d-9522-9fe4e65897a6-bundle\") pod \"bf6b244f-0cc7-4f5d-9522-9fe4e65897a6\" (UID: \"bf6b244f-0cc7-4f5d-9522-9fe4e65897a6\") "
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.322327 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4vr8\" (UniqueName: \"kubernetes.io/projected/bf6b244f-0cc7-4f5d-9522-9fe4e65897a6-kube-api-access-c4vr8\") pod \"bf6b244f-0cc7-4f5d-9522-9fe4e65897a6\" (UID: \"bf6b244f-0cc7-4f5d-9522-9fe4e65897a6\") "
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.326913 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf6b244f-0cc7-4f5d-9522-9fe4e65897a6-kube-api-access-c4vr8" (OuterVolumeSpecName: "kube-api-access-c4vr8") pod "bf6b244f-0cc7-4f5d-9522-9fe4e65897a6" (UID: "bf6b244f-0cc7-4f5d-9522-9fe4e65897a6"). InnerVolumeSpecName "kube-api-access-c4vr8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.327058 4878 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/abd35f96-61cf-48b3-b66d-c3d54414bb74-util\") on node \"crc\" DevicePath \"\""
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.327090 4878 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf6b244f-0cc7-4f5d-9522-9fe4e65897a6-util\") on node \"crc\" DevicePath \"\""
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.327103 4878 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/abd35f96-61cf-48b3-b66d-c3d54414bb74-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.327122 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slngw\" (UniqueName: \"kubernetes.io/projected/abd35f96-61cf-48b3-b66d-c3d54414bb74-kube-api-access-slngw\") on node \"crc\" DevicePath \"\""
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.327699 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf6b244f-0cc7-4f5d-9522-9fe4e65897a6-bundle" (OuterVolumeSpecName: "bundle") pod "bf6b244f-0cc7-4f5d-9522-9fe4e65897a6" (UID: "bf6b244f-0cc7-4f5d-9522-9fe4e65897a6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.429422 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4vr8\" (UniqueName: \"kubernetes.io/projected/bf6b244f-0cc7-4f5d-9522-9fe4e65897a6-kube-api-access-c4vr8\") on node \"crc\" DevicePath \"\""
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.429755 4878 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf6b244f-0cc7-4f5d-9522-9fe4e65897a6-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.762295 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn" event={"ID":"bf6b244f-0cc7-4f5d-9522-9fe4e65897a6","Type":"ContainerDied","Data":"b41ae87a771718f5bdb3bb2ac944a4dfc8a79190b962fd1aed4a99071af003e4"}
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.762351 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn"
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.762360 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b41ae87a771718f5bdb3bb2ac944a4dfc8a79190b962fd1aed4a99071af003e4"
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.764495 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r" event={"ID":"abd35f96-61cf-48b3-b66d-c3d54414bb74","Type":"ContainerDied","Data":"c515050db2380e14a726d5dac87513e9a0538569a53cb441625e7b90b61c6e59"}
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.764526 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c515050db2380e14a726d5dac87513e9a0538569a53cb441625e7b90b61c6e59"
Dec 02 18:27:41 crc kubenswrapper[4878]: I1202 18:27:41.764587 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r"
Dec 02 18:27:42 crc kubenswrapper[4878]: I1202 18:27:42.792349 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bc5gf"
Dec 02 18:27:42 crc kubenswrapper[4878]: I1202 18:27:42.792718 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bc5gf"
Dec 02 18:27:43 crc kubenswrapper[4878]: I1202 18:27:43.862136 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bc5gf" podUID="20d9a3f7-be6d-4355-ae23-38ff91ff1fe4" containerName="registry-server" probeResult="failure" output=<
Dec 02 18:27:43 crc kubenswrapper[4878]: timeout: failed to connect service ":50051" within 1s
Dec 02 18:27:43 crc kubenswrapper[4878]: >
Dec 02 18:27:44 crc kubenswrapper[4878]: I1202 18:27:44.789779 4878 generic.go:334] "Generic (PLEG): container finished" podID="798768bf-91b5-42ea-a0c4-20feb23686a6" containerID="d2bd70591313ee2f1b8e2cd275ef438ec3c63c07125455fcfaea6b07d6fe1f02" exitCode=0
Dec 02 18:27:44 crc kubenswrapper[4878]: I1202 18:27:44.789848 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7jtp" event={"ID":"798768bf-91b5-42ea-a0c4-20feb23686a6","Type":"ContainerDied","Data":"d2bd70591313ee2f1b8e2cd275ef438ec3c63c07125455fcfaea6b07d6fe1f02"}
Dec 02 18:27:45 crc kubenswrapper[4878]: I1202 18:27:45.801935 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7jtp" event={"ID":"798768bf-91b5-42ea-a0c4-20feb23686a6","Type":"ContainerStarted","Data":"85a01b79ffa5165d59eda1e20f6023241e3a63208034771a934f9783b1809613"}
Dec 02 18:27:45 crc kubenswrapper[4878]: I1202 18:27:45.835988 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p7jtp" podStartSLOduration=3.350736958 podStartE2EDuration="7.835961949s" podCreationTimestamp="2025-12-02 18:27:38 +0000 UTC" firstStartedPulling="2025-12-02 18:27:40.75682812 +0000 UTC m=+770.446447011" lastFinishedPulling="2025-12-02 18:27:45.242053121 +0000 UTC m=+774.931672002" observedRunningTime="2025-12-02 18:27:45.830694345 +0000 UTC m=+775.520313226" watchObservedRunningTime="2025-12-02 18:27:45.835961949 +0000 UTC m=+775.525580830"
Dec 02 18:27:49 crc kubenswrapper[4878]: I1202 18:27:49.232665 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p7jtp"
Dec 02 18:27:49 crc kubenswrapper[4878]: I1202 18:27:49.233127 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p7jtp"
Dec 02 18:27:49 crc kubenswrapper[4878]: I1202 18:27:49.277932 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p7jtp"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.534277 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m"]
Dec 02 18:27:51 crc kubenswrapper[4878]: E1202 18:27:51.535112 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abd35f96-61cf-48b3-b66d-c3d54414bb74" containerName="extract"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.535134 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="abd35f96-61cf-48b3-b66d-c3d54414bb74" containerName="extract"
Dec 02 18:27:51 crc kubenswrapper[4878]: E1202 18:27:51.535158 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abd35f96-61cf-48b3-b66d-c3d54414bb74" containerName="util"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.535166 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="abd35f96-61cf-48b3-b66d-c3d54414bb74" containerName="util"
Dec 02 18:27:51 crc kubenswrapper[4878]: E1202 18:27:51.535186 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf6b244f-0cc7-4f5d-9522-9fe4e65897a6" containerName="util"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.535195 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf6b244f-0cc7-4f5d-9522-9fe4e65897a6" containerName="util"
Dec 02 18:27:51 crc kubenswrapper[4878]: E1202 18:27:51.535216 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf6b244f-0cc7-4f5d-9522-9fe4e65897a6" containerName="extract"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.535227 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf6b244f-0cc7-4f5d-9522-9fe4e65897a6" containerName="extract"
Dec 02 18:27:51 crc kubenswrapper[4878]: E1202 18:27:51.535264 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abd35f96-61cf-48b3-b66d-c3d54414bb74" containerName="pull"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.535272 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="abd35f96-61cf-48b3-b66d-c3d54414bb74" containerName="pull"
Dec 02 18:27:51 crc kubenswrapper[4878]: E1202 18:27:51.535287 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf6b244f-0cc7-4f5d-9522-9fe4e65897a6" containerName="pull"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.535293 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf6b244f-0cc7-4f5d-9522-9fe4e65897a6" containerName="pull"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.535430 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf6b244f-0cc7-4f5d-9522-9fe4e65897a6" containerName="extract"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.535447 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="abd35f96-61cf-48b3-b66d-c3d54414bb74" containerName="extract"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.536449 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.544209 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.544925 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.545075 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.545118 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-hj5k6"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.545125 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.545271 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.547952 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m"]
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.619000 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6ace3da2-70e9-4d80-a8ad-5a8e1bb062df-manager-config\") pod \"loki-operator-controller-manager-846f878689-bhh7m\" (UID: \"6ace3da2-70e9-4d80-a8ad-5a8e1bb062df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.619085 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6ace3da2-70e9-4d80-a8ad-5a8e1bb062df-apiservice-cert\") pod \"loki-operator-controller-manager-846f878689-bhh7m\" (UID: \"6ace3da2-70e9-4d80-a8ad-5a8e1bb062df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.619228 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ace3da2-70e9-4d80-a8ad-5a8e1bb062df-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-846f878689-bhh7m\" (UID: \"6ace3da2-70e9-4d80-a8ad-5a8e1bb062df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.619442 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45j24\" (UniqueName: \"kubernetes.io/projected/6ace3da2-70e9-4d80-a8ad-5a8e1bb062df-kube-api-access-45j24\") pod \"loki-operator-controller-manager-846f878689-bhh7m\" (UID: \"6ace3da2-70e9-4d80-a8ad-5a8e1bb062df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.619485 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6ace3da2-70e9-4d80-a8ad-5a8e1bb062df-webhook-cert\") pod \"loki-operator-controller-manager-846f878689-bhh7m\" (UID: \"6ace3da2-70e9-4d80-a8ad-5a8e1bb062df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.720981 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45j24\" (UniqueName: \"kubernetes.io/projected/6ace3da2-70e9-4d80-a8ad-5a8e1bb062df-kube-api-access-45j24\") pod \"loki-operator-controller-manager-846f878689-bhh7m\" (UID: \"6ace3da2-70e9-4d80-a8ad-5a8e1bb062df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.721047 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6ace3da2-70e9-4d80-a8ad-5a8e1bb062df-webhook-cert\") pod \"loki-operator-controller-manager-846f878689-bhh7m\" (UID: \"6ace3da2-70e9-4d80-a8ad-5a8e1bb062df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.721101 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6ace3da2-70e9-4d80-a8ad-5a8e1bb062df-manager-config\") pod \"loki-operator-controller-manager-846f878689-bhh7m\" (UID: \"6ace3da2-70e9-4d80-a8ad-5a8e1bb062df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.721143 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6ace3da2-70e9-4d80-a8ad-5a8e1bb062df-apiservice-cert\") pod \"loki-operator-controller-manager-846f878689-bhh7m\" (UID: \"6ace3da2-70e9-4d80-a8ad-5a8e1bb062df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.721413 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ace3da2-70e9-4d80-a8ad-5a8e1bb062df-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-846f878689-bhh7m\" (UID: \"6ace3da2-70e9-4d80-a8ad-5a8e1bb062df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.722870 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6ace3da2-70e9-4d80-a8ad-5a8e1bb062df-manager-config\") pod \"loki-operator-controller-manager-846f878689-bhh7m\" (UID: \"6ace3da2-70e9-4d80-a8ad-5a8e1bb062df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.731340 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6ace3da2-70e9-4d80-a8ad-5a8e1bb062df-apiservice-cert\") pod \"loki-operator-controller-manager-846f878689-bhh7m\" (UID: \"6ace3da2-70e9-4d80-a8ad-5a8e1bb062df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.732424 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6ace3da2-70e9-4d80-a8ad-5a8e1bb062df-webhook-cert\") pod \"loki-operator-controller-manager-846f878689-bhh7m\" (UID: \"6ace3da2-70e9-4d80-a8ad-5a8e1bb062df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.734917 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ace3da2-70e9-4d80-a8ad-5a8e1bb062df-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-846f878689-bhh7m\" (UID: \"6ace3da2-70e9-4d80-a8ad-5a8e1bb062df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m"
Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.741502 4878 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-45j24\" (UniqueName: \"kubernetes.io/projected/6ace3da2-70e9-4d80-a8ad-5a8e1bb062df-kube-api-access-45j24\") pod \"loki-operator-controller-manager-846f878689-bhh7m\" (UID: \"6ace3da2-70e9-4d80-a8ad-5a8e1bb062df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m" Dec 02 18:27:51 crc kubenswrapper[4878]: I1202 18:27:51.857582 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m" Dec 02 18:27:52 crc kubenswrapper[4878]: I1202 18:27:52.390125 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m"] Dec 02 18:27:52 crc kubenswrapper[4878]: W1202 18:27:52.394609 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ace3da2_70e9_4d80_a8ad_5a8e1bb062df.slice/crio-1491835b84a1e5bad45c01336792408ddc274f4c016e2741014682a58adfe4bc WatchSource:0}: Error finding container 1491835b84a1e5bad45c01336792408ddc274f4c016e2741014682a58adfe4bc: Status 404 returned error can't find the container with id 1491835b84a1e5bad45c01336792408ddc274f4c016e2741014682a58adfe4bc Dec 02 18:27:52 crc kubenswrapper[4878]: I1202 18:27:52.881104 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m" event={"ID":"6ace3da2-70e9-4d80-a8ad-5a8e1bb062df","Type":"ContainerStarted","Data":"1491835b84a1e5bad45c01336792408ddc274f4c016e2741014682a58adfe4bc"} Dec 02 18:27:52 crc kubenswrapper[4878]: I1202 18:27:52.894535 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bc5gf" Dec 02 18:27:52 crc kubenswrapper[4878]: I1202 18:27:52.953090 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-bc5gf" Dec 02 18:27:53 crc kubenswrapper[4878]: I1202 18:27:53.034950 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-wpfmg"] Dec 02 18:27:53 crc kubenswrapper[4878]: I1202 18:27:53.036357 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-wpfmg" Dec 02 18:27:53 crc kubenswrapper[4878]: I1202 18:27:53.039792 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-z58jf" Dec 02 18:27:53 crc kubenswrapper[4878]: I1202 18:27:53.040137 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Dec 02 18:27:53 crc kubenswrapper[4878]: I1202 18:27:53.040396 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Dec 02 18:27:53 crc kubenswrapper[4878]: I1202 18:27:53.064569 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-wpfmg"] Dec 02 18:27:53 crc kubenswrapper[4878]: I1202 18:27:53.145720 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xckgk\" (UniqueName: \"kubernetes.io/projected/056447cf-55f2-4ddd-bf36-7f0f637f5ca5-kube-api-access-xckgk\") pod \"cluster-logging-operator-ff9846bd-wpfmg\" (UID: \"056447cf-55f2-4ddd-bf36-7f0f637f5ca5\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-wpfmg" Dec 02 18:27:53 crc kubenswrapper[4878]: I1202 18:27:53.248272 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xckgk\" (UniqueName: \"kubernetes.io/projected/056447cf-55f2-4ddd-bf36-7f0f637f5ca5-kube-api-access-xckgk\") pod \"cluster-logging-operator-ff9846bd-wpfmg\" (UID: \"056447cf-55f2-4ddd-bf36-7f0f637f5ca5\") " 
pod="openshift-logging/cluster-logging-operator-ff9846bd-wpfmg" Dec 02 18:27:53 crc kubenswrapper[4878]: I1202 18:27:53.283641 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xckgk\" (UniqueName: \"kubernetes.io/projected/056447cf-55f2-4ddd-bf36-7f0f637f5ca5-kube-api-access-xckgk\") pod \"cluster-logging-operator-ff9846bd-wpfmg\" (UID: \"056447cf-55f2-4ddd-bf36-7f0f637f5ca5\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-wpfmg" Dec 02 18:27:53 crc kubenswrapper[4878]: I1202 18:27:53.356105 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-wpfmg" Dec 02 18:27:53 crc kubenswrapper[4878]: I1202 18:27:53.675628 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-wpfmg"] Dec 02 18:27:53 crc kubenswrapper[4878]: I1202 18:27:53.742983 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:27:53 crc kubenswrapper[4878]: I1202 18:27:53.743085 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:27:53 crc kubenswrapper[4878]: I1202 18:27:53.891868 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-wpfmg" event={"ID":"056447cf-55f2-4ddd-bf36-7f0f637f5ca5","Type":"ContainerStarted","Data":"1122d45ef04e5bdfa6e83eefca9454b44e2e0eb58013c1347f0dff005b18c413"} Dec 02 18:27:56 crc kubenswrapper[4878]: 
I1202 18:27:56.060916 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bc5gf"] Dec 02 18:27:56 crc kubenswrapper[4878]: I1202 18:27:56.061265 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bc5gf" podUID="20d9a3f7-be6d-4355-ae23-38ff91ff1fe4" containerName="registry-server" containerID="cri-o://a149e4fc002cc3c09bb325a334945846e812eb7be65ce92cceaf0c4feb17eb51" gracePeriod=2 Dec 02 18:27:56 crc kubenswrapper[4878]: I1202 18:27:56.924222 4878 generic.go:334] "Generic (PLEG): container finished" podID="20d9a3f7-be6d-4355-ae23-38ff91ff1fe4" containerID="a149e4fc002cc3c09bb325a334945846e812eb7be65ce92cceaf0c4feb17eb51" exitCode=0 Dec 02 18:27:56 crc kubenswrapper[4878]: I1202 18:27:56.924681 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bc5gf" event={"ID":"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4","Type":"ContainerDied","Data":"a149e4fc002cc3c09bb325a334945846e812eb7be65ce92cceaf0c4feb17eb51"} Dec 02 18:27:57 crc kubenswrapper[4878]: I1202 18:27:57.519063 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bc5gf" Dec 02 18:27:57 crc kubenswrapper[4878]: I1202 18:27:57.531158 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20d9a3f7-be6d-4355-ae23-38ff91ff1fe4-catalog-content\") pod \"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4\" (UID: \"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4\") " Dec 02 18:27:57 crc kubenswrapper[4878]: I1202 18:27:57.531218 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20d9a3f7-be6d-4355-ae23-38ff91ff1fe4-utilities\") pod \"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4\" (UID: \"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4\") " Dec 02 18:27:57 crc kubenswrapper[4878]: I1202 18:27:57.531356 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j894\" (UniqueName: \"kubernetes.io/projected/20d9a3f7-be6d-4355-ae23-38ff91ff1fe4-kube-api-access-6j894\") pod \"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4\" (UID: \"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4\") " Dec 02 18:27:57 crc kubenswrapper[4878]: I1202 18:27:57.532417 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20d9a3f7-be6d-4355-ae23-38ff91ff1fe4-utilities" (OuterVolumeSpecName: "utilities") pod "20d9a3f7-be6d-4355-ae23-38ff91ff1fe4" (UID: "20d9a3f7-be6d-4355-ae23-38ff91ff1fe4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:27:57 crc kubenswrapper[4878]: I1202 18:27:57.540323 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20d9a3f7-be6d-4355-ae23-38ff91ff1fe4-kube-api-access-6j894" (OuterVolumeSpecName: "kube-api-access-6j894") pod "20d9a3f7-be6d-4355-ae23-38ff91ff1fe4" (UID: "20d9a3f7-be6d-4355-ae23-38ff91ff1fe4"). InnerVolumeSpecName "kube-api-access-6j894". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:27:57 crc kubenswrapper[4878]: I1202 18:27:57.634807 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j894\" (UniqueName: \"kubernetes.io/projected/20d9a3f7-be6d-4355-ae23-38ff91ff1fe4-kube-api-access-6j894\") on node \"crc\" DevicePath \"\"" Dec 02 18:27:57 crc kubenswrapper[4878]: I1202 18:27:57.634858 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20d9a3f7-be6d-4355-ae23-38ff91ff1fe4-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:27:57 crc kubenswrapper[4878]: I1202 18:27:57.658512 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20d9a3f7-be6d-4355-ae23-38ff91ff1fe4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20d9a3f7-be6d-4355-ae23-38ff91ff1fe4" (UID: "20d9a3f7-be6d-4355-ae23-38ff91ff1fe4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:27:57 crc kubenswrapper[4878]: I1202 18:27:57.736650 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20d9a3f7-be6d-4355-ae23-38ff91ff1fe4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:27:57 crc kubenswrapper[4878]: I1202 18:27:57.936493 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bc5gf" event={"ID":"20d9a3f7-be6d-4355-ae23-38ff91ff1fe4","Type":"ContainerDied","Data":"b5ff08a714819aa1db9fc5b2ccb124f3e4b92768af849a833d2c30086f099d48"} Dec 02 18:27:57 crc kubenswrapper[4878]: I1202 18:27:57.936566 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bc5gf" Dec 02 18:27:57 crc kubenswrapper[4878]: I1202 18:27:57.936579 4878 scope.go:117] "RemoveContainer" containerID="a149e4fc002cc3c09bb325a334945846e812eb7be65ce92cceaf0c4feb17eb51" Dec 02 18:27:57 crc kubenswrapper[4878]: I1202 18:27:57.972637 4878 scope.go:117] "RemoveContainer" containerID="0d64a0358f392806020da31537ee25c71c9f30757896fa77487188a87d22acfd" Dec 02 18:27:57 crc kubenswrapper[4878]: I1202 18:27:57.979369 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bc5gf"] Dec 02 18:27:57 crc kubenswrapper[4878]: I1202 18:27:57.989293 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bc5gf"] Dec 02 18:27:57 crc kubenswrapper[4878]: I1202 18:27:57.994504 4878 scope.go:117] "RemoveContainer" containerID="2202862d50c907b70c29f08e232c4f0ff49f41681db57da8cef79322e958d63c" Dec 02 18:27:58 crc kubenswrapper[4878]: I1202 18:27:58.951172 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20d9a3f7-be6d-4355-ae23-38ff91ff1fe4" path="/var/lib/kubelet/pods/20d9a3f7-be6d-4355-ae23-38ff91ff1fe4/volumes" Dec 02 18:27:59 crc kubenswrapper[4878]: I1202 18:27:59.277540 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p7jtp" Dec 02 18:28:00 crc kubenswrapper[4878]: I1202 18:28:00.082152 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w5bg9"] Dec 02 18:28:00 crc kubenswrapper[4878]: E1202 18:28:00.082444 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20d9a3f7-be6d-4355-ae23-38ff91ff1fe4" containerName="registry-server" Dec 02 18:28:00 crc kubenswrapper[4878]: I1202 18:28:00.082455 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="20d9a3f7-be6d-4355-ae23-38ff91ff1fe4" containerName="registry-server" Dec 02 18:28:00 crc 
kubenswrapper[4878]: E1202 18:28:00.082471 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20d9a3f7-be6d-4355-ae23-38ff91ff1fe4" containerName="extract-content" Dec 02 18:28:00 crc kubenswrapper[4878]: I1202 18:28:00.082480 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="20d9a3f7-be6d-4355-ae23-38ff91ff1fe4" containerName="extract-content" Dec 02 18:28:00 crc kubenswrapper[4878]: E1202 18:28:00.082490 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20d9a3f7-be6d-4355-ae23-38ff91ff1fe4" containerName="extract-utilities" Dec 02 18:28:00 crc kubenswrapper[4878]: I1202 18:28:00.082496 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="20d9a3f7-be6d-4355-ae23-38ff91ff1fe4" containerName="extract-utilities" Dec 02 18:28:00 crc kubenswrapper[4878]: I1202 18:28:00.082620 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="20d9a3f7-be6d-4355-ae23-38ff91ff1fe4" containerName="registry-server" Dec 02 18:28:00 crc kubenswrapper[4878]: I1202 18:28:00.083691 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w5bg9" Dec 02 18:28:00 crc kubenswrapper[4878]: I1202 18:28:00.096514 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w5bg9"] Dec 02 18:28:00 crc kubenswrapper[4878]: I1202 18:28:00.179348 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76f07be8-4068-4902-a7a6-693cd1c6966b-utilities\") pod \"redhat-marketplace-w5bg9\" (UID: \"76f07be8-4068-4902-a7a6-693cd1c6966b\") " pod="openshift-marketplace/redhat-marketplace-w5bg9" Dec 02 18:28:00 crc kubenswrapper[4878]: I1202 18:28:00.179629 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76f07be8-4068-4902-a7a6-693cd1c6966b-catalog-content\") pod \"redhat-marketplace-w5bg9\" (UID: \"76f07be8-4068-4902-a7a6-693cd1c6966b\") " pod="openshift-marketplace/redhat-marketplace-w5bg9" Dec 02 18:28:00 crc kubenswrapper[4878]: I1202 18:28:00.179742 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ngmt\" (UniqueName: \"kubernetes.io/projected/76f07be8-4068-4902-a7a6-693cd1c6966b-kube-api-access-4ngmt\") pod \"redhat-marketplace-w5bg9\" (UID: \"76f07be8-4068-4902-a7a6-693cd1c6966b\") " pod="openshift-marketplace/redhat-marketplace-w5bg9" Dec 02 18:28:00 crc kubenswrapper[4878]: I1202 18:28:00.281630 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76f07be8-4068-4902-a7a6-693cd1c6966b-utilities\") pod \"redhat-marketplace-w5bg9\" (UID: \"76f07be8-4068-4902-a7a6-693cd1c6966b\") " pod="openshift-marketplace/redhat-marketplace-w5bg9" Dec 02 18:28:00 crc kubenswrapper[4878]: I1202 18:28:00.281748 4878 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76f07be8-4068-4902-a7a6-693cd1c6966b-catalog-content\") pod \"redhat-marketplace-w5bg9\" (UID: \"76f07be8-4068-4902-a7a6-693cd1c6966b\") " pod="openshift-marketplace/redhat-marketplace-w5bg9" Dec 02 18:28:00 crc kubenswrapper[4878]: I1202 18:28:00.281797 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ngmt\" (UniqueName: \"kubernetes.io/projected/76f07be8-4068-4902-a7a6-693cd1c6966b-kube-api-access-4ngmt\") pod \"redhat-marketplace-w5bg9\" (UID: \"76f07be8-4068-4902-a7a6-693cd1c6966b\") " pod="openshift-marketplace/redhat-marketplace-w5bg9" Dec 02 18:28:00 crc kubenswrapper[4878]: I1202 18:28:00.282414 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76f07be8-4068-4902-a7a6-693cd1c6966b-utilities\") pod \"redhat-marketplace-w5bg9\" (UID: \"76f07be8-4068-4902-a7a6-693cd1c6966b\") " pod="openshift-marketplace/redhat-marketplace-w5bg9" Dec 02 18:28:00 crc kubenswrapper[4878]: I1202 18:28:00.282489 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76f07be8-4068-4902-a7a6-693cd1c6966b-catalog-content\") pod \"redhat-marketplace-w5bg9\" (UID: \"76f07be8-4068-4902-a7a6-693cd1c6966b\") " pod="openshift-marketplace/redhat-marketplace-w5bg9" Dec 02 18:28:00 crc kubenswrapper[4878]: I1202 18:28:00.317762 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ngmt\" (UniqueName: \"kubernetes.io/projected/76f07be8-4068-4902-a7a6-693cd1c6966b-kube-api-access-4ngmt\") pod \"redhat-marketplace-w5bg9\" (UID: \"76f07be8-4068-4902-a7a6-693cd1c6966b\") " pod="openshift-marketplace/redhat-marketplace-w5bg9" Dec 02 18:28:00 crc kubenswrapper[4878]: I1202 18:28:00.403742 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w5bg9" Dec 02 18:28:04 crc kubenswrapper[4878]: I1202 18:28:04.267352 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p7jtp"] Dec 02 18:28:04 crc kubenswrapper[4878]: I1202 18:28:04.268407 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p7jtp" podUID="798768bf-91b5-42ea-a0c4-20feb23686a6" containerName="registry-server" containerID="cri-o://85a01b79ffa5165d59eda1e20f6023241e3a63208034771a934f9783b1809613" gracePeriod=2 Dec 02 18:28:04 crc kubenswrapper[4878]: I1202 18:28:04.836779 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p7jtp" Dec 02 18:28:04 crc kubenswrapper[4878]: I1202 18:28:04.974664 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/798768bf-91b5-42ea-a0c4-20feb23686a6-catalog-content\") pod \"798768bf-91b5-42ea-a0c4-20feb23686a6\" (UID: \"798768bf-91b5-42ea-a0c4-20feb23686a6\") " Dec 02 18:28:04 crc kubenswrapper[4878]: I1202 18:28:04.975272 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/798768bf-91b5-42ea-a0c4-20feb23686a6-utilities\") pod \"798768bf-91b5-42ea-a0c4-20feb23686a6\" (UID: \"798768bf-91b5-42ea-a0c4-20feb23686a6\") " Dec 02 18:28:04 crc kubenswrapper[4878]: I1202 18:28:04.975381 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcffl\" (UniqueName: \"kubernetes.io/projected/798768bf-91b5-42ea-a0c4-20feb23686a6-kube-api-access-kcffl\") pod \"798768bf-91b5-42ea-a0c4-20feb23686a6\" (UID: \"798768bf-91b5-42ea-a0c4-20feb23686a6\") " Dec 02 18:28:04 crc kubenswrapper[4878]: I1202 18:28:04.976215 4878 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/798768bf-91b5-42ea-a0c4-20feb23686a6-utilities" (OuterVolumeSpecName: "utilities") pod "798768bf-91b5-42ea-a0c4-20feb23686a6" (UID: "798768bf-91b5-42ea-a0c4-20feb23686a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:28:04 crc kubenswrapper[4878]: I1202 18:28:04.983632 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/798768bf-91b5-42ea-a0c4-20feb23686a6-kube-api-access-kcffl" (OuterVolumeSpecName: "kube-api-access-kcffl") pod "798768bf-91b5-42ea-a0c4-20feb23686a6" (UID: "798768bf-91b5-42ea-a0c4-20feb23686a6"). InnerVolumeSpecName "kube-api-access-kcffl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:28:04 crc kubenswrapper[4878]: I1202 18:28:04.999312 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w5bg9"] Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.036324 4878 generic.go:334] "Generic (PLEG): container finished" podID="798768bf-91b5-42ea-a0c4-20feb23686a6" containerID="85a01b79ffa5165d59eda1e20f6023241e3a63208034771a934f9783b1809613" exitCode=0 Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.036639 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7jtp" event={"ID":"798768bf-91b5-42ea-a0c4-20feb23686a6","Type":"ContainerDied","Data":"85a01b79ffa5165d59eda1e20f6023241e3a63208034771a934f9783b1809613"} Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.036738 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p7jtp" Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.036776 4878 scope.go:117] "RemoveContainer" containerID="85a01b79ffa5165d59eda1e20f6023241e3a63208034771a934f9783b1809613" Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.036754 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7jtp" event={"ID":"798768bf-91b5-42ea-a0c4-20feb23686a6","Type":"ContainerDied","Data":"6237712bf4d76906c67d1d0784c872dad199db30007bf20f63e5fd80d42c073a"} Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.058605 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m" event={"ID":"6ace3da2-70e9-4d80-a8ad-5a8e1bb062df","Type":"ContainerStarted","Data":"6ebed23bc211fa69931d91454eb0dc2d26bb172cb79f906d896fc8ad80586060"} Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.059183 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/798768bf-91b5-42ea-a0c4-20feb23686a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "798768bf-91b5-42ea-a0c4-20feb23686a6" (UID: "798768bf-91b5-42ea-a0c4-20feb23686a6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.068669 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-wpfmg" event={"ID":"056447cf-55f2-4ddd-bf36-7f0f637f5ca5","Type":"ContainerStarted","Data":"73cdcd0a47af4dbb33b1777e6ab87db5af7e6cc6b36d50091dc62d6a0667eef0"} Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.076016 4878 scope.go:117] "RemoveContainer" containerID="d2bd70591313ee2f1b8e2cd275ef438ec3c63c07125455fcfaea6b07d6fe1f02" Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.079207 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/798768bf-91b5-42ea-a0c4-20feb23686a6-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.079275 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcffl\" (UniqueName: \"kubernetes.io/projected/798768bf-91b5-42ea-a0c4-20feb23686a6-kube-api-access-kcffl\") on node \"crc\" DevicePath \"\"" Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.079290 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/798768bf-91b5-42ea-a0c4-20feb23686a6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.123470 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-ff9846bd-wpfmg" podStartSLOduration=1.251433759 podStartE2EDuration="12.123442257s" podCreationTimestamp="2025-12-02 18:27:53 +0000 UTC" firstStartedPulling="2025-12-02 18:27:53.69168106 +0000 UTC m=+783.381299951" lastFinishedPulling="2025-12-02 18:28:04.563689568 +0000 UTC m=+794.253308449" observedRunningTime="2025-12-02 18:28:05.118807762 +0000 UTC m=+794.808426663" watchObservedRunningTime="2025-12-02 18:28:05.123442257 +0000 
UTC m=+794.813061148" Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.131553 4878 scope.go:117] "RemoveContainer" containerID="31e053d300808405f923ae4c9b6544d6beabc18d38f66617e0eadea5a05d54e4" Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.172998 4878 scope.go:117] "RemoveContainer" containerID="85a01b79ffa5165d59eda1e20f6023241e3a63208034771a934f9783b1809613" Dec 02 18:28:05 crc kubenswrapper[4878]: E1202 18:28:05.177091 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85a01b79ffa5165d59eda1e20f6023241e3a63208034771a934f9783b1809613\": container with ID starting with 85a01b79ffa5165d59eda1e20f6023241e3a63208034771a934f9783b1809613 not found: ID does not exist" containerID="85a01b79ffa5165d59eda1e20f6023241e3a63208034771a934f9783b1809613" Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.177150 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85a01b79ffa5165d59eda1e20f6023241e3a63208034771a934f9783b1809613"} err="failed to get container status \"85a01b79ffa5165d59eda1e20f6023241e3a63208034771a934f9783b1809613\": rpc error: code = NotFound desc = could not find container \"85a01b79ffa5165d59eda1e20f6023241e3a63208034771a934f9783b1809613\": container with ID starting with 85a01b79ffa5165d59eda1e20f6023241e3a63208034771a934f9783b1809613 not found: ID does not exist" Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.177183 4878 scope.go:117] "RemoveContainer" containerID="d2bd70591313ee2f1b8e2cd275ef438ec3c63c07125455fcfaea6b07d6fe1f02" Dec 02 18:28:05 crc kubenswrapper[4878]: E1202 18:28:05.178416 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2bd70591313ee2f1b8e2cd275ef438ec3c63c07125455fcfaea6b07d6fe1f02\": container with ID starting with d2bd70591313ee2f1b8e2cd275ef438ec3c63c07125455fcfaea6b07d6fe1f02 not found: ID does not 
exist" containerID="d2bd70591313ee2f1b8e2cd275ef438ec3c63c07125455fcfaea6b07d6fe1f02" Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.178472 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2bd70591313ee2f1b8e2cd275ef438ec3c63c07125455fcfaea6b07d6fe1f02"} err="failed to get container status \"d2bd70591313ee2f1b8e2cd275ef438ec3c63c07125455fcfaea6b07d6fe1f02\": rpc error: code = NotFound desc = could not find container \"d2bd70591313ee2f1b8e2cd275ef438ec3c63c07125455fcfaea6b07d6fe1f02\": container with ID starting with d2bd70591313ee2f1b8e2cd275ef438ec3c63c07125455fcfaea6b07d6fe1f02 not found: ID does not exist" Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.178509 4878 scope.go:117] "RemoveContainer" containerID="31e053d300808405f923ae4c9b6544d6beabc18d38f66617e0eadea5a05d54e4" Dec 02 18:28:05 crc kubenswrapper[4878]: E1202 18:28:05.181512 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31e053d300808405f923ae4c9b6544d6beabc18d38f66617e0eadea5a05d54e4\": container with ID starting with 31e053d300808405f923ae4c9b6544d6beabc18d38f66617e0eadea5a05d54e4 not found: ID does not exist" containerID="31e053d300808405f923ae4c9b6544d6beabc18d38f66617e0eadea5a05d54e4" Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.181587 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31e053d300808405f923ae4c9b6544d6beabc18d38f66617e0eadea5a05d54e4"} err="failed to get container status \"31e053d300808405f923ae4c9b6544d6beabc18d38f66617e0eadea5a05d54e4\": rpc error: code = NotFound desc = could not find container \"31e053d300808405f923ae4c9b6544d6beabc18d38f66617e0eadea5a05d54e4\": container with ID starting with 31e053d300808405f923ae4c9b6544d6beabc18d38f66617e0eadea5a05d54e4 not found: ID does not exist" Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.367865 4878 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p7jtp"] Dec 02 18:28:05 crc kubenswrapper[4878]: I1202 18:28:05.371859 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p7jtp"] Dec 02 18:28:06 crc kubenswrapper[4878]: I1202 18:28:06.079046 4878 generic.go:334] "Generic (PLEG): container finished" podID="76f07be8-4068-4902-a7a6-693cd1c6966b" containerID="56173b847216c20576824e1fd6975555565d81c20feddd912fbd139a7a5d8ff5" exitCode=0 Dec 02 18:28:06 crc kubenswrapper[4878]: I1202 18:28:06.079156 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w5bg9" event={"ID":"76f07be8-4068-4902-a7a6-693cd1c6966b","Type":"ContainerDied","Data":"56173b847216c20576824e1fd6975555565d81c20feddd912fbd139a7a5d8ff5"} Dec 02 18:28:06 crc kubenswrapper[4878]: I1202 18:28:06.079248 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w5bg9" event={"ID":"76f07be8-4068-4902-a7a6-693cd1c6966b","Type":"ContainerStarted","Data":"bde19040277ca46c1139eb29f58f219542e16371e438975b162ec868138122bd"} Dec 02 18:28:06 crc kubenswrapper[4878]: I1202 18:28:06.949531 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="798768bf-91b5-42ea-a0c4-20feb23686a6" path="/var/lib/kubelet/pods/798768bf-91b5-42ea-a0c4-20feb23686a6/volumes" Dec 02 18:28:08 crc kubenswrapper[4878]: I1202 18:28:08.113149 4878 generic.go:334] "Generic (PLEG): container finished" podID="76f07be8-4068-4902-a7a6-693cd1c6966b" containerID="d42c277409997e33889dfae265c62c686b9531ee14f1d274dc47c3e3f487a55b" exitCode=0 Dec 02 18:28:08 crc kubenswrapper[4878]: I1202 18:28:08.113224 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w5bg9" 
event={"ID":"76f07be8-4068-4902-a7a6-693cd1c6966b","Type":"ContainerDied","Data":"d42c277409997e33889dfae265c62c686b9531ee14f1d274dc47c3e3f487a55b"} Dec 02 18:28:13 crc kubenswrapper[4878]: I1202 18:28:13.163802 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w5bg9" event={"ID":"76f07be8-4068-4902-a7a6-693cd1c6966b","Type":"ContainerStarted","Data":"2e42758b0419242ee1c915c54c9cb718c611db62d3181886445e5045b3f34588"} Dec 02 18:28:13 crc kubenswrapper[4878]: I1202 18:28:13.168961 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m" event={"ID":"6ace3da2-70e9-4d80-a8ad-5a8e1bb062df","Type":"ContainerStarted","Data":"021fdc371b987f7396386c4f8496893b54538347b5cc0d02a71583746698dd10"} Dec 02 18:28:13 crc kubenswrapper[4878]: I1202 18:28:13.169200 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m" Dec 02 18:28:13 crc kubenswrapper[4878]: I1202 18:28:13.173269 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m" Dec 02 18:28:13 crc kubenswrapper[4878]: I1202 18:28:13.190887 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w5bg9" podStartSLOduration=7.066557075 podStartE2EDuration="13.190856326s" podCreationTimestamp="2025-12-02 18:28:00 +0000 UTC" firstStartedPulling="2025-12-02 18:28:06.082756683 +0000 UTC m=+795.772375574" lastFinishedPulling="2025-12-02 18:28:12.207055944 +0000 UTC m=+801.896674825" observedRunningTime="2025-12-02 18:28:13.184218498 +0000 UTC m=+802.873837379" watchObservedRunningTime="2025-12-02 18:28:13.190856326 +0000 UTC m=+802.880475207" Dec 02 18:28:13 crc kubenswrapper[4878]: I1202 18:28:13.214080 4878 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-846f878689-bhh7m" podStartSLOduration=2.219006725 podStartE2EDuration="22.214055711s" podCreationTimestamp="2025-12-02 18:27:51 +0000 UTC" firstStartedPulling="2025-12-02 18:27:52.397822724 +0000 UTC m=+782.087441595" lastFinishedPulling="2025-12-02 18:28:12.3928717 +0000 UTC m=+802.082490581" observedRunningTime="2025-12-02 18:28:13.212735761 +0000 UTC m=+802.902354642" watchObservedRunningTime="2025-12-02 18:28:13.214055711 +0000 UTC m=+802.903674592" Dec 02 18:28:18 crc kubenswrapper[4878]: I1202 18:28:18.158679 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Dec 02 18:28:18 crc kubenswrapper[4878]: E1202 18:28:18.159897 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798768bf-91b5-42ea-a0c4-20feb23686a6" containerName="extract-content" Dec 02 18:28:18 crc kubenswrapper[4878]: I1202 18:28:18.159923 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="798768bf-91b5-42ea-a0c4-20feb23686a6" containerName="extract-content" Dec 02 18:28:18 crc kubenswrapper[4878]: E1202 18:28:18.159933 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798768bf-91b5-42ea-a0c4-20feb23686a6" containerName="registry-server" Dec 02 18:28:18 crc kubenswrapper[4878]: I1202 18:28:18.159940 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="798768bf-91b5-42ea-a0c4-20feb23686a6" containerName="registry-server" Dec 02 18:28:18 crc kubenswrapper[4878]: E1202 18:28:18.159949 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798768bf-91b5-42ea-a0c4-20feb23686a6" containerName="extract-utilities" Dec 02 18:28:18 crc kubenswrapper[4878]: I1202 18:28:18.159957 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="798768bf-91b5-42ea-a0c4-20feb23686a6" containerName="extract-utilities" Dec 02 18:28:18 crc kubenswrapper[4878]: I1202 18:28:18.160122 4878 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="798768bf-91b5-42ea-a0c4-20feb23686a6" containerName="registry-server" Dec 02 18:28:18 crc kubenswrapper[4878]: I1202 18:28:18.160767 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Dec 02 18:28:18 crc kubenswrapper[4878]: I1202 18:28:18.163419 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Dec 02 18:28:18 crc kubenswrapper[4878]: I1202 18:28:18.163709 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Dec 02 18:28:18 crc kubenswrapper[4878]: I1202 18:28:18.170697 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 02 18:28:18 crc kubenswrapper[4878]: I1202 18:28:18.344494 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwsbq\" (UniqueName: \"kubernetes.io/projected/e3b4eaaa-a0cc-4717-9823-292843c3e8bf-kube-api-access-mwsbq\") pod \"minio\" (UID: \"e3b4eaaa-a0cc-4717-9823-292843c3e8bf\") " pod="minio-dev/minio" Dec 02 18:28:18 crc kubenswrapper[4878]: I1202 18:28:18.344572 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-539a5d05-2d0a-4998-ac52-5ed708774294\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-539a5d05-2d0a-4998-ac52-5ed708774294\") pod \"minio\" (UID: \"e3b4eaaa-a0cc-4717-9823-292843c3e8bf\") " pod="minio-dev/minio" Dec 02 18:28:18 crc kubenswrapper[4878]: I1202 18:28:18.446075 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwsbq\" (UniqueName: \"kubernetes.io/projected/e3b4eaaa-a0cc-4717-9823-292843c3e8bf-kube-api-access-mwsbq\") pod \"minio\" (UID: \"e3b4eaaa-a0cc-4717-9823-292843c3e8bf\") " pod="minio-dev/minio" Dec 02 18:28:18 crc kubenswrapper[4878]: I1202 18:28:18.446159 4878 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-539a5d05-2d0a-4998-ac52-5ed708774294\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-539a5d05-2d0a-4998-ac52-5ed708774294\") pod \"minio\" (UID: \"e3b4eaaa-a0cc-4717-9823-292843c3e8bf\") " pod="minio-dev/minio" Dec 02 18:28:18 crc kubenswrapper[4878]: I1202 18:28:18.451338 4878 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 02 18:28:18 crc kubenswrapper[4878]: I1202 18:28:18.451392 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-539a5d05-2d0a-4998-ac52-5ed708774294\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-539a5d05-2d0a-4998-ac52-5ed708774294\") pod \"minio\" (UID: \"e3b4eaaa-a0cc-4717-9823-292843c3e8bf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a7f6c7c1c52b4d6600ee9af070bddad4afd24776ed241d1a11d5ce461713c09f/globalmount\"" pod="minio-dev/minio" Dec 02 18:28:18 crc kubenswrapper[4878]: I1202 18:28:18.473120 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwsbq\" (UniqueName: \"kubernetes.io/projected/e3b4eaaa-a0cc-4717-9823-292843c3e8bf-kube-api-access-mwsbq\") pod \"minio\" (UID: \"e3b4eaaa-a0cc-4717-9823-292843c3e8bf\") " pod="minio-dev/minio" Dec 02 18:28:18 crc kubenswrapper[4878]: I1202 18:28:18.483515 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-539a5d05-2d0a-4998-ac52-5ed708774294\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-539a5d05-2d0a-4998-ac52-5ed708774294\") pod \"minio\" (UID: \"e3b4eaaa-a0cc-4717-9823-292843c3e8bf\") " pod="minio-dev/minio" Dec 02 18:28:18 crc kubenswrapper[4878]: I1202 18:28:18.492228 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Dec 02 18:28:18 crc kubenswrapper[4878]: I1202 18:28:18.951611 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 02 18:28:19 crc kubenswrapper[4878]: I1202 18:28:19.218619 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"e3b4eaaa-a0cc-4717-9823-292843c3e8bf","Type":"ContainerStarted","Data":"8788356b8a4f060d17cf858accf75141fdbf73c0133cc941cc6e1ce36f3e19cf"} Dec 02 18:28:20 crc kubenswrapper[4878]: I1202 18:28:20.404491 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w5bg9" Dec 02 18:28:20 crc kubenswrapper[4878]: I1202 18:28:20.404823 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w5bg9" Dec 02 18:28:20 crc kubenswrapper[4878]: I1202 18:28:20.467347 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w5bg9" Dec 02 18:28:21 crc kubenswrapper[4878]: I1202 18:28:21.282091 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w5bg9" Dec 02 18:28:21 crc kubenswrapper[4878]: I1202 18:28:21.339124 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w5bg9"] Dec 02 18:28:23 crc kubenswrapper[4878]: I1202 18:28:23.249021 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w5bg9" podUID="76f07be8-4068-4902-a7a6-693cd1c6966b" containerName="registry-server" containerID="cri-o://2e42758b0419242ee1c915c54c9cb718c611db62d3181886445e5045b3f34588" gracePeriod=2 Dec 02 18:28:23 crc kubenswrapper[4878]: I1202 18:28:23.249795 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" 
event={"ID":"e3b4eaaa-a0cc-4717-9823-292843c3e8bf","Type":"ContainerStarted","Data":"383bad3da4af5e8110c6792b7bf884eb564ae2fdd130f2e712dc546957aed86e"} Dec 02 18:28:23 crc kubenswrapper[4878]: I1202 18:28:23.280490 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.990866867 podStartE2EDuration="8.280470196s" podCreationTimestamp="2025-12-02 18:28:15 +0000 UTC" firstStartedPulling="2025-12-02 18:28:18.956483831 +0000 UTC m=+808.646102712" lastFinishedPulling="2025-12-02 18:28:22.24608716 +0000 UTC m=+811.935706041" observedRunningTime="2025-12-02 18:28:23.27549971 +0000 UTC m=+812.965118591" watchObservedRunningTime="2025-12-02 18:28:23.280470196 +0000 UTC m=+812.970089077" Dec 02 18:28:23 crc kubenswrapper[4878]: I1202 18:28:23.646270 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w5bg9" Dec 02 18:28:23 crc kubenswrapper[4878]: I1202 18:28:23.742306 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:28:23 crc kubenswrapper[4878]: I1202 18:28:23.742675 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:28:23 crc kubenswrapper[4878]: I1202 18:28:23.847931 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76f07be8-4068-4902-a7a6-693cd1c6966b-catalog-content\") pod \"76f07be8-4068-4902-a7a6-693cd1c6966b\" 
(UID: \"76f07be8-4068-4902-a7a6-693cd1c6966b\") " Dec 02 18:28:23 crc kubenswrapper[4878]: I1202 18:28:23.848007 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76f07be8-4068-4902-a7a6-693cd1c6966b-utilities\") pod \"76f07be8-4068-4902-a7a6-693cd1c6966b\" (UID: \"76f07be8-4068-4902-a7a6-693cd1c6966b\") " Dec 02 18:28:23 crc kubenswrapper[4878]: I1202 18:28:23.848138 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ngmt\" (UniqueName: \"kubernetes.io/projected/76f07be8-4068-4902-a7a6-693cd1c6966b-kube-api-access-4ngmt\") pod \"76f07be8-4068-4902-a7a6-693cd1c6966b\" (UID: \"76f07be8-4068-4902-a7a6-693cd1c6966b\") " Dec 02 18:28:23 crc kubenswrapper[4878]: I1202 18:28:23.848985 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76f07be8-4068-4902-a7a6-693cd1c6966b-utilities" (OuterVolumeSpecName: "utilities") pod "76f07be8-4068-4902-a7a6-693cd1c6966b" (UID: "76f07be8-4068-4902-a7a6-693cd1c6966b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:28:23 crc kubenswrapper[4878]: I1202 18:28:23.861790 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76f07be8-4068-4902-a7a6-693cd1c6966b-kube-api-access-4ngmt" (OuterVolumeSpecName: "kube-api-access-4ngmt") pod "76f07be8-4068-4902-a7a6-693cd1c6966b" (UID: "76f07be8-4068-4902-a7a6-693cd1c6966b"). InnerVolumeSpecName "kube-api-access-4ngmt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:28:23 crc kubenswrapper[4878]: I1202 18:28:23.873270 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76f07be8-4068-4902-a7a6-693cd1c6966b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76f07be8-4068-4902-a7a6-693cd1c6966b" (UID: "76f07be8-4068-4902-a7a6-693cd1c6966b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:28:23 crc kubenswrapper[4878]: I1202 18:28:23.950690 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76f07be8-4068-4902-a7a6-693cd1c6966b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:28:23 crc kubenswrapper[4878]: I1202 18:28:23.950738 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76f07be8-4068-4902-a7a6-693cd1c6966b-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:28:23 crc kubenswrapper[4878]: I1202 18:28:23.950751 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ngmt\" (UniqueName: \"kubernetes.io/projected/76f07be8-4068-4902-a7a6-693cd1c6966b-kube-api-access-4ngmt\") on node \"crc\" DevicePath \"\"" Dec 02 18:28:24 crc kubenswrapper[4878]: I1202 18:28:24.257898 4878 generic.go:334] "Generic (PLEG): container finished" podID="76f07be8-4068-4902-a7a6-693cd1c6966b" containerID="2e42758b0419242ee1c915c54c9cb718c611db62d3181886445e5045b3f34588" exitCode=0 Dec 02 18:28:24 crc kubenswrapper[4878]: I1202 18:28:24.257974 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w5bg9" Dec 02 18:28:24 crc kubenswrapper[4878]: I1202 18:28:24.257968 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w5bg9" event={"ID":"76f07be8-4068-4902-a7a6-693cd1c6966b","Type":"ContainerDied","Data":"2e42758b0419242ee1c915c54c9cb718c611db62d3181886445e5045b3f34588"} Dec 02 18:28:24 crc kubenswrapper[4878]: I1202 18:28:24.258072 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w5bg9" event={"ID":"76f07be8-4068-4902-a7a6-693cd1c6966b","Type":"ContainerDied","Data":"bde19040277ca46c1139eb29f58f219542e16371e438975b162ec868138122bd"} Dec 02 18:28:24 crc kubenswrapper[4878]: I1202 18:28:24.258098 4878 scope.go:117] "RemoveContainer" containerID="2e42758b0419242ee1c915c54c9cb718c611db62d3181886445e5045b3f34588" Dec 02 18:28:24 crc kubenswrapper[4878]: I1202 18:28:24.281311 4878 scope.go:117] "RemoveContainer" containerID="d42c277409997e33889dfae265c62c686b9531ee14f1d274dc47c3e3f487a55b" Dec 02 18:28:24 crc kubenswrapper[4878]: I1202 18:28:24.299126 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w5bg9"] Dec 02 18:28:24 crc kubenswrapper[4878]: I1202 18:28:24.306537 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w5bg9"] Dec 02 18:28:24 crc kubenswrapper[4878]: I1202 18:28:24.323085 4878 scope.go:117] "RemoveContainer" containerID="56173b847216c20576824e1fd6975555565d81c20feddd912fbd139a7a5d8ff5" Dec 02 18:28:24 crc kubenswrapper[4878]: I1202 18:28:24.361592 4878 scope.go:117] "RemoveContainer" containerID="2e42758b0419242ee1c915c54c9cb718c611db62d3181886445e5045b3f34588" Dec 02 18:28:24 crc kubenswrapper[4878]: E1202 18:28:24.362610 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2e42758b0419242ee1c915c54c9cb718c611db62d3181886445e5045b3f34588\": container with ID starting with 2e42758b0419242ee1c915c54c9cb718c611db62d3181886445e5045b3f34588 not found: ID does not exist" containerID="2e42758b0419242ee1c915c54c9cb718c611db62d3181886445e5045b3f34588" Dec 02 18:28:24 crc kubenswrapper[4878]: I1202 18:28:24.362658 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e42758b0419242ee1c915c54c9cb718c611db62d3181886445e5045b3f34588"} err="failed to get container status \"2e42758b0419242ee1c915c54c9cb718c611db62d3181886445e5045b3f34588\": rpc error: code = NotFound desc = could not find container \"2e42758b0419242ee1c915c54c9cb718c611db62d3181886445e5045b3f34588\": container with ID starting with 2e42758b0419242ee1c915c54c9cb718c611db62d3181886445e5045b3f34588 not found: ID does not exist" Dec 02 18:28:24 crc kubenswrapper[4878]: I1202 18:28:24.362707 4878 scope.go:117] "RemoveContainer" containerID="d42c277409997e33889dfae265c62c686b9531ee14f1d274dc47c3e3f487a55b" Dec 02 18:28:24 crc kubenswrapper[4878]: E1202 18:28:24.363044 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d42c277409997e33889dfae265c62c686b9531ee14f1d274dc47c3e3f487a55b\": container with ID starting with d42c277409997e33889dfae265c62c686b9531ee14f1d274dc47c3e3f487a55b not found: ID does not exist" containerID="d42c277409997e33889dfae265c62c686b9531ee14f1d274dc47c3e3f487a55b" Dec 02 18:28:24 crc kubenswrapper[4878]: I1202 18:28:24.363089 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d42c277409997e33889dfae265c62c686b9531ee14f1d274dc47c3e3f487a55b"} err="failed to get container status \"d42c277409997e33889dfae265c62c686b9531ee14f1d274dc47c3e3f487a55b\": rpc error: code = NotFound desc = could not find container \"d42c277409997e33889dfae265c62c686b9531ee14f1d274dc47c3e3f487a55b\": container with ID 
starting with d42c277409997e33889dfae265c62c686b9531ee14f1d274dc47c3e3f487a55b not found: ID does not exist" Dec 02 18:28:24 crc kubenswrapper[4878]: I1202 18:28:24.363109 4878 scope.go:117] "RemoveContainer" containerID="56173b847216c20576824e1fd6975555565d81c20feddd912fbd139a7a5d8ff5" Dec 02 18:28:24 crc kubenswrapper[4878]: E1202 18:28:24.363458 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56173b847216c20576824e1fd6975555565d81c20feddd912fbd139a7a5d8ff5\": container with ID starting with 56173b847216c20576824e1fd6975555565d81c20feddd912fbd139a7a5d8ff5 not found: ID does not exist" containerID="56173b847216c20576824e1fd6975555565d81c20feddd912fbd139a7a5d8ff5" Dec 02 18:28:24 crc kubenswrapper[4878]: I1202 18:28:24.363508 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56173b847216c20576824e1fd6975555565d81c20feddd912fbd139a7a5d8ff5"} err="failed to get container status \"56173b847216c20576824e1fd6975555565d81c20feddd912fbd139a7a5d8ff5\": rpc error: code = NotFound desc = could not find container \"56173b847216c20576824e1fd6975555565d81c20feddd912fbd139a7a5d8ff5\": container with ID starting with 56173b847216c20576824e1fd6975555565d81c20feddd912fbd139a7a5d8ff5 not found: ID does not exist" Dec 02 18:28:24 crc kubenswrapper[4878]: I1202 18:28:24.947080 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76f07be8-4068-4902-a7a6-693cd1c6966b" path="/var/lib/kubelet/pods/76f07be8-4068-4902-a7a6-693cd1c6966b/volumes" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.224262 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-tl222"] Dec 02 18:28:26 crc kubenswrapper[4878]: E1202 18:28:26.224598 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76f07be8-4068-4902-a7a6-693cd1c6966b" containerName="registry-server" Dec 02 18:28:26 
crc kubenswrapper[4878]: I1202 18:28:26.224612 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="76f07be8-4068-4902-a7a6-693cd1c6966b" containerName="registry-server" Dec 02 18:28:26 crc kubenswrapper[4878]: E1202 18:28:26.224638 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76f07be8-4068-4902-a7a6-693cd1c6966b" containerName="extract-utilities" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.224644 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="76f07be8-4068-4902-a7a6-693cd1c6966b" containerName="extract-utilities" Dec 02 18:28:26 crc kubenswrapper[4878]: E1202 18:28:26.224655 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76f07be8-4068-4902-a7a6-693cd1c6966b" containerName="extract-content" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.224662 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="76f07be8-4068-4902-a7a6-693cd1c6966b" containerName="extract-content" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.224785 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="76f07be8-4068-4902-a7a6-693cd1c6966b" containerName="registry-server" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.225357 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.231226 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-wxnnj" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.231698 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.232484 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.232689 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.232877 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.260939 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-tl222"] Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.397833 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgjrs\" (UniqueName: \"kubernetes.io/projected/7f86bb98-b2df-4776-97a4-7a45b69972b8-kube-api-access-hgjrs\") pod \"logging-loki-distributor-76cc67bf56-tl222\" (UID: \"7f86bb98-b2df-4776-97a4-7a45b69972b8\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.398411 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/7f86bb98-b2df-4776-97a4-7a45b69972b8-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-tl222\" (UID: 
\"7f86bb98-b2df-4776-97a4-7a45b69972b8\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.398501 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f86bb98-b2df-4776-97a4-7a45b69972b8-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-tl222\" (UID: \"7f86bb98-b2df-4776-97a4-7a45b69972b8\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.398615 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f86bb98-b2df-4776-97a4-7a45b69972b8-config\") pod \"logging-loki-distributor-76cc67bf56-tl222\" (UID: \"7f86bb98-b2df-4776-97a4-7a45b69972b8\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.398967 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/7f86bb98-b2df-4776-97a4-7a45b69972b8-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-tl222\" (UID: \"7f86bb98-b2df-4776-97a4-7a45b69972b8\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.430456 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"] Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.431634 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.435084 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.435392 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.435877 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.500500 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f86bb98-b2df-4776-97a4-7a45b69972b8-config\") pod \"logging-loki-distributor-76cc67bf56-tl222\" (UID: \"7f86bb98-b2df-4776-97a4-7a45b69972b8\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.500594 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/7f86bb98-b2df-4776-97a4-7a45b69972b8-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-tl222\" (UID: \"7f86bb98-b2df-4776-97a4-7a45b69972b8\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.500633 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgjrs\" (UniqueName: \"kubernetes.io/projected/7f86bb98-b2df-4776-97a4-7a45b69972b8-kube-api-access-hgjrs\") pod \"logging-loki-distributor-76cc67bf56-tl222\" (UID: \"7f86bb98-b2df-4776-97a4-7a45b69972b8\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222" Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.500655 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/7f86bb98-b2df-4776-97a4-7a45b69972b8-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-tl222\" (UID: \"7f86bb98-b2df-4776-97a4-7a45b69972b8\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.500683 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f86bb98-b2df-4776-97a4-7a45b69972b8-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-tl222\" (UID: \"7f86bb98-b2df-4776-97a4-7a45b69972b8\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.501821 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f86bb98-b2df-4776-97a4-7a45b69972b8-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-tl222\" (UID: \"7f86bb98-b2df-4776-97a4-7a45b69972b8\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.502828 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f86bb98-b2df-4776-97a4-7a45b69972b8-config\") pod \"logging-loki-distributor-76cc67bf56-tl222\" (UID: \"7f86bb98-b2df-4776-97a4-7a45b69972b8\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.519208 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/7f86bb98-b2df-4776-97a4-7a45b69972b8-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-tl222\" (UID: \"7f86bb98-b2df-4776-97a4-7a45b69972b8\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.519765 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/7f86bb98-b2df-4776-97a4-7a45b69972b8-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-tl222\" (UID: \"7f86bb98-b2df-4776-97a4-7a45b69972b8\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.529219 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgjrs\" (UniqueName: \"kubernetes.io/projected/7f86bb98-b2df-4776-97a4-7a45b69972b8-kube-api-access-hgjrs\") pod \"logging-loki-distributor-76cc67bf56-tl222\" (UID: \"7f86bb98-b2df-4776-97a4-7a45b69972b8\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.547936 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"]
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.554053 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.559423 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp"]
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.560655 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.567497 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.569364 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.598176 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp"]
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.601650 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx7jd\" (UniqueName: \"kubernetes.io/projected/b1de2d7e-c37c-4464-bd35-337650bd62bf-kube-api-access-tx7jd\") pod \"logging-loki-querier-5895d59bb8-2d5dl\" (UID: \"b1de2d7e-c37c-4464-bd35-337650bd62bf\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.601704 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1de2d7e-c37c-4464-bd35-337650bd62bf-config\") pod \"logging-loki-querier-5895d59bb8-2d5dl\" (UID: \"b1de2d7e-c37c-4464-bd35-337650bd62bf\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.601755 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/b1de2d7e-c37c-4464-bd35-337650bd62bf-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-2d5dl\" (UID: \"b1de2d7e-c37c-4464-bd35-337650bd62bf\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.601850 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1de2d7e-c37c-4464-bd35-337650bd62bf-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-2d5dl\" (UID: \"b1de2d7e-c37c-4464-bd35-337650bd62bf\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.601875 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/b1de2d7e-c37c-4464-bd35-337650bd62bf-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-2d5dl\" (UID: \"b1de2d7e-c37c-4464-bd35-337650bd62bf\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.601897 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b1de2d7e-c37c-4464-bd35-337650bd62bf-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-2d5dl\" (UID: \"b1de2d7e-c37c-4464-bd35-337650bd62bf\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.703120 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx7jd\" (UniqueName: \"kubernetes.io/projected/b1de2d7e-c37c-4464-bd35-337650bd62bf-kube-api-access-tx7jd\") pod \"logging-loki-querier-5895d59bb8-2d5dl\" (UID: \"b1de2d7e-c37c-4464-bd35-337650bd62bf\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.703196 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/316a18ce-1717-4e23-8750-17b4ec2e553c-config\") pod \"logging-loki-query-frontend-84558f7c9f-xwgzp\" (UID: \"316a18ce-1717-4e23-8750-17b4ec2e553c\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.703224 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1de2d7e-c37c-4464-bd35-337650bd62bf-config\") pod \"logging-loki-querier-5895d59bb8-2d5dl\" (UID: \"b1de2d7e-c37c-4464-bd35-337650bd62bf\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.703292 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/b1de2d7e-c37c-4464-bd35-337650bd62bf-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-2d5dl\" (UID: \"b1de2d7e-c37c-4464-bd35-337650bd62bf\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.703315 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/316a18ce-1717-4e23-8750-17b4ec2e553c-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-xwgzp\" (UID: \"316a18ce-1717-4e23-8750-17b4ec2e553c\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.703337 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/316a18ce-1717-4e23-8750-17b4ec2e553c-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-xwgzp\" (UID: \"316a18ce-1717-4e23-8750-17b4ec2e553c\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.703338 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"]
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.704927 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1de2d7e-c37c-4464-bd35-337650bd62bf-config\") pod \"logging-loki-querier-5895d59bb8-2d5dl\" (UID: \"b1de2d7e-c37c-4464-bd35-337650bd62bf\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.703361 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/316a18ce-1717-4e23-8750-17b4ec2e553c-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-xwgzp\" (UID: \"316a18ce-1717-4e23-8750-17b4ec2e553c\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.705023 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1de2d7e-c37c-4464-bd35-337650bd62bf-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-2d5dl\" (UID: \"b1de2d7e-c37c-4464-bd35-337650bd62bf\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.705051 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t9f5\" (UniqueName: \"kubernetes.io/projected/316a18ce-1717-4e23-8750-17b4ec2e553c-kube-api-access-2t9f5\") pod \"logging-loki-query-frontend-84558f7c9f-xwgzp\" (UID: \"316a18ce-1717-4e23-8750-17b4ec2e553c\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.705082 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/b1de2d7e-c37c-4464-bd35-337650bd62bf-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-2d5dl\" (UID: \"b1de2d7e-c37c-4464-bd35-337650bd62bf\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.705100 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.705108 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b1de2d7e-c37c-4464-bd35-337650bd62bf-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-2d5dl\" (UID: \"b1de2d7e-c37c-4464-bd35-337650bd62bf\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.708728 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/b1de2d7e-c37c-4464-bd35-337650bd62bf-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-2d5dl\" (UID: \"b1de2d7e-c37c-4464-bd35-337650bd62bf\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.709329 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1de2d7e-c37c-4464-bd35-337650bd62bf-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-2d5dl\" (UID: \"b1de2d7e-c37c-4464-bd35-337650bd62bf\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.709661 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b1de2d7e-c37c-4464-bd35-337650bd62bf-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-2d5dl\" (UID: \"b1de2d7e-c37c-4464-bd35-337650bd62bf\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.719469 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/b1de2d7e-c37c-4464-bd35-337650bd62bf-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-2d5dl\" (UID: \"b1de2d7e-c37c-4464-bd35-337650bd62bf\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.721733 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"]
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.727935 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.728754 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.730213 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx7jd\" (UniqueName: \"kubernetes.io/projected/b1de2d7e-c37c-4464-bd35-337650bd62bf-kube-api-access-tx7jd\") pod \"logging-loki-querier-5895d59bb8-2d5dl\" (UID: \"b1de2d7e-c37c-4464-bd35-337650bd62bf\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.728933 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.728997 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.729471 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.729685 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.738913 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-r2n2v"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.750159 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"]
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.750700 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.763110 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"]
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.812458 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-tls-secret\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.812522 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-tenants\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.812561 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/316a18ce-1717-4e23-8750-17b4ec2e553c-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-xwgzp\" (UID: \"316a18ce-1717-4e23-8750-17b4ec2e553c\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.812584 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/316a18ce-1717-4e23-8750-17b4ec2e553c-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-xwgzp\" (UID: \"316a18ce-1717-4e23-8750-17b4ec2e553c\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.812612 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/316a18ce-1717-4e23-8750-17b4ec2e553c-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-xwgzp\" (UID: \"316a18ce-1717-4e23-8750-17b4ec2e553c\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.812639 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-rbac\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.812679 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t9f5\" (UniqueName: \"kubernetes.io/projected/316a18ce-1717-4e23-8750-17b4ec2e553c-kube-api-access-2t9f5\") pod \"logging-loki-query-frontend-84558f7c9f-xwgzp\" (UID: \"316a18ce-1717-4e23-8750-17b4ec2e553c\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.812701 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.812723 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.812753 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-lokistack-gateway\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.812785 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjjpx\" (UniqueName: \"kubernetes.io/projected/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-kube-api-access-rjjpx\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.812822 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-logging-loki-ca-bundle\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.812840 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/316a18ce-1717-4e23-8750-17b4ec2e553c-config\") pod \"logging-loki-query-frontend-84558f7c9f-xwgzp\" (UID: \"316a18ce-1717-4e23-8750-17b4ec2e553c\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.813999 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/316a18ce-1717-4e23-8750-17b4ec2e553c-config\") pod \"logging-loki-query-frontend-84558f7c9f-xwgzp\" (UID: \"316a18ce-1717-4e23-8750-17b4ec2e553c\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.821929 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/316a18ce-1717-4e23-8750-17b4ec2e553c-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-xwgzp\" (UID: \"316a18ce-1717-4e23-8750-17b4ec2e553c\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.871705 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/316a18ce-1717-4e23-8750-17b4ec2e553c-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-xwgzp\" (UID: \"316a18ce-1717-4e23-8750-17b4ec2e553c\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.881323 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t9f5\" (UniqueName: \"kubernetes.io/projected/316a18ce-1717-4e23-8750-17b4ec2e553c-kube-api-access-2t9f5\") pod \"logging-loki-query-frontend-84558f7c9f-xwgzp\" (UID: \"316a18ce-1717-4e23-8750-17b4ec2e553c\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.884179 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/316a18ce-1717-4e23-8750-17b4ec2e553c-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-xwgzp\" (UID: \"316a18ce-1717-4e23-8750-17b4ec2e553c\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.918685 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rk7x\" (UniqueName: \"kubernetes.io/projected/92b9d963-d156-45ca-89fd-3f992d10d24e-kube-api-access-6rk7x\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.918760 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-logging-loki-ca-bundle\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.918795 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92b9d963-d156-45ca-89fd-3f992d10d24e-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.918826 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92b9d963-d156-45ca-89fd-3f992d10d24e-logging-loki-ca-bundle\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.918855 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/92b9d963-d156-45ca-89fd-3f992d10d24e-tls-secret\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.918891 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-tls-secret\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.918917 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-tenants\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.918956 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/92b9d963-d156-45ca-89fd-3f992d10d24e-lokistack-gateway\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.918974 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/92b9d963-d156-45ca-89fd-3f992d10d24e-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.918995 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-rbac\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.919040 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.919058 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.919082 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-lokistack-gateway\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.919105 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/92b9d963-d156-45ca-89fd-3f992d10d24e-rbac\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.919127 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjjpx\" (UniqueName: \"kubernetes.io/projected/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-kube-api-access-rjjpx\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.919145 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/92b9d963-d156-45ca-89fd-3f992d10d24e-tenants\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.920063 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-logging-loki-ca-bundle\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.928491 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-lokistack-gateway\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.929284 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.942748 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-rbac\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.943935 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-tenants\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.947912 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-tls-secret\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.961623 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp"
Dec 02 18:28:26 crc kubenswrapper[4878]: I1202 18:28:26.962321 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"
Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.027495 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rk7x\" (UniqueName: \"kubernetes.io/projected/92b9d963-d156-45ca-89fd-3f992d10d24e-kube-api-access-6rk7x\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.027579 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92b9d963-d156-45ca-89fd-3f992d10d24e-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.027741 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92b9d963-d156-45ca-89fd-3f992d10d24e-logging-loki-ca-bundle\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.027787 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/92b9d963-d156-45ca-89fd-3f992d10d24e-tls-secret\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.027856 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/92b9d963-d156-45ca-89fd-3f992d10d24e-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.027883 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/92b9d963-d156-45ca-89fd-3f992d10d24e-lokistack-gateway\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.027957 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/92b9d963-d156-45ca-89fd-3f992d10d24e-rbac\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.027994 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/92b9d963-d156-45ca-89fd-3f992d10d24e-tenants\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.032047 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92b9d963-d156-45ca-89fd-3f992d10d24e-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.033791 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/92b9d963-d156-45ca-89fd-3f992d10d24e-lokistack-gateway\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.040096 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/92b9d963-d156-45ca-89fd-3f992d10d24e-rbac\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.040435 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92b9d963-d156-45ca-89fd-3f992d10d24e-logging-loki-ca-bundle\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.044714 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/92b9d963-d156-45ca-89fd-3f992d10d24e-tenants\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"
Dec 02 18:28:27 
crc kubenswrapper[4878]: I1202 18:28:27.045190 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjjpx\" (UniqueName: \"kubernetes.io/projected/6bb468bd-bb35-4a4e-b41c-ed2a8f964d77-kube-api-access-rjjpx\") pod \"logging-loki-gateway-85bc84b7b8-89mkg\" (UID: \"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.045792 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/92b9d963-d156-45ca-89fd-3f992d10d24e-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.049592 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/92b9d963-d156-45ca-89fd-3f992d10d24e-tls-secret\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.073736 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rk7x\" (UniqueName: \"kubernetes.io/projected/92b9d963-d156-45ca-89fd-3f992d10d24e-kube-api-access-6rk7x\") pod \"logging-loki-gateway-85bc84b7b8-zjfmm\" (UID: \"92b9d963-d156-45ca-89fd-3f992d10d24e\") " pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.086472 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-tl222"] Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.089053 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.140176 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.289390 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222" event={"ID":"7f86bb98-b2df-4776-97a4-7a45b69972b8","Type":"ContainerStarted","Data":"b77588c0f1f66db0d455efbbc5719f646c28f29483df144d2744cbf05fb1a537"} Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.408893 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.409905 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.417133 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.418063 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.427856 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.493987 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-2d5dl"] Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.510409 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.511887 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.515079 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.517115 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.519016 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.548267 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/1a9f06ce-f976-42a6-9393-55ac1c7ca894-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.548325 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8de97e8f-dbba-4f7d-9ed6-72c5b564b54b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8de97e8f-dbba-4f7d-9ed6-72c5b564b54b\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.548360 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/1a9f06ce-f976-42a6-9393-55ac1c7ca894-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.548382 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-afd37ed3-60f5-46bd-b734-4fd7cd7e3b30\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-afd37ed3-60f5-46bd-b734-4fd7cd7e3b30\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.548465 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a9f06ce-f976-42a6-9393-55ac1c7ca894-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.548495 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/1a9f06ce-f976-42a6-9393-55ac1c7ca894-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.548531 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a9f06ce-f976-42a6-9393-55ac1c7ca894-config\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.548560 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7skfc\" (UniqueName: \"kubernetes.io/projected/1a9f06ce-f976-42a6-9393-55ac1c7ca894-kube-api-access-7skfc\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 
18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.624560 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm"] Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.630127 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp"] Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.649180 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.650630 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.654579 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8de97e8f-dbba-4f7d-9ed6-72c5b564b54b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8de97e8f-dbba-4f7d-9ed6-72c5b564b54b\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.654641 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/1a9f06ce-f976-42a6-9393-55ac1c7ca894-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.654675 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5cf523ea-33ef-495d-a19a-8221549d8969\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5cf523ea-33ef-495d-a19a-8221549d8969\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " 
pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.654694 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgxct\" (UniqueName: \"kubernetes.io/projected/3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a-kube-api-access-rgxct\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.654721 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-afd37ed3-60f5-46bd-b734-4fd7cd7e3b30\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-afd37ed3-60f5-46bd-b734-4fd7cd7e3b30\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.654741 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.654793 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.654844 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: 
\"kubernetes.io/secret/3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.654868 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a-config\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.654904 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a9f06ce-f976-42a6-9393-55ac1c7ca894-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.654940 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/1a9f06ce-f976-42a6-9393-55ac1c7ca894-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.654974 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a9f06ce-f976-42a6-9393-55ac1c7ca894-config\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.654997 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7skfc\" (UniqueName: 
\"kubernetes.io/projected/1a9f06ce-f976-42a6-9393-55ac1c7ca894-kube-api-access-7skfc\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.655024 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/1a9f06ce-f976-42a6-9393-55ac1c7ca894-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.655049 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.655492 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.655746 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.657438 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a9f06ce-f976-42a6-9393-55ac1c7ca894-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.662192 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1a9f06ce-f976-42a6-9393-55ac1c7ca894-config\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.666111 4878 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.666166 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-afd37ed3-60f5-46bd-b734-4fd7cd7e3b30\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-afd37ed3-60f5-46bd-b734-4fd7cd7e3b30\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4d2566d1953f9cbb1104e22bc11a50741db92a3a60d18e25007a9518a84350fd/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.666430 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/1a9f06ce-f976-42a6-9393-55ac1c7ca894-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.674671 4878 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.674745 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8de97e8f-dbba-4f7d-9ed6-72c5b564b54b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8de97e8f-dbba-4f7d-9ed6-72c5b564b54b\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f2f66e4dc1f4770f27dd8a6942a7b8fd1fcf2eb6af728d79dcf3eccea4f7e916/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.674868 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.677411 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/1a9f06ce-f976-42a6-9393-55ac1c7ca894-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.680944 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/1a9f06ce-f976-42a6-9393-55ac1c7ca894-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.682957 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7skfc\" (UniqueName: \"kubernetes.io/projected/1a9f06ce-f976-42a6-9393-55ac1c7ca894-kube-api-access-7skfc\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc 
kubenswrapper[4878]: I1202 18:28:27.708409 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-afd37ed3-60f5-46bd-b734-4fd7cd7e3b30\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-afd37ed3-60f5-46bd-b734-4fd7cd7e3b30\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.719223 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8de97e8f-dbba-4f7d-9ed6-72c5b564b54b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8de97e8f-dbba-4f7d-9ed6-72c5b564b54b\") pod \"logging-loki-ingester-0\" (UID: \"1a9f06ce-f976-42a6-9393-55ac1c7ca894\") " pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.750430 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.757087 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.757219 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.757474 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.757520 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5a075a7d-f57a-40d7-9a88-3f0ae7380b9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5a075a7d-f57a-40d7-9a88-3f0ae7380b9a\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.757555 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a-config\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.757609 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/ef6cf23f-200e-43b7-81ea-b13382391ad0-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.757642 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/ef6cf23f-200e-43b7-81ea-b13382391ad0-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 
18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.757690 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5cf523ea-33ef-495d-a19a-8221549d8969\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5cf523ea-33ef-495d-a19a-8221549d8969\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.757721 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgxct\" (UniqueName: \"kubernetes.io/projected/3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a-kube-api-access-rgxct\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.757773 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef6cf23f-200e-43b7-81ea-b13382391ad0-config\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.757811 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.757869 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef6cf23f-200e-43b7-81ea-b13382391ad0-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " 
pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.757905 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/ef6cf23f-200e-43b7-81ea-b13382391ad0-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.757945 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-774b7\" (UniqueName: \"kubernetes.io/projected/ef6cf23f-200e-43b7-81ea-b13382391ad0-kube-api-access-774b7\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.759623 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a-config\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.760008 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.763543 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " 
pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.763623 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg"] Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.763637 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.764434 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.781654 4878 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.781703 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5cf523ea-33ef-495d-a19a-8221549d8969\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5cf523ea-33ef-495d-a19a-8221549d8969\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6aa5262fa4612324b05024a843a66262632c449c820272fd84d1b1004340e43f/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.785575 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgxct\" (UniqueName: \"kubernetes.io/projected/3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a-kube-api-access-rgxct\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.817007 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5cf523ea-33ef-495d-a19a-8221549d8969\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5cf523ea-33ef-495d-a19a-8221549d8969\") pod \"logging-loki-compactor-0\" (UID: \"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a\") " pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.859853 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5a075a7d-f57a-40d7-9a88-3f0ae7380b9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5a075a7d-f57a-40d7-9a88-3f0ae7380b9a\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.859904 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/ef6cf23f-200e-43b7-81ea-b13382391ad0-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.859926 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/ef6cf23f-200e-43b7-81ea-b13382391ad0-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.859966 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef6cf23f-200e-43b7-81ea-b13382391ad0-config\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.860010 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef6cf23f-200e-43b7-81ea-b13382391ad0-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.860033 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/ef6cf23f-200e-43b7-81ea-b13382391ad0-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.860052 4878 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-774b7\" (UniqueName: \"kubernetes.io/projected/ef6cf23f-200e-43b7-81ea-b13382391ad0-kube-api-access-774b7\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.862197 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef6cf23f-200e-43b7-81ea-b13382391ad0-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.862974 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef6cf23f-200e-43b7-81ea-b13382391ad0-config\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.864644 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/ef6cf23f-200e-43b7-81ea-b13382391ad0-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.866281 4878 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.866312 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5a075a7d-f57a-40d7-9a88-3f0ae7380b9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5a075a7d-f57a-40d7-9a88-3f0ae7380b9a\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/73cbb9bc47e713ca18822c36bfeaa227be4850f5e626baf38a9d6fdf907ce1a9/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.869137 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/ef6cf23f-200e-43b7-81ea-b13382391ad0-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.877943 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/ef6cf23f-200e-43b7-81ea-b13382391ad0-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.880750 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.884117 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-774b7\" (UniqueName: \"kubernetes.io/projected/ef6cf23f-200e-43b7-81ea-b13382391ad0-kube-api-access-774b7\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:27 crc kubenswrapper[4878]: I1202 18:28:27.901061 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5a075a7d-f57a-40d7-9a88-3f0ae7380b9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5a075a7d-f57a-40d7-9a88-3f0ae7380b9a\") pod \"logging-loki-index-gateway-0\" (UID: \"ef6cf23f-200e-43b7-81ea-b13382391ad0\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:28 crc kubenswrapper[4878]: I1202 18:28:28.058967 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:28 crc kubenswrapper[4878]: I1202 18:28:28.150141 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 02 18:28:28 crc kubenswrapper[4878]: W1202 18:28:28.160784 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3abc051e_1bf7_4ac4_aa57_cb804ed0ff9a.slice/crio-de7479d437f94418cdcaf535ec8524793075797828daef55eca1e88fcd0d6c9c WatchSource:0}: Error finding container de7479d437f94418cdcaf535ec8524793075797828daef55eca1e88fcd0d6c9c: Status 404 returned error can't find the container with id de7479d437f94418cdcaf535ec8524793075797828daef55eca1e88fcd0d6c9c Dec 02 18:28:28 crc kubenswrapper[4878]: I1202 18:28:28.202867 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 02 18:28:28 crc kubenswrapper[4878]: W1202 18:28:28.208060 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a9f06ce_f976_42a6_9393_55ac1c7ca894.slice/crio-f509630334f9004425ba1df35daf929101b25216c804a9f3363066b965dc7e41 WatchSource:0}: Error finding container f509630334f9004425ba1df35daf929101b25216c804a9f3363066b965dc7e41: Status 404 returned error can't find the container with id f509630334f9004425ba1df35daf929101b25216c804a9f3363066b965dc7e41 Dec 02 18:28:28 crc kubenswrapper[4878]: I1202 18:28:28.295966 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 02 18:28:28 crc kubenswrapper[4878]: I1202 18:28:28.299715 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp" 
event={"ID":"316a18ce-1717-4e23-8750-17b4ec2e553c","Type":"ContainerStarted","Data":"ef618995473e3b903d4f30e22bc6708e1ba92000dde03e9c2350943a41bea805"} Dec 02 18:28:28 crc kubenswrapper[4878]: I1202 18:28:28.305063 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl" event={"ID":"b1de2d7e-c37c-4464-bd35-337650bd62bf","Type":"ContainerStarted","Data":"cd94eea6e4274e3c753cec048b8fe88f19c010ed3f7b9d504b22a41ffc73db42"} Dec 02 18:28:28 crc kubenswrapper[4878]: I1202 18:28:28.307470 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a","Type":"ContainerStarted","Data":"de7479d437f94418cdcaf535ec8524793075797828daef55eca1e88fcd0d6c9c"} Dec 02 18:28:28 crc kubenswrapper[4878]: I1202 18:28:28.308744 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg" event={"ID":"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77","Type":"ContainerStarted","Data":"5f9c796f820ed3960f55eaf18bbcc4a50fc054894315b8a4db29c0e3fb8c2319"} Dec 02 18:28:28 crc kubenswrapper[4878]: I1202 18:28:28.310023 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"1a9f06ce-f976-42a6-9393-55ac1c7ca894","Type":"ContainerStarted","Data":"f509630334f9004425ba1df35daf929101b25216c804a9f3363066b965dc7e41"} Dec 02 18:28:28 crc kubenswrapper[4878]: I1202 18:28:28.320789 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm" event={"ID":"92b9d963-d156-45ca-89fd-3f992d10d24e","Type":"ContainerStarted","Data":"21974a7ce2d06ac718253538008b6a67e7d8b6012b0310cf8e46b8808403cc0d"} Dec 02 18:28:29 crc kubenswrapper[4878]: I1202 18:28:29.340622 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" 
event={"ID":"ef6cf23f-200e-43b7-81ea-b13382391ad0","Type":"ContainerStarted","Data":"99f795751d3d7dfca58a77d21a6eba44e4e87e0d49eed89353001aee97c1e431"} Dec 02 18:28:32 crc kubenswrapper[4878]: I1202 18:28:32.368332 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl" event={"ID":"b1de2d7e-c37c-4464-bd35-337650bd62bf","Type":"ContainerStarted","Data":"82d848de40f995fcc71286b161607dfcdb26edcd59833f79b8a40fb7cec586f5"} Dec 02 18:28:32 crc kubenswrapper[4878]: I1202 18:28:32.369244 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl" Dec 02 18:28:32 crc kubenswrapper[4878]: I1202 18:28:32.371275 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a","Type":"ContainerStarted","Data":"85fb8beca5044726f757817b7e62cee4ef561373ad57436a0b66d462f6993c08"} Dec 02 18:28:32 crc kubenswrapper[4878]: I1202 18:28:32.372211 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:32 crc kubenswrapper[4878]: I1202 18:28:32.374357 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg" event={"ID":"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77","Type":"ContainerStarted","Data":"b56fb065185abaaa5bd926fccd7dfde5d90cd175f5a4bbafec613d2b7bf01987"} Dec 02 18:28:32 crc kubenswrapper[4878]: I1202 18:28:32.375793 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"1a9f06ce-f976-42a6-9393-55ac1c7ca894","Type":"ContainerStarted","Data":"4fcde7387f7ee5437ef9569c308b8a3c0d8e5ed2b8077390c53377375ec7ad36"} Dec 02 18:28:32 crc kubenswrapper[4878]: I1202 18:28:32.375953 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:28:32 crc kubenswrapper[4878]: I1202 18:28:32.378021 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"ef6cf23f-200e-43b7-81ea-b13382391ad0","Type":"ContainerStarted","Data":"50969b99f6c82613f57a4a8221cc8e761568471f1e30dd14be0487c151606b1b"} Dec 02 18:28:32 crc kubenswrapper[4878]: I1202 18:28:32.378231 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:32 crc kubenswrapper[4878]: I1202 18:28:32.379611 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm" event={"ID":"92b9d963-d156-45ca-89fd-3f992d10d24e","Type":"ContainerStarted","Data":"61469e0cc5dccfd5de8bd8208788a57d5fd01b8be6c9992eb0c71d35ac0a447a"} Dec 02 18:28:32 crc kubenswrapper[4878]: I1202 18:28:32.381454 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp" event={"ID":"316a18ce-1717-4e23-8750-17b4ec2e553c","Type":"ContainerStarted","Data":"d4bcc9bc97f7162625e14b18b5bb863549d73b7f1b0d1f0242d45deca93b1745"} Dec 02 18:28:32 crc kubenswrapper[4878]: I1202 18:28:32.381665 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp" Dec 02 18:28:32 crc kubenswrapper[4878]: I1202 18:28:32.383112 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222" event={"ID":"7f86bb98-b2df-4776-97a4-7a45b69972b8","Type":"ContainerStarted","Data":"d3030ca74addfef9e5a6eb7a9ea2a99d64567458537b9a874362451beb0e5d44"} Dec 02 18:28:32 crc kubenswrapper[4878]: I1202 18:28:32.383312 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222" Dec 02 18:28:32 crc 
kubenswrapper[4878]: I1202 18:28:32.389254 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl" podStartSLOduration=2.30579536 podStartE2EDuration="6.389228546s" podCreationTimestamp="2025-12-02 18:28:26 +0000 UTC" firstStartedPulling="2025-12-02 18:28:27.494455697 +0000 UTC m=+817.184074578" lastFinishedPulling="2025-12-02 18:28:31.577888883 +0000 UTC m=+821.267507764" observedRunningTime="2025-12-02 18:28:32.387971817 +0000 UTC m=+822.077590688" watchObservedRunningTime="2025-12-02 18:28:32.389228546 +0000 UTC m=+822.078847427" Dec 02 18:28:32 crc kubenswrapper[4878]: I1202 18:28:32.413791 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp" podStartSLOduration=2.448073123 podStartE2EDuration="6.413762734s" podCreationTimestamp="2025-12-02 18:28:26 +0000 UTC" firstStartedPulling="2025-12-02 18:28:27.654872707 +0000 UTC m=+817.344491588" lastFinishedPulling="2025-12-02 18:28:31.620562318 +0000 UTC m=+821.310181199" observedRunningTime="2025-12-02 18:28:32.412169234 +0000 UTC m=+822.101788125" watchObservedRunningTime="2025-12-02 18:28:32.413762734 +0000 UTC m=+822.103381615" Dec 02 18:28:32 crc kubenswrapper[4878]: I1202 18:28:32.432924 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.08412163 podStartE2EDuration="6.432894823s" podCreationTimestamp="2025-12-02 18:28:26 +0000 UTC" firstStartedPulling="2025-12-02 18:28:28.304774248 +0000 UTC m=+817.994393129" lastFinishedPulling="2025-12-02 18:28:31.653547441 +0000 UTC m=+821.343166322" observedRunningTime="2025-12-02 18:28:32.431819929 +0000 UTC m=+822.121438810" watchObservedRunningTime="2025-12-02 18:28:32.432894823 +0000 UTC m=+822.122513694" Dec 02 18:28:32 crc kubenswrapper[4878]: I1202 18:28:32.454841 4878 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=3.000815554 podStartE2EDuration="6.454809699s" podCreationTimestamp="2025-12-02 18:28:26 +0000 UTC" firstStartedPulling="2025-12-02 18:28:28.165461188 +0000 UTC m=+817.855080069" lastFinishedPulling="2025-12-02 18:28:31.619455333 +0000 UTC m=+821.309074214" observedRunningTime="2025-12-02 18:28:32.450682439 +0000 UTC m=+822.140301320" watchObservedRunningTime="2025-12-02 18:28:32.454809699 +0000 UTC m=+822.144428580" Dec 02 18:28:32 crc kubenswrapper[4878]: I1202 18:28:32.469405 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222" podStartSLOduration=1.93896308 podStartE2EDuration="6.469370964s" podCreationTimestamp="2025-12-02 18:28:26 +0000 UTC" firstStartedPulling="2025-12-02 18:28:27.124620302 +0000 UTC m=+816.814239183" lastFinishedPulling="2025-12-02 18:28:31.655028186 +0000 UTC m=+821.344647067" observedRunningTime="2025-12-02 18:28:32.467090033 +0000 UTC m=+822.156708924" watchObservedRunningTime="2025-12-02 18:28:32.469370964 +0000 UTC m=+822.158989845" Dec 02 18:28:32 crc kubenswrapper[4878]: I1202 18:28:32.491386 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.080336042 podStartE2EDuration="6.491356533s" podCreationTimestamp="2025-12-02 18:28:26 +0000 UTC" firstStartedPulling="2025-12-02 18:28:28.211294672 +0000 UTC m=+817.900913553" lastFinishedPulling="2025-12-02 18:28:31.622315173 +0000 UTC m=+821.311934044" observedRunningTime="2025-12-02 18:28:32.489545386 +0000 UTC m=+822.179164277" watchObservedRunningTime="2025-12-02 18:28:32.491356533 +0000 UTC m=+822.180975434" Dec 02 18:28:35 crc kubenswrapper[4878]: I1202 18:28:35.407691 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm" 
event={"ID":"92b9d963-d156-45ca-89fd-3f992d10d24e","Type":"ContainerStarted","Data":"295fc0e2bb06b45c9bbebccded220e574ca87175b8fe537dd43a4bf3e476ab0a"} Dec 02 18:28:35 crc kubenswrapper[4878]: I1202 18:28:35.408662 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm" Dec 02 18:28:35 crc kubenswrapper[4878]: I1202 18:28:35.411577 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg" event={"ID":"6bb468bd-bb35-4a4e-b41c-ed2a8f964d77","Type":"ContainerStarted","Data":"324d1aae4c169852d4198ebda97060552ff18c7df2e515deb70d915f8b650142"} Dec 02 18:28:35 crc kubenswrapper[4878]: I1202 18:28:35.413348 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg" Dec 02 18:28:35 crc kubenswrapper[4878]: I1202 18:28:35.413374 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg" Dec 02 18:28:35 crc kubenswrapper[4878]: I1202 18:28:35.445503 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm" Dec 02 18:28:35 crc kubenswrapper[4878]: I1202 18:28:35.446114 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg" Dec 02 18:28:35 crc kubenswrapper[4878]: I1202 18:28:35.448392 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg" Dec 02 18:28:35 crc kubenswrapper[4878]: I1202 18:28:35.452428 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm" podStartSLOduration=2.772541109 podStartE2EDuration="9.452406719s" podCreationTimestamp="2025-12-02 18:28:26 +0000 UTC" 
firstStartedPulling="2025-12-02 18:28:27.64951444 +0000 UTC m=+817.339133311" lastFinishedPulling="2025-12-02 18:28:34.32938004 +0000 UTC m=+824.018998921" observedRunningTime="2025-12-02 18:28:35.434293242 +0000 UTC m=+825.123912133" watchObservedRunningTime="2025-12-02 18:28:35.452406719 +0000 UTC m=+825.142025600" Dec 02 18:28:35 crc kubenswrapper[4878]: I1202 18:28:35.488168 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-85bc84b7b8-89mkg" podStartSLOduration=2.947789844 podStartE2EDuration="9.488137967s" podCreationTimestamp="2025-12-02 18:28:26 +0000 UTC" firstStartedPulling="2025-12-02 18:28:27.782782851 +0000 UTC m=+817.472401732" lastFinishedPulling="2025-12-02 18:28:34.323130974 +0000 UTC m=+824.012749855" observedRunningTime="2025-12-02 18:28:35.466995206 +0000 UTC m=+825.156614077" watchObservedRunningTime="2025-12-02 18:28:35.488137967 +0000 UTC m=+825.177756848" Dec 02 18:28:36 crc kubenswrapper[4878]: I1202 18:28:36.420663 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm" Dec 02 18:28:36 crc kubenswrapper[4878]: I1202 18:28:36.430922 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-85bc84b7b8-zjfmm" Dec 02 18:28:47 crc kubenswrapper[4878]: I1202 18:28:47.757274 4878 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Dec 02 18:28:47 crc kubenswrapper[4878]: I1202 18:28:47.758102 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="1a9f06ce-f976-42a6-9393-55ac1c7ca894" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 02 
18:28:47 crc kubenswrapper[4878]: I1202 18:28:47.893389 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Dec 02 18:28:48 crc kubenswrapper[4878]: I1202 18:28:48.175477 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Dec 02 18:28:53 crc kubenswrapper[4878]: I1202 18:28:53.742286 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:28:53 crc kubenswrapper[4878]: I1202 18:28:53.743066 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:28:53 crc kubenswrapper[4878]: I1202 18:28:53.743140 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:28:53 crc kubenswrapper[4878]: I1202 18:28:53.744350 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b46b425128f8cb8574d53391fe3090841c533ef0911e243412874ecbe8a5c8b9"} pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 18:28:53 crc kubenswrapper[4878]: I1202 18:28:53.744449 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" 
podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://b46b425128f8cb8574d53391fe3090841c533ef0911e243412874ecbe8a5c8b9" gracePeriod=600 Dec 02 18:28:54 crc kubenswrapper[4878]: I1202 18:28:54.040172 4878 generic.go:334] "Generic (PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" containerID="b46b425128f8cb8574d53391fe3090841c533ef0911e243412874ecbe8a5c8b9" exitCode=0 Dec 02 18:28:54 crc kubenswrapper[4878]: I1202 18:28:54.040298 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"b46b425128f8cb8574d53391fe3090841c533ef0911e243412874ecbe8a5c8b9"} Dec 02 18:28:54 crc kubenswrapper[4878]: I1202 18:28:54.040852 4878 scope.go:117] "RemoveContainer" containerID="3a28b486a2e75984b9b969c8ed5539eb39300fac74b0f7f88830322d2c2039ac" Dec 02 18:28:55 crc kubenswrapper[4878]: I1202 18:28:55.055497 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"e692d5eeca0be5391ffb074305f4ee4fcb35693cb015b4c1a01e012767df5a57"} Dec 02 18:28:56 crc kubenswrapper[4878]: I1202 18:28:56.567084 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-76cc67bf56-tl222" Dec 02 18:28:56 crc kubenswrapper[4878]: I1202 18:28:56.770943 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-5895d59bb8-2d5dl" Dec 02 18:28:56 crc kubenswrapper[4878]: I1202 18:28:56.969317 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-xwgzp" Dec 02 18:28:57 crc kubenswrapper[4878]: I1202 18:28:57.758207 4878 patch_prober.go:28] 
interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Dec 02 18:28:57 crc kubenswrapper[4878]: I1202 18:28:57.758305 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="1a9f06ce-f976-42a6-9393-55ac1c7ca894" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 02 18:29:07 crc kubenswrapper[4878]: I1202 18:29:07.761778 4878 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Dec 02 18:29:07 crc kubenswrapper[4878]: I1202 18:29:07.762868 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="1a9f06ce-f976-42a6-9393-55ac1c7ca894" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 02 18:29:17 crc kubenswrapper[4878]: I1202 18:29:17.757204 4878 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Dec 02 18:29:17 crc kubenswrapper[4878]: I1202 18:29:17.758181 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="1a9f06ce-f976-42a6-9393-55ac1c7ca894" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 02 18:29:27 crc kubenswrapper[4878]: I1202 18:29:27.640847 4878 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-wmq6c"] Dec 02 18:29:27 crc kubenswrapper[4878]: I1202 18:29:27.644986 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmq6c" Dec 02 18:29:27 crc kubenswrapper[4878]: I1202 18:29:27.656310 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmq6c"] Dec 02 18:29:27 crc kubenswrapper[4878]: I1202 18:29:27.737443 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3-utilities\") pod \"community-operators-wmq6c\" (UID: \"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3\") " pod="openshift-marketplace/community-operators-wmq6c" Dec 02 18:29:27 crc kubenswrapper[4878]: I1202 18:29:27.737544 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4n6h\" (UniqueName: \"kubernetes.io/projected/8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3-kube-api-access-j4n6h\") pod \"community-operators-wmq6c\" (UID: \"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3\") " pod="openshift-marketplace/community-operators-wmq6c" Dec 02 18:29:27 crc kubenswrapper[4878]: I1202 18:29:27.737907 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3-catalog-content\") pod \"community-operators-wmq6c\" (UID: \"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3\") " pod="openshift-marketplace/community-operators-wmq6c" Dec 02 18:29:27 crc kubenswrapper[4878]: I1202 18:29:27.757273 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Dec 02 18:29:27 crc kubenswrapper[4878]: I1202 18:29:27.839039 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3-catalog-content\") pod \"community-operators-wmq6c\" (UID: \"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3\") " pod="openshift-marketplace/community-operators-wmq6c" Dec 02 18:29:27 crc kubenswrapper[4878]: I1202 18:29:27.839142 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3-utilities\") pod \"community-operators-wmq6c\" (UID: \"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3\") " pod="openshift-marketplace/community-operators-wmq6c" Dec 02 18:29:27 crc kubenswrapper[4878]: I1202 18:29:27.839173 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4n6h\" (UniqueName: \"kubernetes.io/projected/8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3-kube-api-access-j4n6h\") pod \"community-operators-wmq6c\" (UID: \"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3\") " pod="openshift-marketplace/community-operators-wmq6c" Dec 02 18:29:27 crc kubenswrapper[4878]: I1202 18:29:27.839955 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3-catalog-content\") pod \"community-operators-wmq6c\" (UID: \"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3\") " pod="openshift-marketplace/community-operators-wmq6c" Dec 02 18:29:27 crc kubenswrapper[4878]: I1202 18:29:27.839976 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3-utilities\") pod \"community-operators-wmq6c\" (UID: \"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3\") " pod="openshift-marketplace/community-operators-wmq6c" Dec 02 18:29:27 crc kubenswrapper[4878]: I1202 18:29:27.868368 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4n6h\" (UniqueName: 
\"kubernetes.io/projected/8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3-kube-api-access-j4n6h\") pod \"community-operators-wmq6c\" (UID: \"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3\") " pod="openshift-marketplace/community-operators-wmq6c" Dec 02 18:29:27 crc kubenswrapper[4878]: I1202 18:29:27.990661 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmq6c" Dec 02 18:29:28 crc kubenswrapper[4878]: I1202 18:29:28.300508 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmq6c"] Dec 02 18:29:28 crc kubenswrapper[4878]: I1202 18:29:28.363085 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmq6c" event={"ID":"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3","Type":"ContainerStarted","Data":"3c28913402c88c95dded66179b67e43221a09b78f0b15ec57663176aabb10299"} Dec 02 18:29:29 crc kubenswrapper[4878]: I1202 18:29:29.370880 4878 generic.go:334] "Generic (PLEG): container finished" podID="8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3" containerID="260bb1a76d73043fc6dc621976c111903d60d6ab07fa3ef8104f1d9038c623a0" exitCode=0 Dec 02 18:29:29 crc kubenswrapper[4878]: I1202 18:29:29.370937 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmq6c" event={"ID":"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3","Type":"ContainerDied","Data":"260bb1a76d73043fc6dc621976c111903d60d6ab07fa3ef8104f1d9038c623a0"} Dec 02 18:29:30 crc kubenswrapper[4878]: I1202 18:29:30.380965 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmq6c" event={"ID":"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3","Type":"ContainerStarted","Data":"83b9307456917b8a9340f8148ea78feef6c379f98659be8d12a28616dca440b9"} Dec 02 18:29:31 crc kubenswrapper[4878]: I1202 18:29:31.395716 4878 generic.go:334] "Generic (PLEG): container finished" podID="8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3" 
containerID="83b9307456917b8a9340f8148ea78feef6c379f98659be8d12a28616dca440b9" exitCode=0 Dec 02 18:29:31 crc kubenswrapper[4878]: I1202 18:29:31.395796 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmq6c" event={"ID":"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3","Type":"ContainerDied","Data":"83b9307456917b8a9340f8148ea78feef6c379f98659be8d12a28616dca440b9"} Dec 02 18:29:32 crc kubenswrapper[4878]: I1202 18:29:32.406822 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmq6c" event={"ID":"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3","Type":"ContainerStarted","Data":"608335aebb640293df0ddfe0b3c6c50c9eac4f265929f2c3aab3a9dc529407ac"} Dec 02 18:29:32 crc kubenswrapper[4878]: I1202 18:29:32.437988 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wmq6c" podStartSLOduration=2.962064314 podStartE2EDuration="5.437966735s" podCreationTimestamp="2025-12-02 18:29:27 +0000 UTC" firstStartedPulling="2025-12-02 18:29:29.372580203 +0000 UTC m=+879.062199094" lastFinishedPulling="2025-12-02 18:29:31.848482624 +0000 UTC m=+881.538101515" observedRunningTime="2025-12-02 18:29:32.431364318 +0000 UTC m=+882.120983209" watchObservedRunningTime="2025-12-02 18:29:32.437966735 +0000 UTC m=+882.127585616" Dec 02 18:29:37 crc kubenswrapper[4878]: I1202 18:29:37.991439 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wmq6c" Dec 02 18:29:37 crc kubenswrapper[4878]: I1202 18:29:37.992321 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wmq6c" Dec 02 18:29:38 crc kubenswrapper[4878]: I1202 18:29:38.055464 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wmq6c" Dec 02 18:29:38 crc kubenswrapper[4878]: I1202 18:29:38.515077 
4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wmq6c" Dec 02 18:29:38 crc kubenswrapper[4878]: I1202 18:29:38.574682 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wmq6c"] Dec 02 18:29:40 crc kubenswrapper[4878]: I1202 18:29:40.473764 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wmq6c" podUID="8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3" containerName="registry-server" containerID="cri-o://608335aebb640293df0ddfe0b3c6c50c9eac4f265929f2c3aab3a9dc529407ac" gracePeriod=2 Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.434489 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmq6c" Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.487026 4878 generic.go:334] "Generic (PLEG): container finished" podID="8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3" containerID="608335aebb640293df0ddfe0b3c6c50c9eac4f265929f2c3aab3a9dc529407ac" exitCode=0 Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.487079 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmq6c" event={"ID":"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3","Type":"ContainerDied","Data":"608335aebb640293df0ddfe0b3c6c50c9eac4f265929f2c3aab3a9dc529407ac"} Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.487110 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmq6c" event={"ID":"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3","Type":"ContainerDied","Data":"3c28913402c88c95dded66179b67e43221a09b78f0b15ec57663176aabb10299"} Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.487130 4878 scope.go:117] "RemoveContainer" containerID="608335aebb640293df0ddfe0b3c6c50c9eac4f265929f2c3aab3a9dc529407ac" Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 
18:29:41.487302 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmq6c" Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.530909 4878 scope.go:117] "RemoveContainer" containerID="83b9307456917b8a9340f8148ea78feef6c379f98659be8d12a28616dca440b9" Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.580544 4878 scope.go:117] "RemoveContainer" containerID="260bb1a76d73043fc6dc621976c111903d60d6ab07fa3ef8104f1d9038c623a0" Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.595560 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3-catalog-content\") pod \"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3\" (UID: \"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3\") " Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.595691 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4n6h\" (UniqueName: \"kubernetes.io/projected/8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3-kube-api-access-j4n6h\") pod \"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3\" (UID: \"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3\") " Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.595804 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3-utilities\") pod \"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3\" (UID: \"8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3\") " Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.596921 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3-utilities" (OuterVolumeSpecName: "utilities") pod "8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3" (UID: "8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.605286 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3-kube-api-access-j4n6h" (OuterVolumeSpecName: "kube-api-access-j4n6h") pod "8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3" (UID: "8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3"). InnerVolumeSpecName "kube-api-access-j4n6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.618466 4878 scope.go:117] "RemoveContainer" containerID="608335aebb640293df0ddfe0b3c6c50c9eac4f265929f2c3aab3a9dc529407ac" Dec 02 18:29:41 crc kubenswrapper[4878]: E1202 18:29:41.622299 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"608335aebb640293df0ddfe0b3c6c50c9eac4f265929f2c3aab3a9dc529407ac\": container with ID starting with 608335aebb640293df0ddfe0b3c6c50c9eac4f265929f2c3aab3a9dc529407ac not found: ID does not exist" containerID="608335aebb640293df0ddfe0b3c6c50c9eac4f265929f2c3aab3a9dc529407ac" Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.622354 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"608335aebb640293df0ddfe0b3c6c50c9eac4f265929f2c3aab3a9dc529407ac"} err="failed to get container status \"608335aebb640293df0ddfe0b3c6c50c9eac4f265929f2c3aab3a9dc529407ac\": rpc error: code = NotFound desc = could not find container \"608335aebb640293df0ddfe0b3c6c50c9eac4f265929f2c3aab3a9dc529407ac\": container with ID starting with 608335aebb640293df0ddfe0b3c6c50c9eac4f265929f2c3aab3a9dc529407ac not found: ID does not exist" Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.622381 4878 scope.go:117] "RemoveContainer" containerID="83b9307456917b8a9340f8148ea78feef6c379f98659be8d12a28616dca440b9" Dec 02 18:29:41 crc kubenswrapper[4878]: E1202 18:29:41.622818 
4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83b9307456917b8a9340f8148ea78feef6c379f98659be8d12a28616dca440b9\": container with ID starting with 83b9307456917b8a9340f8148ea78feef6c379f98659be8d12a28616dca440b9 not found: ID does not exist" containerID="83b9307456917b8a9340f8148ea78feef6c379f98659be8d12a28616dca440b9" Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.622864 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83b9307456917b8a9340f8148ea78feef6c379f98659be8d12a28616dca440b9"} err="failed to get container status \"83b9307456917b8a9340f8148ea78feef6c379f98659be8d12a28616dca440b9\": rpc error: code = NotFound desc = could not find container \"83b9307456917b8a9340f8148ea78feef6c379f98659be8d12a28616dca440b9\": container with ID starting with 83b9307456917b8a9340f8148ea78feef6c379f98659be8d12a28616dca440b9 not found: ID does not exist" Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.622880 4878 scope.go:117] "RemoveContainer" containerID="260bb1a76d73043fc6dc621976c111903d60d6ab07fa3ef8104f1d9038c623a0" Dec 02 18:29:41 crc kubenswrapper[4878]: E1202 18:29:41.624279 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"260bb1a76d73043fc6dc621976c111903d60d6ab07fa3ef8104f1d9038c623a0\": container with ID starting with 260bb1a76d73043fc6dc621976c111903d60d6ab07fa3ef8104f1d9038c623a0 not found: ID does not exist" containerID="260bb1a76d73043fc6dc621976c111903d60d6ab07fa3ef8104f1d9038c623a0" Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.624348 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"260bb1a76d73043fc6dc621976c111903d60d6ab07fa3ef8104f1d9038c623a0"} err="failed to get container status \"260bb1a76d73043fc6dc621976c111903d60d6ab07fa3ef8104f1d9038c623a0\": rpc error: code = 
NotFound desc = could not find container \"260bb1a76d73043fc6dc621976c111903d60d6ab07fa3ef8104f1d9038c623a0\": container with ID starting with 260bb1a76d73043fc6dc621976c111903d60d6ab07fa3ef8104f1d9038c623a0 not found: ID does not exist" Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.650473 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3" (UID: "8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.698220 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.698269 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.698280 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4n6h\" (UniqueName: \"kubernetes.io/projected/8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3-kube-api-access-j4n6h\") on node \"crc\" DevicePath \"\"" Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.823802 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wmq6c"] Dec 02 18:29:41 crc kubenswrapper[4878]: I1202 18:29:41.830319 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wmq6c"] Dec 02 18:29:42 crc kubenswrapper[4878]: I1202 18:29:42.949262 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3" path="/var/lib/kubelet/pods/8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3/volumes" Dec 02 18:29:47 crc kubenswrapper[4878]: I1202 18:29:47.882044 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-mpvkj"] Dec 02 18:29:47 crc kubenswrapper[4878]: E1202 18:29:47.890621 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3" containerName="registry-server" Dec 02 18:29:47 crc kubenswrapper[4878]: I1202 18:29:47.890643 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3" containerName="registry-server" Dec 02 18:29:47 crc kubenswrapper[4878]: E1202 18:29:47.890654 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3" containerName="extract-utilities" Dec 02 18:29:47 crc kubenswrapper[4878]: I1202 18:29:47.890661 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3" containerName="extract-utilities" Dec 02 18:29:47 crc kubenswrapper[4878]: E1202 18:29:47.890682 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3" containerName="extract-content" Dec 02 18:29:47 crc kubenswrapper[4878]: I1202 18:29:47.890689 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3" containerName="extract-content" Dec 02 18:29:47 crc kubenswrapper[4878]: I1202 18:29:47.890809 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dbc8ec9-b2f9-4dd8-99c3-e512962cddc3" containerName="registry-server" Dec 02 18:29:47 crc kubenswrapper[4878]: I1202 18:29:47.891414 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-mpvkj" Dec 02 18:29:47 crc kubenswrapper[4878]: I1202 18:29:47.895252 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Dec 02 18:29:47 crc kubenswrapper[4878]: I1202 18:29:47.896759 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Dec 02 18:29:47 crc kubenswrapper[4878]: I1202 18:29:47.897474 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Dec 02 18:29:47 crc kubenswrapper[4878]: I1202 18:29:47.897659 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Dec 02 18:29:47 crc kubenswrapper[4878]: I1202 18:29:47.898206 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-ncqkj" Dec 02 18:29:47 crc kubenswrapper[4878]: I1202 18:29:47.915525 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Dec 02 18:29:47 crc kubenswrapper[4878]: I1202 18:29:47.919828 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-mpvkj"] Dec 02 18:29:47 crc kubenswrapper[4878]: I1202 18:29:47.975608 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-mpvkj"] Dec 02 18:29:47 crc kubenswrapper[4878]: E1202 18:29:47.976462 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-mnt5q metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-mpvkj" podUID="c2ba9fc3-f6cd-43a9-b70f-bac324a726a5" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.019952 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-tmp\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.020024 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-entrypoint\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.020069 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-collector-token\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.020102 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-sa-token\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.020126 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-datadir\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.020163 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnt5q\" 
(UniqueName: \"kubernetes.io/projected/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-kube-api-access-mnt5q\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.020196 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-trusted-ca\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.020263 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-config-openshift-service-cacrt\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.020292 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-collector-syslog-receiver\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.020772 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-metrics\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.020881 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-config\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.122314 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-tmp\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.122394 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-entrypoint\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.122433 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-collector-token\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.122468 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-sa-token\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.122494 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-datadir\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 
18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.122520 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnt5q\" (UniqueName: \"kubernetes.io/projected/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-kube-api-access-mnt5q\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.122570 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-trusted-ca\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.122614 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-config-openshift-service-cacrt\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.122642 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-collector-syslog-receiver\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.122688 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-metrics\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.122716 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-config\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.123753 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-datadir\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.123815 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-config\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.124911 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-trusted-ca\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.125885 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-config-openshift-service-cacrt\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.126773 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-entrypoint\") pod \"collector-mpvkj\" (UID: 
\"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.130835 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-metrics\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.141140 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-collector-token\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.141163 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-collector-syslog-receiver\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.142824 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnt5q\" (UniqueName: \"kubernetes.io/projected/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-kube-api-access-mnt5q\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.130522 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-tmp\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.160631 4878 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-sa-token\") pod \"collector-mpvkj\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.553534 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.565526 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-mpvkj" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.735626 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-metrics\") pod \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.735719 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-sa-token\") pod \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.735751 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-collector-token\") pod \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.735790 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnt5q\" (UniqueName: \"kubernetes.io/projected/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-kube-api-access-mnt5q\") pod \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\" (UID: 
\"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.735869 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-trusted-ca\") pod \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.735939 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-datadir\") pod \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.735982 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-entrypoint\") pod \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.736026 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-collector-syslog-receiver\") pod \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.736108 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-tmp\") pod \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.736159 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-config\") pod \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.736207 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-config-openshift-service-cacrt\") pod \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\" (UID: \"c2ba9fc3-f6cd-43a9-b70f-bac324a726a5\") " Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.736503 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5" (UID: "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.736573 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-datadir" (OuterVolumeSpecName: "datadir") pod "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5" (UID: "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5"). InnerVolumeSpecName "datadir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.736659 4878 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.736678 4878 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-datadir\") on node \"crc\" DevicePath \"\"" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.736898 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5" (UID: "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.737289 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-config" (OuterVolumeSpecName: "config") pod "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5" (UID: "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.737356 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5" (UID: "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5"). InnerVolumeSpecName "config-openshift-service-cacrt". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.740096 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-kube-api-access-mnt5q" (OuterVolumeSpecName: "kube-api-access-mnt5q") pod "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5" (UID: "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5"). InnerVolumeSpecName "kube-api-access-mnt5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.740522 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-collector-token" (OuterVolumeSpecName: "collector-token") pod "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5" (UID: "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.740571 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-metrics" (OuterVolumeSpecName: "metrics") pod "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5" (UID: "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.740622 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-sa-token" (OuterVolumeSpecName: "sa-token") pod "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5" (UID: "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5"). InnerVolumeSpecName "sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.741461 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5" (UID: "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.743954 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-tmp" (OuterVolumeSpecName: "tmp") pod "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5" (UID: "c2ba9fc3-f6cd-43a9-b70f-bac324a726a5"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.838795 4878 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-entrypoint\") on node \"crc\" DevicePath \"\"" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.838846 4878 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.838863 4878 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-tmp\") on node \"crc\" DevicePath \"\"" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.838874 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:29:48 crc 
kubenswrapper[4878]: I1202 18:29:48.838889 4878 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.838903 4878 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-metrics\") on node \"crc\" DevicePath \"\"" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.838920 4878 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.838931 4878 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-collector-token\") on node \"crc\" DevicePath \"\"" Dec 02 18:29:48 crc kubenswrapper[4878]: I1202 18:29:48.838946 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnt5q\" (UniqueName: \"kubernetes.io/projected/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5-kube-api-access-mnt5q\") on node \"crc\" DevicePath \"\"" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.561706 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-mpvkj" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.661657 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-mpvkj"] Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.673173 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-jjcjt"] Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.678561 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.680658 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-mpvkj"] Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.683275 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.683933 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.684847 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-ncqkj" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.685531 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.685688 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.695188 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.700750 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-jjcjt"] Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.859425 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/089919f9-be76-4956-a7d3-92fa66aa19ef-entrypoint\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.859493 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: 
\"kubernetes.io/host-path/089919f9-be76-4956-a7d3-92fa66aa19ef-datadir\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.859520 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/089919f9-be76-4956-a7d3-92fa66aa19ef-config-openshift-service-cacrt\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.859548 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/089919f9-be76-4956-a7d3-92fa66aa19ef-collector-syslog-receiver\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.859798 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/089919f9-be76-4956-a7d3-92fa66aa19ef-metrics\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.860011 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grmvl\" (UniqueName: \"kubernetes.io/projected/089919f9-be76-4956-a7d3-92fa66aa19ef-kube-api-access-grmvl\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.860102 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/089919f9-be76-4956-a7d3-92fa66aa19ef-trusted-ca\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.860232 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/089919f9-be76-4956-a7d3-92fa66aa19ef-sa-token\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.860342 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/089919f9-be76-4956-a7d3-92fa66aa19ef-tmp\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.860377 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/089919f9-be76-4956-a7d3-92fa66aa19ef-collector-token\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.860404 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/089919f9-be76-4956-a7d3-92fa66aa19ef-config\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.961801 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/089919f9-be76-4956-a7d3-92fa66aa19ef-entrypoint\") pod \"collector-jjcjt\" (UID: 
\"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.961891 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/089919f9-be76-4956-a7d3-92fa66aa19ef-datadir\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.961956 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/089919f9-be76-4956-a7d3-92fa66aa19ef-config-openshift-service-cacrt\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.962004 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/089919f9-be76-4956-a7d3-92fa66aa19ef-collector-syslog-receiver\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.962034 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/089919f9-be76-4956-a7d3-92fa66aa19ef-metrics\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.962121 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grmvl\" (UniqueName: \"kubernetes.io/projected/089919f9-be76-4956-a7d3-92fa66aa19ef-kube-api-access-grmvl\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc 
kubenswrapper[4878]: I1202 18:29:49.962150 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/089919f9-be76-4956-a7d3-92fa66aa19ef-trusted-ca\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.962168 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/089919f9-be76-4956-a7d3-92fa66aa19ef-datadir\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.962417 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/089919f9-be76-4956-a7d3-92fa66aa19ef-sa-token\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.962501 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/089919f9-be76-4956-a7d3-92fa66aa19ef-tmp\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.962529 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/089919f9-be76-4956-a7d3-92fa66aa19ef-collector-token\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.962577 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/089919f9-be76-4956-a7d3-92fa66aa19ef-config\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.963157 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/089919f9-be76-4956-a7d3-92fa66aa19ef-entrypoint\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.963306 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/089919f9-be76-4956-a7d3-92fa66aa19ef-config-openshift-service-cacrt\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.963838 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/089919f9-be76-4956-a7d3-92fa66aa19ef-config\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.964826 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/089919f9-be76-4956-a7d3-92fa66aa19ef-trusted-ca\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.970127 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/089919f9-be76-4956-a7d3-92fa66aa19ef-metrics\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 
18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.970144 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/089919f9-be76-4956-a7d3-92fa66aa19ef-collector-token\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.972598 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/089919f9-be76-4956-a7d3-92fa66aa19ef-tmp\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.974838 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/089919f9-be76-4956-a7d3-92fa66aa19ef-collector-syslog-receiver\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.980718 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/089919f9-be76-4956-a7d3-92fa66aa19ef-sa-token\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:49 crc kubenswrapper[4878]: I1202 18:29:49.983981 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grmvl\" (UniqueName: \"kubernetes.io/projected/089919f9-be76-4956-a7d3-92fa66aa19ef-kube-api-access-grmvl\") pod \"collector-jjcjt\" (UID: \"089919f9-be76-4956-a7d3-92fa66aa19ef\") " pod="openshift-logging/collector-jjcjt" Dec 02 18:29:50 crc kubenswrapper[4878]: I1202 18:29:50.004264 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-jjcjt" Dec 02 18:29:50 crc kubenswrapper[4878]: I1202 18:29:50.451509 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-jjcjt"] Dec 02 18:29:50 crc kubenswrapper[4878]: I1202 18:29:50.572225 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-jjcjt" event={"ID":"089919f9-be76-4956-a7d3-92fa66aa19ef","Type":"ContainerStarted","Data":"72768e0f24b0e7dbbe2c51a2e65d4d4734448c97932a033de4a8b75293e6daf4"} Dec 02 18:29:50 crc kubenswrapper[4878]: I1202 18:29:50.961584 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2ba9fc3-f6cd-43a9-b70f-bac324a726a5" path="/var/lib/kubelet/pods/c2ba9fc3-f6cd-43a9-b70f-bac324a726a5/volumes" Dec 02 18:29:58 crc kubenswrapper[4878]: I1202 18:29:58.652745 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-jjcjt" event={"ID":"089919f9-be76-4956-a7d3-92fa66aa19ef","Type":"ContainerStarted","Data":"4fcfc87ba153fc047bee4d34f333d0a912a9d9d7e8d4cd3010bcb6e16f77ce57"} Dec 02 18:29:58 crc kubenswrapper[4878]: I1202 18:29:58.680072 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-jjcjt" podStartSLOduration=2.377704385 podStartE2EDuration="9.680043683s" podCreationTimestamp="2025-12-02 18:29:49 +0000 UTC" firstStartedPulling="2025-12-02 18:29:50.460927016 +0000 UTC m=+900.150545887" lastFinishedPulling="2025-12-02 18:29:57.763266284 +0000 UTC m=+907.452885185" observedRunningTime="2025-12-02 18:29:58.677782194 +0000 UTC m=+908.367401075" watchObservedRunningTime="2025-12-02 18:29:58.680043683 +0000 UTC m=+908.369662604" Dec 02 18:30:00 crc kubenswrapper[4878]: I1202 18:30:00.154233 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg"] Dec 02 18:30:00 crc kubenswrapper[4878]: I1202 18:30:00.155606 4878 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg" Dec 02 18:30:00 crc kubenswrapper[4878]: I1202 18:30:00.158410 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 18:30:00 crc kubenswrapper[4878]: I1202 18:30:00.158421 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 18:30:00 crc kubenswrapper[4878]: I1202 18:30:00.176417 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg"] Dec 02 18:30:00 crc kubenswrapper[4878]: I1202 18:30:00.277264 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06cccc00-e8b4-4afd-9a11-17c4ad2b3a72-secret-volume\") pod \"collect-profiles-29411670-2bvmg\" (UID: \"06cccc00-e8b4-4afd-9a11-17c4ad2b3a72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg" Dec 02 18:30:00 crc kubenswrapper[4878]: I1202 18:30:00.277600 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06cccc00-e8b4-4afd-9a11-17c4ad2b3a72-config-volume\") pod \"collect-profiles-29411670-2bvmg\" (UID: \"06cccc00-e8b4-4afd-9a11-17c4ad2b3a72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg" Dec 02 18:30:00 crc kubenswrapper[4878]: I1202 18:30:00.277731 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj6tc\" (UniqueName: \"kubernetes.io/projected/06cccc00-e8b4-4afd-9a11-17c4ad2b3a72-kube-api-access-sj6tc\") pod \"collect-profiles-29411670-2bvmg\" (UID: \"06cccc00-e8b4-4afd-9a11-17c4ad2b3a72\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg" Dec 02 18:30:00 crc kubenswrapper[4878]: I1202 18:30:00.379693 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06cccc00-e8b4-4afd-9a11-17c4ad2b3a72-secret-volume\") pod \"collect-profiles-29411670-2bvmg\" (UID: \"06cccc00-e8b4-4afd-9a11-17c4ad2b3a72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg" Dec 02 18:30:00 crc kubenswrapper[4878]: I1202 18:30:00.381152 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06cccc00-e8b4-4afd-9a11-17c4ad2b3a72-config-volume\") pod \"collect-profiles-29411670-2bvmg\" (UID: \"06cccc00-e8b4-4afd-9a11-17c4ad2b3a72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg" Dec 02 18:30:00 crc kubenswrapper[4878]: I1202 18:30:00.381344 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj6tc\" (UniqueName: \"kubernetes.io/projected/06cccc00-e8b4-4afd-9a11-17c4ad2b3a72-kube-api-access-sj6tc\") pod \"collect-profiles-29411670-2bvmg\" (UID: \"06cccc00-e8b4-4afd-9a11-17c4ad2b3a72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg" Dec 02 18:30:00 crc kubenswrapper[4878]: I1202 18:30:00.382364 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06cccc00-e8b4-4afd-9a11-17c4ad2b3a72-config-volume\") pod \"collect-profiles-29411670-2bvmg\" (UID: \"06cccc00-e8b4-4afd-9a11-17c4ad2b3a72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg" Dec 02 18:30:00 crc kubenswrapper[4878]: I1202 18:30:00.399364 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/06cccc00-e8b4-4afd-9a11-17c4ad2b3a72-secret-volume\") pod \"collect-profiles-29411670-2bvmg\" (UID: \"06cccc00-e8b4-4afd-9a11-17c4ad2b3a72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg" Dec 02 18:30:00 crc kubenswrapper[4878]: I1202 18:30:00.405136 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj6tc\" (UniqueName: \"kubernetes.io/projected/06cccc00-e8b4-4afd-9a11-17c4ad2b3a72-kube-api-access-sj6tc\") pod \"collect-profiles-29411670-2bvmg\" (UID: \"06cccc00-e8b4-4afd-9a11-17c4ad2b3a72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg" Dec 02 18:30:00 crc kubenswrapper[4878]: I1202 18:30:00.492436 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg" Dec 02 18:30:00 crc kubenswrapper[4878]: I1202 18:30:00.973767 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg"] Dec 02 18:30:01 crc kubenswrapper[4878]: I1202 18:30:01.682712 4878 generic.go:334] "Generic (PLEG): container finished" podID="06cccc00-e8b4-4afd-9a11-17c4ad2b3a72" containerID="ce97aa90ac3b941a39d3d0acd25361726db6928174291f6db5987db6446e59e8" exitCode=0 Dec 02 18:30:01 crc kubenswrapper[4878]: I1202 18:30:01.683213 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg" event={"ID":"06cccc00-e8b4-4afd-9a11-17c4ad2b3a72","Type":"ContainerDied","Data":"ce97aa90ac3b941a39d3d0acd25361726db6928174291f6db5987db6446e59e8"} Dec 02 18:30:01 crc kubenswrapper[4878]: I1202 18:30:01.683283 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg" 
event={"ID":"06cccc00-e8b4-4afd-9a11-17c4ad2b3a72","Type":"ContainerStarted","Data":"1c27328b2b33d460097f1537d5983892ff53df3b981e3c23bcdaad6f1e9c415e"} Dec 02 18:30:03 crc kubenswrapper[4878]: I1202 18:30:03.133150 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg" Dec 02 18:30:03 crc kubenswrapper[4878]: I1202 18:30:03.232181 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06cccc00-e8b4-4afd-9a11-17c4ad2b3a72-secret-volume\") pod \"06cccc00-e8b4-4afd-9a11-17c4ad2b3a72\" (UID: \"06cccc00-e8b4-4afd-9a11-17c4ad2b3a72\") " Dec 02 18:30:03 crc kubenswrapper[4878]: I1202 18:30:03.232708 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj6tc\" (UniqueName: \"kubernetes.io/projected/06cccc00-e8b4-4afd-9a11-17c4ad2b3a72-kube-api-access-sj6tc\") pod \"06cccc00-e8b4-4afd-9a11-17c4ad2b3a72\" (UID: \"06cccc00-e8b4-4afd-9a11-17c4ad2b3a72\") " Dec 02 18:30:03 crc kubenswrapper[4878]: I1202 18:30:03.232911 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06cccc00-e8b4-4afd-9a11-17c4ad2b3a72-config-volume\") pod \"06cccc00-e8b4-4afd-9a11-17c4ad2b3a72\" (UID: \"06cccc00-e8b4-4afd-9a11-17c4ad2b3a72\") " Dec 02 18:30:03 crc kubenswrapper[4878]: I1202 18:30:03.234050 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06cccc00-e8b4-4afd-9a11-17c4ad2b3a72-config-volume" (OuterVolumeSpecName: "config-volume") pod "06cccc00-e8b4-4afd-9a11-17c4ad2b3a72" (UID: "06cccc00-e8b4-4afd-9a11-17c4ad2b3a72"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:30:03 crc kubenswrapper[4878]: I1202 18:30:03.234758 4878 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06cccc00-e8b4-4afd-9a11-17c4ad2b3a72-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 18:30:03 crc kubenswrapper[4878]: I1202 18:30:03.242599 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06cccc00-e8b4-4afd-9a11-17c4ad2b3a72-kube-api-access-sj6tc" (OuterVolumeSpecName: "kube-api-access-sj6tc") pod "06cccc00-e8b4-4afd-9a11-17c4ad2b3a72" (UID: "06cccc00-e8b4-4afd-9a11-17c4ad2b3a72"). InnerVolumeSpecName "kube-api-access-sj6tc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:30:03 crc kubenswrapper[4878]: I1202 18:30:03.255734 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06cccc00-e8b4-4afd-9a11-17c4ad2b3a72-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "06cccc00-e8b4-4afd-9a11-17c4ad2b3a72" (UID: "06cccc00-e8b4-4afd-9a11-17c4ad2b3a72"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:30:03 crc kubenswrapper[4878]: I1202 18:30:03.336834 4878 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06cccc00-e8b4-4afd-9a11-17c4ad2b3a72-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 18:30:03 crc kubenswrapper[4878]: I1202 18:30:03.336880 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj6tc\" (UniqueName: \"kubernetes.io/projected/06cccc00-e8b4-4afd-9a11-17c4ad2b3a72-kube-api-access-sj6tc\") on node \"crc\" DevicePath \"\"" Dec 02 18:30:03 crc kubenswrapper[4878]: I1202 18:30:03.714953 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg" event={"ID":"06cccc00-e8b4-4afd-9a11-17c4ad2b3a72","Type":"ContainerDied","Data":"1c27328b2b33d460097f1537d5983892ff53df3b981e3c23bcdaad6f1e9c415e"} Dec 02 18:30:03 crc kubenswrapper[4878]: I1202 18:30:03.715537 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c27328b2b33d460097f1537d5983892ff53df3b981e3c23bcdaad6f1e9c415e" Dec 02 18:30:03 crc kubenswrapper[4878]: I1202 18:30:03.715020 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg" Dec 02 18:30:26 crc kubenswrapper[4878]: I1202 18:30:26.360155 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs"] Dec 02 18:30:26 crc kubenswrapper[4878]: E1202 18:30:26.361057 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06cccc00-e8b4-4afd-9a11-17c4ad2b3a72" containerName="collect-profiles" Dec 02 18:30:26 crc kubenswrapper[4878]: I1202 18:30:26.361070 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="06cccc00-e8b4-4afd-9a11-17c4ad2b3a72" containerName="collect-profiles" Dec 02 18:30:26 crc kubenswrapper[4878]: I1202 18:30:26.361256 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="06cccc00-e8b4-4afd-9a11-17c4ad2b3a72" containerName="collect-profiles" Dec 02 18:30:26 crc kubenswrapper[4878]: I1202 18:30:26.362297 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs" Dec 02 18:30:26 crc kubenswrapper[4878]: I1202 18:30:26.366177 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 18:30:26 crc kubenswrapper[4878]: I1202 18:30:26.375731 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs"] Dec 02 18:30:26 crc kubenswrapper[4878]: I1202 18:30:26.396469 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg7cr\" (UniqueName: \"kubernetes.io/projected/2a985616-194f-418e-ba7c-ee6fe105df8c-kube-api-access-vg7cr\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs\" (UID: \"2a985616-194f-418e-ba7c-ee6fe105df8c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs" Dec 02 18:30:26 crc kubenswrapper[4878]: I1202 18:30:26.396536 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a985616-194f-418e-ba7c-ee6fe105df8c-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs\" (UID: \"2a985616-194f-418e-ba7c-ee6fe105df8c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs" Dec 02 18:30:26 crc kubenswrapper[4878]: I1202 18:30:26.396576 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a985616-194f-418e-ba7c-ee6fe105df8c-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs\" (UID: \"2a985616-194f-418e-ba7c-ee6fe105df8c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs" Dec 02 18:30:26 crc kubenswrapper[4878]: 
I1202 18:30:26.499713 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg7cr\" (UniqueName: \"kubernetes.io/projected/2a985616-194f-418e-ba7c-ee6fe105df8c-kube-api-access-vg7cr\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs\" (UID: \"2a985616-194f-418e-ba7c-ee6fe105df8c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs" Dec 02 18:30:26 crc kubenswrapper[4878]: I1202 18:30:26.499799 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a985616-194f-418e-ba7c-ee6fe105df8c-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs\" (UID: \"2a985616-194f-418e-ba7c-ee6fe105df8c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs" Dec 02 18:30:26 crc kubenswrapper[4878]: I1202 18:30:26.499898 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a985616-194f-418e-ba7c-ee6fe105df8c-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs\" (UID: \"2a985616-194f-418e-ba7c-ee6fe105df8c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs" Dec 02 18:30:26 crc kubenswrapper[4878]: I1202 18:30:26.500593 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a985616-194f-418e-ba7c-ee6fe105df8c-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs\" (UID: \"2a985616-194f-418e-ba7c-ee6fe105df8c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs" Dec 02 18:30:26 crc kubenswrapper[4878]: I1202 18:30:26.500690 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2a985616-194f-418e-ba7c-ee6fe105df8c-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs\" (UID: \"2a985616-194f-418e-ba7c-ee6fe105df8c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs" Dec 02 18:30:26 crc kubenswrapper[4878]: I1202 18:30:26.527829 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg7cr\" (UniqueName: \"kubernetes.io/projected/2a985616-194f-418e-ba7c-ee6fe105df8c-kube-api-access-vg7cr\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs\" (UID: \"2a985616-194f-418e-ba7c-ee6fe105df8c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs" Dec 02 18:30:26 crc kubenswrapper[4878]: I1202 18:30:26.688836 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs" Dec 02 18:30:27 crc kubenswrapper[4878]: I1202 18:30:27.156269 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs"] Dec 02 18:30:27 crc kubenswrapper[4878]: I1202 18:30:27.937894 4878 generic.go:334] "Generic (PLEG): container finished" podID="2a985616-194f-418e-ba7c-ee6fe105df8c" containerID="c53a2f8670d686518c157c782c299d2a286b931c43dab38f9841d404903571d4" exitCode=0 Dec 02 18:30:27 crc kubenswrapper[4878]: I1202 18:30:27.938011 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs" event={"ID":"2a985616-194f-418e-ba7c-ee6fe105df8c","Type":"ContainerDied","Data":"c53a2f8670d686518c157c782c299d2a286b931c43dab38f9841d404903571d4"} Dec 02 18:30:27 crc kubenswrapper[4878]: I1202 18:30:27.938347 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs" event={"ID":"2a985616-194f-418e-ba7c-ee6fe105df8c","Type":"ContainerStarted","Data":"fa4a7fa845f4322315f7ab83ceb40f0a453f1db3d051b616f806cae278a9568f"} Dec 02 18:30:29 crc kubenswrapper[4878]: I1202 18:30:29.956972 4878 generic.go:334] "Generic (PLEG): container finished" podID="2a985616-194f-418e-ba7c-ee6fe105df8c" containerID="e537ff4b1b3fedbac3514377208750e595fdc2920a660db6a7f8a64a36a3bad5" exitCode=0 Dec 02 18:30:29 crc kubenswrapper[4878]: I1202 18:30:29.957137 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs" event={"ID":"2a985616-194f-418e-ba7c-ee6fe105df8c","Type":"ContainerDied","Data":"e537ff4b1b3fedbac3514377208750e595fdc2920a660db6a7f8a64a36a3bad5"} Dec 02 18:30:30 crc kubenswrapper[4878]: I1202 18:30:30.968451 4878 generic.go:334] "Generic (PLEG): container finished" podID="2a985616-194f-418e-ba7c-ee6fe105df8c" containerID="ef7e381704f8174e6fc17b2d098f0bf9eb660f1b1431661ce8e0b1ea3c94bc1e" exitCode=0 Dec 02 18:30:30 crc kubenswrapper[4878]: I1202 18:30:30.968499 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs" event={"ID":"2a985616-194f-418e-ba7c-ee6fe105df8c","Type":"ContainerDied","Data":"ef7e381704f8174e6fc17b2d098f0bf9eb660f1b1431661ce8e0b1ea3c94bc1e"} Dec 02 18:30:32 crc kubenswrapper[4878]: I1202 18:30:32.342392 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs" Dec 02 18:30:32 crc kubenswrapper[4878]: I1202 18:30:32.430986 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a985616-194f-418e-ba7c-ee6fe105df8c-util\") pod \"2a985616-194f-418e-ba7c-ee6fe105df8c\" (UID: \"2a985616-194f-418e-ba7c-ee6fe105df8c\") " Dec 02 18:30:32 crc kubenswrapper[4878]: I1202 18:30:32.431061 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a985616-194f-418e-ba7c-ee6fe105df8c-bundle\") pod \"2a985616-194f-418e-ba7c-ee6fe105df8c\" (UID: \"2a985616-194f-418e-ba7c-ee6fe105df8c\") " Dec 02 18:30:32 crc kubenswrapper[4878]: I1202 18:30:32.431102 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg7cr\" (UniqueName: \"kubernetes.io/projected/2a985616-194f-418e-ba7c-ee6fe105df8c-kube-api-access-vg7cr\") pod \"2a985616-194f-418e-ba7c-ee6fe105df8c\" (UID: \"2a985616-194f-418e-ba7c-ee6fe105df8c\") " Dec 02 18:30:32 crc kubenswrapper[4878]: I1202 18:30:32.431817 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a985616-194f-418e-ba7c-ee6fe105df8c-bundle" (OuterVolumeSpecName: "bundle") pod "2a985616-194f-418e-ba7c-ee6fe105df8c" (UID: "2a985616-194f-418e-ba7c-ee6fe105df8c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:30:32 crc kubenswrapper[4878]: I1202 18:30:32.438704 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a985616-194f-418e-ba7c-ee6fe105df8c-kube-api-access-vg7cr" (OuterVolumeSpecName: "kube-api-access-vg7cr") pod "2a985616-194f-418e-ba7c-ee6fe105df8c" (UID: "2a985616-194f-418e-ba7c-ee6fe105df8c"). InnerVolumeSpecName "kube-api-access-vg7cr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:30:32 crc kubenswrapper[4878]: I1202 18:30:32.445226 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a985616-194f-418e-ba7c-ee6fe105df8c-util" (OuterVolumeSpecName: "util") pod "2a985616-194f-418e-ba7c-ee6fe105df8c" (UID: "2a985616-194f-418e-ba7c-ee6fe105df8c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:30:32 crc kubenswrapper[4878]: I1202 18:30:32.533413 4878 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a985616-194f-418e-ba7c-ee6fe105df8c-util\") on node \"crc\" DevicePath \"\"" Dec 02 18:30:32 crc kubenswrapper[4878]: I1202 18:30:32.533461 4878 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a985616-194f-418e-ba7c-ee6fe105df8c-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:30:32 crc kubenswrapper[4878]: I1202 18:30:32.533475 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg7cr\" (UniqueName: \"kubernetes.io/projected/2a985616-194f-418e-ba7c-ee6fe105df8c-kube-api-access-vg7cr\") on node \"crc\" DevicePath \"\"" Dec 02 18:30:32 crc kubenswrapper[4878]: I1202 18:30:32.987159 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs" event={"ID":"2a985616-194f-418e-ba7c-ee6fe105df8c","Type":"ContainerDied","Data":"fa4a7fa845f4322315f7ab83ceb40f0a453f1db3d051b616f806cae278a9568f"} Dec 02 18:30:32 crc kubenswrapper[4878]: I1202 18:30:32.987211 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa4a7fa845f4322315f7ab83ceb40f0a453f1db3d051b616f806cae278a9568f" Dec 02 18:30:32 crc kubenswrapper[4878]: I1202 18:30:32.987426 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs" Dec 02 18:30:38 crc kubenswrapper[4878]: I1202 18:30:38.047473 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-67sqq"] Dec 02 18:30:38 crc kubenswrapper[4878]: E1202 18:30:38.048854 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a985616-194f-418e-ba7c-ee6fe105df8c" containerName="extract" Dec 02 18:30:38 crc kubenswrapper[4878]: I1202 18:30:38.048875 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a985616-194f-418e-ba7c-ee6fe105df8c" containerName="extract" Dec 02 18:30:38 crc kubenswrapper[4878]: E1202 18:30:38.048892 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a985616-194f-418e-ba7c-ee6fe105df8c" containerName="pull" Dec 02 18:30:38 crc kubenswrapper[4878]: I1202 18:30:38.048900 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a985616-194f-418e-ba7c-ee6fe105df8c" containerName="pull" Dec 02 18:30:38 crc kubenswrapper[4878]: E1202 18:30:38.048910 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a985616-194f-418e-ba7c-ee6fe105df8c" containerName="util" Dec 02 18:30:38 crc kubenswrapper[4878]: I1202 18:30:38.048918 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a985616-194f-418e-ba7c-ee6fe105df8c" containerName="util" Dec 02 18:30:38 crc kubenswrapper[4878]: I1202 18:30:38.049126 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a985616-194f-418e-ba7c-ee6fe105df8c" containerName="extract" Dec 02 18:30:38 crc kubenswrapper[4878]: I1202 18:30:38.050041 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-67sqq" Dec 02 18:30:38 crc kubenswrapper[4878]: I1202 18:30:38.056084 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 02 18:30:38 crc kubenswrapper[4878]: I1202 18:30:38.056741 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-tzzgr" Dec 02 18:30:38 crc kubenswrapper[4878]: I1202 18:30:38.056763 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 02 18:30:38 crc kubenswrapper[4878]: I1202 18:30:38.072909 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-67sqq"] Dec 02 18:30:38 crc kubenswrapper[4878]: I1202 18:30:38.135829 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n7m5\" (UniqueName: \"kubernetes.io/projected/e9a04710-7320-4cde-9863-e05e65f54671-kube-api-access-8n7m5\") pod \"nmstate-operator-5b5b58f5c8-67sqq\" (UID: \"e9a04710-7320-4cde-9863-e05e65f54671\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-67sqq" Dec 02 18:30:38 crc kubenswrapper[4878]: I1202 18:30:38.238597 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n7m5\" (UniqueName: \"kubernetes.io/projected/e9a04710-7320-4cde-9863-e05e65f54671-kube-api-access-8n7m5\") pod \"nmstate-operator-5b5b58f5c8-67sqq\" (UID: \"e9a04710-7320-4cde-9863-e05e65f54671\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-67sqq" Dec 02 18:30:38 crc kubenswrapper[4878]: I1202 18:30:38.264328 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n7m5\" (UniqueName: \"kubernetes.io/projected/e9a04710-7320-4cde-9863-e05e65f54671-kube-api-access-8n7m5\") pod \"nmstate-operator-5b5b58f5c8-67sqq\" (UID: 
\"e9a04710-7320-4cde-9863-e05e65f54671\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-67sqq" Dec 02 18:30:38 crc kubenswrapper[4878]: I1202 18:30:38.378599 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-67sqq" Dec 02 18:30:38 crc kubenswrapper[4878]: I1202 18:30:38.928108 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-67sqq"] Dec 02 18:30:39 crc kubenswrapper[4878]: I1202 18:30:39.035139 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-67sqq" event={"ID":"e9a04710-7320-4cde-9863-e05e65f54671","Type":"ContainerStarted","Data":"087392ae14bc35093924328e803fceae159937b8c055c0954fbde62bf2a35a09"} Dec 02 18:30:43 crc kubenswrapper[4878]: I1202 18:30:43.069006 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-67sqq" event={"ID":"e9a04710-7320-4cde-9863-e05e65f54671","Type":"ContainerStarted","Data":"fa664e4191229518431ab465ef9298b8e473a2b322aa9684898317a5651a43ae"} Dec 02 18:30:43 crc kubenswrapper[4878]: I1202 18:30:43.099473 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-67sqq" podStartSLOduration=1.97141359 podStartE2EDuration="5.099450289s" podCreationTimestamp="2025-12-02 18:30:38 +0000 UTC" firstStartedPulling="2025-12-02 18:30:38.943639818 +0000 UTC m=+948.633258699" lastFinishedPulling="2025-12-02 18:30:42.071676517 +0000 UTC m=+951.761295398" observedRunningTime="2025-12-02 18:30:43.094450855 +0000 UTC m=+952.784069746" watchObservedRunningTime="2025-12-02 18:30:43.099450289 +0000 UTC m=+952.789069170" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.527569 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-mzs4g"] Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 
18:30:47.529559 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mzs4g" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.532804 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-ltfsr" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.547670 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-mzs4g"] Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.586312 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h4wxh"] Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.587852 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h4wxh" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.591453 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.610698 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h4wxh"] Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.626351 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-v45r2"] Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.626773 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2qjn\" (UniqueName: \"kubernetes.io/projected/788e4f72-26a7-455f-b805-32b1b519726c-kube-api-access-n2qjn\") pod \"nmstate-metrics-7f946cbc9-mzs4g\" (UID: \"788e4f72-26a7-455f-b805-32b1b519726c\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mzs4g" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.626834 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-t79vw\" (UniqueName: \"kubernetes.io/projected/b86218c7-3d62-4631-8f95-e70b1f304615-kube-api-access-t79vw\") pod \"nmstate-webhook-5f6d4c5ccb-h4wxh\" (UID: \"b86218c7-3d62-4631-8f95-e70b1f304615\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h4wxh" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.626864 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b86218c7-3d62-4631-8f95-e70b1f304615-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-h4wxh\" (UID: \"b86218c7-3d62-4631-8f95-e70b1f304615\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h4wxh" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.627772 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-v45r2" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.705765 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-88jcq"] Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.706782 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-88jcq" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.708991 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-ftwsp" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.709324 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.709500 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.719647 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-88jcq"] Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.728523 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5a7dc067-d2fd-4145-b79c-33fac3675cdd-ovs-socket\") pod \"nmstate-handler-v45r2\" (UID: \"5a7dc067-d2fd-4145-b79c-33fac3675cdd\") " pod="openshift-nmstate/nmstate-handler-v45r2" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.728582 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/91633707-7b72-4b18-a516-a6b327dc44f1-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-88jcq\" (UID: \"91633707-7b72-4b18-a516-a6b327dc44f1\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-88jcq" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.728613 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2qjn\" (UniqueName: \"kubernetes.io/projected/788e4f72-26a7-455f-b805-32b1b519726c-kube-api-access-n2qjn\") pod \"nmstate-metrics-7f946cbc9-mzs4g\" (UID: \"788e4f72-26a7-455f-b805-32b1b519726c\") " 
pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mzs4g" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.728634 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t79vw\" (UniqueName: \"kubernetes.io/projected/b86218c7-3d62-4631-8f95-e70b1f304615-kube-api-access-t79vw\") pod \"nmstate-webhook-5f6d4c5ccb-h4wxh\" (UID: \"b86218c7-3d62-4631-8f95-e70b1f304615\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h4wxh" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.728659 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b86218c7-3d62-4631-8f95-e70b1f304615-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-h4wxh\" (UID: \"b86218c7-3d62-4631-8f95-e70b1f304615\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h4wxh" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.728682 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/91633707-7b72-4b18-a516-a6b327dc44f1-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-88jcq\" (UID: \"91633707-7b72-4b18-a516-a6b327dc44f1\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-88jcq" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.728700 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5a7dc067-d2fd-4145-b79c-33fac3675cdd-nmstate-lock\") pod \"nmstate-handler-v45r2\" (UID: \"5a7dc067-d2fd-4145-b79c-33fac3675cdd\") " pod="openshift-nmstate/nmstate-handler-v45r2" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.728725 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j9dd\" (UniqueName: 
\"kubernetes.io/projected/91633707-7b72-4b18-a516-a6b327dc44f1-kube-api-access-2j9dd\") pod \"nmstate-console-plugin-7fbb5f6569-88jcq\" (UID: \"91633707-7b72-4b18-a516-a6b327dc44f1\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-88jcq" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.728748 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj5j4\" (UniqueName: \"kubernetes.io/projected/5a7dc067-d2fd-4145-b79c-33fac3675cdd-kube-api-access-bj5j4\") pod \"nmstate-handler-v45r2\" (UID: \"5a7dc067-d2fd-4145-b79c-33fac3675cdd\") " pod="openshift-nmstate/nmstate-handler-v45r2" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.728779 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5a7dc067-d2fd-4145-b79c-33fac3675cdd-dbus-socket\") pod \"nmstate-handler-v45r2\" (UID: \"5a7dc067-d2fd-4145-b79c-33fac3675cdd\") " pod="openshift-nmstate/nmstate-handler-v45r2" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.750427 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b86218c7-3d62-4631-8f95-e70b1f304615-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-h4wxh\" (UID: \"b86218c7-3d62-4631-8f95-e70b1f304615\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h4wxh" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.750706 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t79vw\" (UniqueName: \"kubernetes.io/projected/b86218c7-3d62-4631-8f95-e70b1f304615-kube-api-access-t79vw\") pod \"nmstate-webhook-5f6d4c5ccb-h4wxh\" (UID: \"b86218c7-3d62-4631-8f95-e70b1f304615\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h4wxh" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.754829 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-n2qjn\" (UniqueName: \"kubernetes.io/projected/788e4f72-26a7-455f-b805-32b1b519726c-kube-api-access-n2qjn\") pod \"nmstate-metrics-7f946cbc9-mzs4g\" (UID: \"788e4f72-26a7-455f-b805-32b1b519726c\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mzs4g" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.830103 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5a7dc067-d2fd-4145-b79c-33fac3675cdd-ovs-socket\") pod \"nmstate-handler-v45r2\" (UID: \"5a7dc067-d2fd-4145-b79c-33fac3675cdd\") " pod="openshift-nmstate/nmstate-handler-v45r2" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.830193 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/91633707-7b72-4b18-a516-a6b327dc44f1-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-88jcq\" (UID: \"91633707-7b72-4b18-a516-a6b327dc44f1\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-88jcq" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.830225 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/91633707-7b72-4b18-a516-a6b327dc44f1-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-88jcq\" (UID: \"91633707-7b72-4b18-a516-a6b327dc44f1\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-88jcq" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.830264 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5a7dc067-d2fd-4145-b79c-33fac3675cdd-nmstate-lock\") pod \"nmstate-handler-v45r2\" (UID: \"5a7dc067-d2fd-4145-b79c-33fac3675cdd\") " pod="openshift-nmstate/nmstate-handler-v45r2" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.830287 4878 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-2j9dd\" (UniqueName: \"kubernetes.io/projected/91633707-7b72-4b18-a516-a6b327dc44f1-kube-api-access-2j9dd\") pod \"nmstate-console-plugin-7fbb5f6569-88jcq\" (UID: \"91633707-7b72-4b18-a516-a6b327dc44f1\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-88jcq" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.830315 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj5j4\" (UniqueName: \"kubernetes.io/projected/5a7dc067-d2fd-4145-b79c-33fac3675cdd-kube-api-access-bj5j4\") pod \"nmstate-handler-v45r2\" (UID: \"5a7dc067-d2fd-4145-b79c-33fac3675cdd\") " pod="openshift-nmstate/nmstate-handler-v45r2" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.830349 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5a7dc067-d2fd-4145-b79c-33fac3675cdd-dbus-socket\") pod \"nmstate-handler-v45r2\" (UID: \"5a7dc067-d2fd-4145-b79c-33fac3675cdd\") " pod="openshift-nmstate/nmstate-handler-v45r2" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.830777 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5a7dc067-d2fd-4145-b79c-33fac3675cdd-dbus-socket\") pod \"nmstate-handler-v45r2\" (UID: \"5a7dc067-d2fd-4145-b79c-33fac3675cdd\") " pod="openshift-nmstate/nmstate-handler-v45r2" Dec 02 18:30:47 crc kubenswrapper[4878]: E1202 18:30:47.830899 4878 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 02 18:30:47 crc kubenswrapper[4878]: E1202 18:30:47.830954 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91633707-7b72-4b18-a516-a6b327dc44f1-plugin-serving-cert podName:91633707-7b72-4b18-a516-a6b327dc44f1 nodeName:}" failed. 
No retries permitted until 2025-12-02 18:30:48.330936325 +0000 UTC m=+958.020555206 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/91633707-7b72-4b18-a516-a6b327dc44f1-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-88jcq" (UID: "91633707-7b72-4b18-a516-a6b327dc44f1") : secret "plugin-serving-cert" not found Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.831206 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5a7dc067-d2fd-4145-b79c-33fac3675cdd-nmstate-lock\") pod \"nmstate-handler-v45r2\" (UID: \"5a7dc067-d2fd-4145-b79c-33fac3675cdd\") " pod="openshift-nmstate/nmstate-handler-v45r2" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.831428 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/91633707-7b72-4b18-a516-a6b327dc44f1-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-88jcq\" (UID: \"91633707-7b72-4b18-a516-a6b327dc44f1\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-88jcq" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.831489 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5a7dc067-d2fd-4145-b79c-33fac3675cdd-ovs-socket\") pod \"nmstate-handler-v45r2\" (UID: \"5a7dc067-d2fd-4145-b79c-33fac3675cdd\") " pod="openshift-nmstate/nmstate-handler-v45r2" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.854201 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j9dd\" (UniqueName: \"kubernetes.io/projected/91633707-7b72-4b18-a516-a6b327dc44f1-kube-api-access-2j9dd\") pod \"nmstate-console-plugin-7fbb5f6569-88jcq\" (UID: \"91633707-7b72-4b18-a516-a6b327dc44f1\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-88jcq" Dec 02 18:30:47 crc 
kubenswrapper[4878]: I1202 18:30:47.862816 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mzs4g" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.868056 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj5j4\" (UniqueName: \"kubernetes.io/projected/5a7dc067-d2fd-4145-b79c-33fac3675cdd-kube-api-access-bj5j4\") pod \"nmstate-handler-v45r2\" (UID: \"5a7dc067-d2fd-4145-b79c-33fac3675cdd\") " pod="openshift-nmstate/nmstate-handler-v45r2" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.912042 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h4wxh" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.935248 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-59b7b6c866-l2c75"] Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.949797 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-v45r2" Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.950214 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59b7b6c866-l2c75"] Dec 02 18:30:47 crc kubenswrapper[4878]: I1202 18:30:47.951213 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.123786 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-v45r2" event={"ID":"5a7dc067-d2fd-4145-b79c-33fac3675cdd","Type":"ContainerStarted","Data":"fa9011c2087adc0ad8e5937343e40031bd5ab57a83a1b7fd338c722501cf994f"} Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.136892 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/090140ec-1e0c-43b4-b71e-bbe2f9d45281-console-oauth-config\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.136992 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-service-ca\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.137042 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/090140ec-1e0c-43b4-b71e-bbe2f9d45281-console-serving-cert\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.137084 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-trusted-ca-bundle\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " 
pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.137151 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-oauth-serving-cert\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.137197 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-console-config\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.137216 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq9v8\" (UniqueName: \"kubernetes.io/projected/090140ec-1e0c-43b4-b71e-bbe2f9d45281-kube-api-access-dq9v8\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.212852 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-mzs4g"] Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.239425 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-oauth-serving-cert\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.239505 4878 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-console-config\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.239538 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq9v8\" (UniqueName: \"kubernetes.io/projected/090140ec-1e0c-43b4-b71e-bbe2f9d45281-kube-api-access-dq9v8\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.239566 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/090140ec-1e0c-43b4-b71e-bbe2f9d45281-console-oauth-config\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.239588 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-service-ca\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.239627 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/090140ec-1e0c-43b4-b71e-bbe2f9d45281-console-serving-cert\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.239674 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-trusted-ca-bundle\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.241495 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-oauth-serving-cert\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.242341 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-console-config\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.242941 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-trusted-ca-bundle\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.243084 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-service-ca\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.250716 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/090140ec-1e0c-43b4-b71e-bbe2f9d45281-console-oauth-config\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.251272 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/090140ec-1e0c-43b4-b71e-bbe2f9d45281-console-serving-cert\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.258778 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq9v8\" (UniqueName: \"kubernetes.io/projected/090140ec-1e0c-43b4-b71e-bbe2f9d45281-kube-api-access-dq9v8\") pod \"console-59b7b6c866-l2c75\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.277491 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.341886 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/91633707-7b72-4b18-a516-a6b327dc44f1-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-88jcq\" (UID: \"91633707-7b72-4b18-a516-a6b327dc44f1\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-88jcq" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.346157 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/91633707-7b72-4b18-a516-a6b327dc44f1-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-88jcq\" (UID: \"91633707-7b72-4b18-a516-a6b327dc44f1\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-88jcq" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.495299 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h4wxh"] Dec 02 18:30:48 crc kubenswrapper[4878]: W1202 18:30:48.496301 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb86218c7_3d62_4631_8f95_e70b1f304615.slice/crio-e63c2941628f6caaf9908cb1135c2d74d39a301355787e069c9970af45cae0a0 WatchSource:0}: Error finding container e63c2941628f6caaf9908cb1135c2d74d39a301355787e069c9970af45cae0a0: Status 404 returned error can't find the container with id e63c2941628f6caaf9908cb1135c2d74d39a301355787e069c9970af45cae0a0 Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.625090 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-88jcq" Dec 02 18:30:48 crc kubenswrapper[4878]: I1202 18:30:48.833901 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59b7b6c866-l2c75"] Dec 02 18:30:49 crc kubenswrapper[4878]: I1202 18:30:49.137839 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59b7b6c866-l2c75" event={"ID":"090140ec-1e0c-43b4-b71e-bbe2f9d45281","Type":"ContainerStarted","Data":"53e5f4dd68fda8971d31ca72683f6ea25b216f0df5f1d15216f5179f6d51965a"} Dec 02 18:30:49 crc kubenswrapper[4878]: I1202 18:30:49.138785 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-88jcq"] Dec 02 18:30:49 crc kubenswrapper[4878]: I1202 18:30:49.139029 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h4wxh" event={"ID":"b86218c7-3d62-4631-8f95-e70b1f304615","Type":"ContainerStarted","Data":"e63c2941628f6caaf9908cb1135c2d74d39a301355787e069c9970af45cae0a0"} Dec 02 18:30:49 crc kubenswrapper[4878]: I1202 18:30:49.139963 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mzs4g" event={"ID":"788e4f72-26a7-455f-b805-32b1b519726c","Type":"ContainerStarted","Data":"21296efa1c9d3d9508a696293c40c7ccc3eaf6f4a5d93824c7922685e1102084"} Dec 02 18:30:50 crc kubenswrapper[4878]: I1202 18:30:50.150622 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-88jcq" event={"ID":"91633707-7b72-4b18-a516-a6b327dc44f1","Type":"ContainerStarted","Data":"47c90ec09b38fbcf320bdf0479a1cfaf08eb5230d2a06d072333d7bb50f7ee47"} Dec 02 18:30:50 crc kubenswrapper[4878]: I1202 18:30:50.152380 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59b7b6c866-l2c75" 
event={"ID":"090140ec-1e0c-43b4-b71e-bbe2f9d45281","Type":"ContainerStarted","Data":"4d3a8810192f466bf2c48eb8e750538f1e1c5a7a09a64efdea03ed930276d53d"} Dec 02 18:30:50 crc kubenswrapper[4878]: I1202 18:30:50.186992 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59b7b6c866-l2c75" podStartSLOduration=3.186959063 podStartE2EDuration="3.186959063s" podCreationTimestamp="2025-12-02 18:30:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:30:50.174265074 +0000 UTC m=+959.863883965" watchObservedRunningTime="2025-12-02 18:30:50.186959063 +0000 UTC m=+959.876577984" Dec 02 18:30:53 crc kubenswrapper[4878]: I1202 18:30:53.226153 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-v45r2" event={"ID":"5a7dc067-d2fd-4145-b79c-33fac3675cdd","Type":"ContainerStarted","Data":"ff64c98e5943629f484de4d32ab860ea8a66882efb0671b15dec99d6156cc77f"} Dec 02 18:30:53 crc kubenswrapper[4878]: I1202 18:30:53.226909 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-v45r2" Dec 02 18:30:53 crc kubenswrapper[4878]: I1202 18:30:53.229223 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h4wxh" event={"ID":"b86218c7-3d62-4631-8f95-e70b1f304615","Type":"ContainerStarted","Data":"30d722245eebf07fc6acab5102de5c9c0c83b764dd6a149cbacc4e39e06bee06"} Dec 02 18:30:53 crc kubenswrapper[4878]: I1202 18:30:53.230034 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h4wxh" Dec 02 18:30:53 crc kubenswrapper[4878]: I1202 18:30:53.231367 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mzs4g" 
event={"ID":"788e4f72-26a7-455f-b805-32b1b519726c","Type":"ContainerStarted","Data":"4fca7c8b5257c6f3a4d5aa4af1423faee130e17f0edb65ff367d010d91cecf8a"} Dec 02 18:30:53 crc kubenswrapper[4878]: I1202 18:30:53.251114 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-v45r2" podStartSLOduration=2.249440838 podStartE2EDuration="6.251096959s" podCreationTimestamp="2025-12-02 18:30:47 +0000 UTC" firstStartedPulling="2025-12-02 18:30:48.016015874 +0000 UTC m=+957.705634755" lastFinishedPulling="2025-12-02 18:30:52.017671995 +0000 UTC m=+961.707290876" observedRunningTime="2025-12-02 18:30:53.245157946 +0000 UTC m=+962.934776847" watchObservedRunningTime="2025-12-02 18:30:53.251096959 +0000 UTC m=+962.940715840" Dec 02 18:30:53 crc kubenswrapper[4878]: I1202 18:30:53.281847 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h4wxh" podStartSLOduration=2.719446954 podStartE2EDuration="6.281817243s" podCreationTimestamp="2025-12-02 18:30:47 +0000 UTC" firstStartedPulling="2025-12-02 18:30:48.500421783 +0000 UTC m=+958.190040664" lastFinishedPulling="2025-12-02 18:30:52.062792072 +0000 UTC m=+961.752410953" observedRunningTime="2025-12-02 18:30:53.275797478 +0000 UTC m=+962.965416369" watchObservedRunningTime="2025-12-02 18:30:53.281817243 +0000 UTC m=+962.971436124" Dec 02 18:30:55 crc kubenswrapper[4878]: I1202 18:30:55.250889 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-88jcq" event={"ID":"91633707-7b72-4b18-a516-a6b327dc44f1","Type":"ContainerStarted","Data":"ef70ee9063317dc0bca9ebafc3f77d3b4ad5f110c660ad3457787dfc992f3c05"} Dec 02 18:30:55 crc kubenswrapper[4878]: I1202 18:30:55.284643 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-88jcq" podStartSLOduration=3.478386941 
podStartE2EDuration="8.284589343s" podCreationTimestamp="2025-12-02 18:30:47 +0000 UTC" firstStartedPulling="2025-12-02 18:30:49.148533174 +0000 UTC m=+958.838152055" lastFinishedPulling="2025-12-02 18:30:53.954735576 +0000 UTC m=+963.644354457" observedRunningTime="2025-12-02 18:30:55.276056411 +0000 UTC m=+964.965675292" watchObservedRunningTime="2025-12-02 18:30:55.284589343 +0000 UTC m=+964.974208234" Dec 02 18:30:57 crc kubenswrapper[4878]: I1202 18:30:57.272633 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mzs4g" event={"ID":"788e4f72-26a7-455f-b805-32b1b519726c","Type":"ContainerStarted","Data":"8fabe4eafebcfed1fe0f1b652daac61b53af180fe25bbfa29302c0ffc3893e8d"} Dec 02 18:30:57 crc kubenswrapper[4878]: I1202 18:30:57.978409 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-v45r2" Dec 02 18:30:58 crc kubenswrapper[4878]: I1202 18:30:58.000543 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mzs4g" podStartSLOduration=3.020085904 podStartE2EDuration="11.000503124s" podCreationTimestamp="2025-12-02 18:30:47 +0000 UTC" firstStartedPulling="2025-12-02 18:30:48.228394162 +0000 UTC m=+957.918013043" lastFinishedPulling="2025-12-02 18:30:56.208811352 +0000 UTC m=+965.898430263" observedRunningTime="2025-12-02 18:30:57.307087221 +0000 UTC m=+966.996706102" watchObservedRunningTime="2025-12-02 18:30:58.000503124 +0000 UTC m=+967.690122005" Dec 02 18:30:58 crc kubenswrapper[4878]: I1202 18:30:58.278208 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:58 crc kubenswrapper[4878]: I1202 18:30:58.278309 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:58 crc kubenswrapper[4878]: I1202 18:30:58.288183 4878 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:59 crc kubenswrapper[4878]: I1202 18:30:59.295492 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:30:59 crc kubenswrapper[4878]: I1202 18:30:59.428719 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7bc54f4949-8wvcm"] Dec 02 18:31:07 crc kubenswrapper[4878]: I1202 18:31:07.919790 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h4wxh" Dec 02 18:31:23 crc kubenswrapper[4878]: I1202 18:31:23.742911 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:31:23 crc kubenswrapper[4878]: I1202 18:31:23.743718 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:31:24 crc kubenswrapper[4878]: I1202 18:31:24.476014 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7bc54f4949-8wvcm" podUID="8c205c22-515e-4834-a53d-30d85e34596f" containerName="console" containerID="cri-o://5d3869b300499d0b783fe6d602afb1809425186dbc1ef3d076fcc6d577bc8340" gracePeriod=15 Dec 02 18:31:24 crc kubenswrapper[4878]: I1202 18:31:24.983484 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7bc54f4949-8wvcm_8c205c22-515e-4834-a53d-30d85e34596f/console/0.log" Dec 
02 18:31:24 crc kubenswrapper[4878]: I1202 18:31:24.983885 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.084943 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c205c22-515e-4834-a53d-30d85e34596f-console-oauth-config\") pod \"8c205c22-515e-4834-a53d-30d85e34596f\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.085008 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-trusted-ca-bundle\") pod \"8c205c22-515e-4834-a53d-30d85e34596f\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.085037 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c205c22-515e-4834-a53d-30d85e34596f-console-serving-cert\") pod \"8c205c22-515e-4834-a53d-30d85e34596f\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.085277 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-service-ca\") pod \"8c205c22-515e-4834-a53d-30d85e34596f\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.085308 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmmjm\" (UniqueName: \"kubernetes.io/projected/8c205c22-515e-4834-a53d-30d85e34596f-kube-api-access-hmmjm\") pod \"8c205c22-515e-4834-a53d-30d85e34596f\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " Dec 02 
18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.085585 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-oauth-serving-cert\") pod \"8c205c22-515e-4834-a53d-30d85e34596f\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.085622 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-console-config\") pod \"8c205c22-515e-4834-a53d-30d85e34596f\" (UID: \"8c205c22-515e-4834-a53d-30d85e34596f\") " Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.086583 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-console-config" (OuterVolumeSpecName: "console-config") pod "8c205c22-515e-4834-a53d-30d85e34596f" (UID: "8c205c22-515e-4834-a53d-30d85e34596f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.086573 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-service-ca" (OuterVolumeSpecName: "service-ca") pod "8c205c22-515e-4834-a53d-30d85e34596f" (UID: "8c205c22-515e-4834-a53d-30d85e34596f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.087070 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8c205c22-515e-4834-a53d-30d85e34596f" (UID: "8c205c22-515e-4834-a53d-30d85e34596f"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.087183 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8c205c22-515e-4834-a53d-30d85e34596f" (UID: "8c205c22-515e-4834-a53d-30d85e34596f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.092193 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c205c22-515e-4834-a53d-30d85e34596f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8c205c22-515e-4834-a53d-30d85e34596f" (UID: "8c205c22-515e-4834-a53d-30d85e34596f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.093584 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c205c22-515e-4834-a53d-30d85e34596f-kube-api-access-hmmjm" (OuterVolumeSpecName: "kube-api-access-hmmjm") pod "8c205c22-515e-4834-a53d-30d85e34596f" (UID: "8c205c22-515e-4834-a53d-30d85e34596f"). InnerVolumeSpecName "kube-api-access-hmmjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.100197 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c205c22-515e-4834-a53d-30d85e34596f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8c205c22-515e-4834-a53d-30d85e34596f" (UID: "8c205c22-515e-4834-a53d-30d85e34596f"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.187350 4878 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c205c22-515e-4834-a53d-30d85e34596f-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.187405 4878 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.187424 4878 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c205c22-515e-4834-a53d-30d85e34596f-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.187446 4878 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.187464 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmmjm\" (UniqueName: \"kubernetes.io/projected/8c205c22-515e-4834-a53d-30d85e34596f-kube-api-access-hmmjm\") on node \"crc\" DevicePath \"\"" Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.187482 4878 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.187499 4878 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c205c22-515e-4834-a53d-30d85e34596f-console-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:31:25 crc 
kubenswrapper[4878]: I1202 18:31:25.535964 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7bc54f4949-8wvcm_8c205c22-515e-4834-a53d-30d85e34596f/console/0.log" Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.536036 4878 generic.go:334] "Generic (PLEG): container finished" podID="8c205c22-515e-4834-a53d-30d85e34596f" containerID="5d3869b300499d0b783fe6d602afb1809425186dbc1ef3d076fcc6d577bc8340" exitCode=2 Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.536077 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bc54f4949-8wvcm" event={"ID":"8c205c22-515e-4834-a53d-30d85e34596f","Type":"ContainerDied","Data":"5d3869b300499d0b783fe6d602afb1809425186dbc1ef3d076fcc6d577bc8340"} Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.536125 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bc54f4949-8wvcm" event={"ID":"8c205c22-515e-4834-a53d-30d85e34596f","Type":"ContainerDied","Data":"e80bbf64fbf2f477b456d938d5240d83c2bf6cd330a3436abcaaed3406b13923"} Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.536144 4878 scope.go:117] "RemoveContainer" containerID="5d3869b300499d0b783fe6d602afb1809425186dbc1ef3d076fcc6d577bc8340" Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.536389 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7bc54f4949-8wvcm" Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.598472 4878 scope.go:117] "RemoveContainer" containerID="5d3869b300499d0b783fe6d602afb1809425186dbc1ef3d076fcc6d577bc8340" Dec 02 18:31:25 crc kubenswrapper[4878]: E1202 18:31:25.600909 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d3869b300499d0b783fe6d602afb1809425186dbc1ef3d076fcc6d577bc8340\": container with ID starting with 5d3869b300499d0b783fe6d602afb1809425186dbc1ef3d076fcc6d577bc8340 not found: ID does not exist" containerID="5d3869b300499d0b783fe6d602afb1809425186dbc1ef3d076fcc6d577bc8340" Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.600946 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d3869b300499d0b783fe6d602afb1809425186dbc1ef3d076fcc6d577bc8340"} err="failed to get container status \"5d3869b300499d0b783fe6d602afb1809425186dbc1ef3d076fcc6d577bc8340\": rpc error: code = NotFound desc = could not find container \"5d3869b300499d0b783fe6d602afb1809425186dbc1ef3d076fcc6d577bc8340\": container with ID starting with 5d3869b300499d0b783fe6d602afb1809425186dbc1ef3d076fcc6d577bc8340 not found: ID does not exist" Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.608282 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7bc54f4949-8wvcm"] Dec 02 18:31:25 crc kubenswrapper[4878]: I1202 18:31:25.624424 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7bc54f4949-8wvcm"] Dec 02 18:31:26 crc kubenswrapper[4878]: I1202 18:31:26.948719 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c205c22-515e-4834-a53d-30d85e34596f" path="/var/lib/kubelet/pods/8c205c22-515e-4834-a53d-30d85e34596f/volumes" Dec 02 18:31:27 crc kubenswrapper[4878]: I1202 18:31:27.725284 4878 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt"] Dec 02 18:31:27 crc kubenswrapper[4878]: E1202 18:31:27.725703 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c205c22-515e-4834-a53d-30d85e34596f" containerName="console" Dec 02 18:31:27 crc kubenswrapper[4878]: I1202 18:31:27.725717 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c205c22-515e-4834-a53d-30d85e34596f" containerName="console" Dec 02 18:31:27 crc kubenswrapper[4878]: I1202 18:31:27.725861 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c205c22-515e-4834-a53d-30d85e34596f" containerName="console" Dec 02 18:31:27 crc kubenswrapper[4878]: I1202 18:31:27.727153 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt" Dec 02 18:31:27 crc kubenswrapper[4878]: I1202 18:31:27.729550 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 18:31:27 crc kubenswrapper[4878]: I1202 18:31:27.737287 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsgvl\" (UniqueName: \"kubernetes.io/projected/caa9ae90-89de-4467-a6e1-1043d45ed9e8-kube-api-access-tsgvl\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt\" (UID: \"caa9ae90-89de-4467-a6e1-1043d45ed9e8\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt" Dec 02 18:31:27 crc kubenswrapper[4878]: I1202 18:31:27.737346 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/caa9ae90-89de-4467-a6e1-1043d45ed9e8-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt\" (UID: \"caa9ae90-89de-4467-a6e1-1043d45ed9e8\") " 
pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt" Dec 02 18:31:27 crc kubenswrapper[4878]: I1202 18:31:27.737434 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/caa9ae90-89de-4467-a6e1-1043d45ed9e8-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt\" (UID: \"caa9ae90-89de-4467-a6e1-1043d45ed9e8\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt" Dec 02 18:31:27 crc kubenswrapper[4878]: I1202 18:31:27.738401 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt"] Dec 02 18:31:27 crc kubenswrapper[4878]: I1202 18:31:27.838120 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/caa9ae90-89de-4467-a6e1-1043d45ed9e8-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt\" (UID: \"caa9ae90-89de-4467-a6e1-1043d45ed9e8\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt" Dec 02 18:31:27 crc kubenswrapper[4878]: I1202 18:31:27.838552 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsgvl\" (UniqueName: \"kubernetes.io/projected/caa9ae90-89de-4467-a6e1-1043d45ed9e8-kube-api-access-tsgvl\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt\" (UID: \"caa9ae90-89de-4467-a6e1-1043d45ed9e8\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt" Dec 02 18:31:27 crc kubenswrapper[4878]: I1202 18:31:27.838590 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/caa9ae90-89de-4467-a6e1-1043d45ed9e8-util\") pod 
\"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt\" (UID: \"caa9ae90-89de-4467-a6e1-1043d45ed9e8\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt" Dec 02 18:31:27 crc kubenswrapper[4878]: I1202 18:31:27.838715 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/caa9ae90-89de-4467-a6e1-1043d45ed9e8-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt\" (UID: \"caa9ae90-89de-4467-a6e1-1043d45ed9e8\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt" Dec 02 18:31:27 crc kubenswrapper[4878]: I1202 18:31:27.839073 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/caa9ae90-89de-4467-a6e1-1043d45ed9e8-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt\" (UID: \"caa9ae90-89de-4467-a6e1-1043d45ed9e8\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt" Dec 02 18:31:27 crc kubenswrapper[4878]: I1202 18:31:27.861648 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsgvl\" (UniqueName: \"kubernetes.io/projected/caa9ae90-89de-4467-a6e1-1043d45ed9e8-kube-api-access-tsgvl\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt\" (UID: \"caa9ae90-89de-4467-a6e1-1043d45ed9e8\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt" Dec 02 18:31:28 crc kubenswrapper[4878]: I1202 18:31:28.046540 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt" Dec 02 18:31:28 crc kubenswrapper[4878]: I1202 18:31:28.517089 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt"] Dec 02 18:31:28 crc kubenswrapper[4878]: I1202 18:31:28.566080 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt" event={"ID":"caa9ae90-89de-4467-a6e1-1043d45ed9e8","Type":"ContainerStarted","Data":"c95bba45f71f52b5a26071f6c65db29f938358d33024f4c6c9f869bfda132663"} Dec 02 18:31:29 crc kubenswrapper[4878]: I1202 18:31:29.575962 4878 generic.go:334] "Generic (PLEG): container finished" podID="caa9ae90-89de-4467-a6e1-1043d45ed9e8" containerID="860c4830590ec03ba4fcf68bc8256125c80710a76cb9fc8c685c7ca810d5f2c6" exitCode=0 Dec 02 18:31:29 crc kubenswrapper[4878]: I1202 18:31:29.576057 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt" event={"ID":"caa9ae90-89de-4467-a6e1-1043d45ed9e8","Type":"ContainerDied","Data":"860c4830590ec03ba4fcf68bc8256125c80710a76cb9fc8c685c7ca810d5f2c6"} Dec 02 18:31:29 crc kubenswrapper[4878]: I1202 18:31:29.579042 4878 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 18:31:31 crc kubenswrapper[4878]: I1202 18:31:31.593680 4878 generic.go:334] "Generic (PLEG): container finished" podID="caa9ae90-89de-4467-a6e1-1043d45ed9e8" containerID="dbdbe63231581a1dadd47d5e8e2e1f7252cf20cfb1c53e3b1ea8e3d5d7a830c3" exitCode=0 Dec 02 18:31:31 crc kubenswrapper[4878]: I1202 18:31:31.593760 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt" 
event={"ID":"caa9ae90-89de-4467-a6e1-1043d45ed9e8","Type":"ContainerDied","Data":"dbdbe63231581a1dadd47d5e8e2e1f7252cf20cfb1c53e3b1ea8e3d5d7a830c3"} Dec 02 18:31:32 crc kubenswrapper[4878]: I1202 18:31:32.607704 4878 generic.go:334] "Generic (PLEG): container finished" podID="caa9ae90-89de-4467-a6e1-1043d45ed9e8" containerID="e24271a53ca5b4f66dfa8b96a8c0139787c7a1e2d22a42e05c99db9c06df8a3b" exitCode=0 Dec 02 18:31:32 crc kubenswrapper[4878]: I1202 18:31:32.607830 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt" event={"ID":"caa9ae90-89de-4467-a6e1-1043d45ed9e8","Type":"ContainerDied","Data":"e24271a53ca5b4f66dfa8b96a8c0139787c7a1e2d22a42e05c99db9c06df8a3b"} Dec 02 18:31:34 crc kubenswrapper[4878]: I1202 18:31:34.047778 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt" Dec 02 18:31:34 crc kubenswrapper[4878]: I1202 18:31:34.080035 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/caa9ae90-89de-4467-a6e1-1043d45ed9e8-bundle\") pod \"caa9ae90-89de-4467-a6e1-1043d45ed9e8\" (UID: \"caa9ae90-89de-4467-a6e1-1043d45ed9e8\") " Dec 02 18:31:34 crc kubenswrapper[4878]: I1202 18:31:34.080153 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsgvl\" (UniqueName: \"kubernetes.io/projected/caa9ae90-89de-4467-a6e1-1043d45ed9e8-kube-api-access-tsgvl\") pod \"caa9ae90-89de-4467-a6e1-1043d45ed9e8\" (UID: \"caa9ae90-89de-4467-a6e1-1043d45ed9e8\") " Dec 02 18:31:34 crc kubenswrapper[4878]: I1202 18:31:34.080216 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/caa9ae90-89de-4467-a6e1-1043d45ed9e8-util\") pod \"caa9ae90-89de-4467-a6e1-1043d45ed9e8\" (UID: 
\"caa9ae90-89de-4467-a6e1-1043d45ed9e8\") " Dec 02 18:31:34 crc kubenswrapper[4878]: I1202 18:31:34.081499 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caa9ae90-89de-4467-a6e1-1043d45ed9e8-bundle" (OuterVolumeSpecName: "bundle") pod "caa9ae90-89de-4467-a6e1-1043d45ed9e8" (UID: "caa9ae90-89de-4467-a6e1-1043d45ed9e8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:31:34 crc kubenswrapper[4878]: I1202 18:31:34.093547 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caa9ae90-89de-4467-a6e1-1043d45ed9e8-kube-api-access-tsgvl" (OuterVolumeSpecName: "kube-api-access-tsgvl") pod "caa9ae90-89de-4467-a6e1-1043d45ed9e8" (UID: "caa9ae90-89de-4467-a6e1-1043d45ed9e8"). InnerVolumeSpecName "kube-api-access-tsgvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:31:34 crc kubenswrapper[4878]: I1202 18:31:34.182291 4878 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/caa9ae90-89de-4467-a6e1-1043d45ed9e8-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:31:34 crc kubenswrapper[4878]: I1202 18:31:34.182328 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsgvl\" (UniqueName: \"kubernetes.io/projected/caa9ae90-89de-4467-a6e1-1043d45ed9e8-kube-api-access-tsgvl\") on node \"crc\" DevicePath \"\"" Dec 02 18:31:34 crc kubenswrapper[4878]: I1202 18:31:34.263231 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caa9ae90-89de-4467-a6e1-1043d45ed9e8-util" (OuterVolumeSpecName: "util") pod "caa9ae90-89de-4467-a6e1-1043d45ed9e8" (UID: "caa9ae90-89de-4467-a6e1-1043d45ed9e8"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:31:34 crc kubenswrapper[4878]: I1202 18:31:34.283897 4878 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/caa9ae90-89de-4467-a6e1-1043d45ed9e8-util\") on node \"crc\" DevicePath \"\"" Dec 02 18:31:34 crc kubenswrapper[4878]: I1202 18:31:34.629191 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt" event={"ID":"caa9ae90-89de-4467-a6e1-1043d45ed9e8","Type":"ContainerDied","Data":"c95bba45f71f52b5a26071f6c65db29f938358d33024f4c6c9f869bfda132663"} Dec 02 18:31:34 crc kubenswrapper[4878]: I1202 18:31:34.629276 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt" Dec 02 18:31:34 crc kubenswrapper[4878]: I1202 18:31:34.629280 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c95bba45f71f52b5a26071f6c65db29f938358d33024f4c6c9f869bfda132663" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.596688 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b867f79f6-k2zm6"] Dec 02 18:31:42 crc kubenswrapper[4878]: E1202 18:31:42.597959 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa9ae90-89de-4467-a6e1-1043d45ed9e8" containerName="util" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.597975 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa9ae90-89de-4467-a6e1-1043d45ed9e8" containerName="util" Dec 02 18:31:42 crc kubenswrapper[4878]: E1202 18:31:42.597996 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa9ae90-89de-4467-a6e1-1043d45ed9e8" containerName="extract" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.598003 4878 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="caa9ae90-89de-4467-a6e1-1043d45ed9e8" containerName="extract" Dec 02 18:31:42 crc kubenswrapper[4878]: E1202 18:31:42.598026 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa9ae90-89de-4467-a6e1-1043d45ed9e8" containerName="pull" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.598033 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa9ae90-89de-4467-a6e1-1043d45ed9e8" containerName="pull" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.598175 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="caa9ae90-89de-4467-a6e1-1043d45ed9e8" containerName="extract" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.598943 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b867f79f6-k2zm6" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.601583 4878 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.601682 4878 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.602747 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.602923 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.602946 4878 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-lhvbf" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.623398 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b867f79f6-k2zm6"] Dec 02 18:31:42 crc kubenswrapper[4878]: 
I1202 18:31:42.638142 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cda42159-8d1b-460a-b92e-a02db29c88e9-apiservice-cert\") pod \"metallb-operator-controller-manager-7b867f79f6-k2zm6\" (UID: \"cda42159-8d1b-460a-b92e-a02db29c88e9\") " pod="metallb-system/metallb-operator-controller-manager-7b867f79f6-k2zm6" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.638218 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cda42159-8d1b-460a-b92e-a02db29c88e9-webhook-cert\") pod \"metallb-operator-controller-manager-7b867f79f6-k2zm6\" (UID: \"cda42159-8d1b-460a-b92e-a02db29c88e9\") " pod="metallb-system/metallb-operator-controller-manager-7b867f79f6-k2zm6" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.638634 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gp5x\" (UniqueName: \"kubernetes.io/projected/cda42159-8d1b-460a-b92e-a02db29c88e9-kube-api-access-4gp5x\") pod \"metallb-operator-controller-manager-7b867f79f6-k2zm6\" (UID: \"cda42159-8d1b-460a-b92e-a02db29c88e9\") " pod="metallb-system/metallb-operator-controller-manager-7b867f79f6-k2zm6" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.741085 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cda42159-8d1b-460a-b92e-a02db29c88e9-apiservice-cert\") pod \"metallb-operator-controller-manager-7b867f79f6-k2zm6\" (UID: \"cda42159-8d1b-460a-b92e-a02db29c88e9\") " pod="metallb-system/metallb-operator-controller-manager-7b867f79f6-k2zm6" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.741140 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/cda42159-8d1b-460a-b92e-a02db29c88e9-webhook-cert\") pod \"metallb-operator-controller-manager-7b867f79f6-k2zm6\" (UID: \"cda42159-8d1b-460a-b92e-a02db29c88e9\") " pod="metallb-system/metallb-operator-controller-manager-7b867f79f6-k2zm6" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.741253 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gp5x\" (UniqueName: \"kubernetes.io/projected/cda42159-8d1b-460a-b92e-a02db29c88e9-kube-api-access-4gp5x\") pod \"metallb-operator-controller-manager-7b867f79f6-k2zm6\" (UID: \"cda42159-8d1b-460a-b92e-a02db29c88e9\") " pod="metallb-system/metallb-operator-controller-manager-7b867f79f6-k2zm6" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.756399 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cda42159-8d1b-460a-b92e-a02db29c88e9-webhook-cert\") pod \"metallb-operator-controller-manager-7b867f79f6-k2zm6\" (UID: \"cda42159-8d1b-460a-b92e-a02db29c88e9\") " pod="metallb-system/metallb-operator-controller-manager-7b867f79f6-k2zm6" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.763333 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cda42159-8d1b-460a-b92e-a02db29c88e9-apiservice-cert\") pod \"metallb-operator-controller-manager-7b867f79f6-k2zm6\" (UID: \"cda42159-8d1b-460a-b92e-a02db29c88e9\") " pod="metallb-system/metallb-operator-controller-manager-7b867f79f6-k2zm6" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.798683 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gp5x\" (UniqueName: \"kubernetes.io/projected/cda42159-8d1b-460a-b92e-a02db29c88e9-kube-api-access-4gp5x\") pod \"metallb-operator-controller-manager-7b867f79f6-k2zm6\" (UID: \"cda42159-8d1b-460a-b92e-a02db29c88e9\") " 
pod="metallb-system/metallb-operator-controller-manager-7b867f79f6-k2zm6" Dec 02 18:31:42 crc kubenswrapper[4878]: I1202 18:31:42.917470 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b867f79f6-k2zm6" Dec 02 18:31:43 crc kubenswrapper[4878]: I1202 18:31:43.065763 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-fd8d8f689-k4m6l"] Dec 02 18:31:43 crc kubenswrapper[4878]: I1202 18:31:43.069947 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-fd8d8f689-k4m6l" Dec 02 18:31:43 crc kubenswrapper[4878]: I1202 18:31:43.075645 4878 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-px49k" Dec 02 18:31:43 crc kubenswrapper[4878]: I1202 18:31:43.076064 4878 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 02 18:31:43 crc kubenswrapper[4878]: I1202 18:31:43.076074 4878 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 02 18:31:43 crc kubenswrapper[4878]: I1202 18:31:43.094744 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-fd8d8f689-k4m6l"] Dec 02 18:31:43 crc kubenswrapper[4878]: I1202 18:31:43.210353 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/239083bd-7b88-46e8-b5e6-b1fdb9abc032-webhook-cert\") pod \"metallb-operator-webhook-server-fd8d8f689-k4m6l\" (UID: \"239083bd-7b88-46e8-b5e6-b1fdb9abc032\") " pod="metallb-system/metallb-operator-webhook-server-fd8d8f689-k4m6l" Dec 02 18:31:43 crc kubenswrapper[4878]: I1202 18:31:43.210447 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-sgjhr\" (UniqueName: \"kubernetes.io/projected/239083bd-7b88-46e8-b5e6-b1fdb9abc032-kube-api-access-sgjhr\") pod \"metallb-operator-webhook-server-fd8d8f689-k4m6l\" (UID: \"239083bd-7b88-46e8-b5e6-b1fdb9abc032\") " pod="metallb-system/metallb-operator-webhook-server-fd8d8f689-k4m6l" Dec 02 18:31:43 crc kubenswrapper[4878]: I1202 18:31:43.210472 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/239083bd-7b88-46e8-b5e6-b1fdb9abc032-apiservice-cert\") pod \"metallb-operator-webhook-server-fd8d8f689-k4m6l\" (UID: \"239083bd-7b88-46e8-b5e6-b1fdb9abc032\") " pod="metallb-system/metallb-operator-webhook-server-fd8d8f689-k4m6l" Dec 02 18:31:43 crc kubenswrapper[4878]: I1202 18:31:43.312261 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/239083bd-7b88-46e8-b5e6-b1fdb9abc032-webhook-cert\") pod \"metallb-operator-webhook-server-fd8d8f689-k4m6l\" (UID: \"239083bd-7b88-46e8-b5e6-b1fdb9abc032\") " pod="metallb-system/metallb-operator-webhook-server-fd8d8f689-k4m6l" Dec 02 18:31:43 crc kubenswrapper[4878]: I1202 18:31:43.312377 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgjhr\" (UniqueName: \"kubernetes.io/projected/239083bd-7b88-46e8-b5e6-b1fdb9abc032-kube-api-access-sgjhr\") pod \"metallb-operator-webhook-server-fd8d8f689-k4m6l\" (UID: \"239083bd-7b88-46e8-b5e6-b1fdb9abc032\") " pod="metallb-system/metallb-operator-webhook-server-fd8d8f689-k4m6l" Dec 02 18:31:43 crc kubenswrapper[4878]: I1202 18:31:43.312411 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/239083bd-7b88-46e8-b5e6-b1fdb9abc032-apiservice-cert\") pod \"metallb-operator-webhook-server-fd8d8f689-k4m6l\" (UID: 
\"239083bd-7b88-46e8-b5e6-b1fdb9abc032\") " pod="metallb-system/metallb-operator-webhook-server-fd8d8f689-k4m6l" Dec 02 18:31:43 crc kubenswrapper[4878]: I1202 18:31:43.316925 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/239083bd-7b88-46e8-b5e6-b1fdb9abc032-apiservice-cert\") pod \"metallb-operator-webhook-server-fd8d8f689-k4m6l\" (UID: \"239083bd-7b88-46e8-b5e6-b1fdb9abc032\") " pod="metallb-system/metallb-operator-webhook-server-fd8d8f689-k4m6l" Dec 02 18:31:43 crc kubenswrapper[4878]: I1202 18:31:43.317029 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/239083bd-7b88-46e8-b5e6-b1fdb9abc032-webhook-cert\") pod \"metallb-operator-webhook-server-fd8d8f689-k4m6l\" (UID: \"239083bd-7b88-46e8-b5e6-b1fdb9abc032\") " pod="metallb-system/metallb-operator-webhook-server-fd8d8f689-k4m6l" Dec 02 18:31:43 crc kubenswrapper[4878]: I1202 18:31:43.326874 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgjhr\" (UniqueName: \"kubernetes.io/projected/239083bd-7b88-46e8-b5e6-b1fdb9abc032-kube-api-access-sgjhr\") pod \"metallb-operator-webhook-server-fd8d8f689-k4m6l\" (UID: \"239083bd-7b88-46e8-b5e6-b1fdb9abc032\") " pod="metallb-system/metallb-operator-webhook-server-fd8d8f689-k4m6l" Dec 02 18:31:43 crc kubenswrapper[4878]: I1202 18:31:43.398784 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-fd8d8f689-k4m6l" Dec 02 18:31:44 crc kubenswrapper[4878]: I1202 18:31:44.047194 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b867f79f6-k2zm6"] Dec 02 18:31:44 crc kubenswrapper[4878]: W1202 18:31:44.063541 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcda42159_8d1b_460a_b92e_a02db29c88e9.slice/crio-4c9bb7500785749d4156b220e1c50dd93ef55d488be53fbe4526f957f3b21d94 WatchSource:0}: Error finding container 4c9bb7500785749d4156b220e1c50dd93ef55d488be53fbe4526f957f3b21d94: Status 404 returned error can't find the container with id 4c9bb7500785749d4156b220e1c50dd93ef55d488be53fbe4526f957f3b21d94 Dec 02 18:31:44 crc kubenswrapper[4878]: I1202 18:31:44.182394 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-fd8d8f689-k4m6l"] Dec 02 18:31:44 crc kubenswrapper[4878]: I1202 18:31:44.721757 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b867f79f6-k2zm6" event={"ID":"cda42159-8d1b-460a-b92e-a02db29c88e9","Type":"ContainerStarted","Data":"4c9bb7500785749d4156b220e1c50dd93ef55d488be53fbe4526f957f3b21d94"} Dec 02 18:31:44 crc kubenswrapper[4878]: I1202 18:31:44.723448 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-fd8d8f689-k4m6l" event={"ID":"239083bd-7b88-46e8-b5e6-b1fdb9abc032","Type":"ContainerStarted","Data":"81de38a8e012813deeac42f35b7b95520789b9e730df8a41b44e8539134f6a9b"} Dec 02 18:31:52 crc kubenswrapper[4878]: I1202 18:31:52.864422 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b867f79f6-k2zm6" 
event={"ID":"cda42159-8d1b-460a-b92e-a02db29c88e9","Type":"ContainerStarted","Data":"868d15c2fe07dae46250c3ced3bc74111c02f373021e727edba5babf0e556989"} Dec 02 18:31:52 crc kubenswrapper[4878]: I1202 18:31:52.865535 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7b867f79f6-k2zm6" Dec 02 18:31:52 crc kubenswrapper[4878]: I1202 18:31:52.866697 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-fd8d8f689-k4m6l" event={"ID":"239083bd-7b88-46e8-b5e6-b1fdb9abc032","Type":"ContainerStarted","Data":"6b6a4a743a5b9cb6f8e92c2a40821a6c02291c886c041693ebfba6224b9ae957"} Dec 02 18:31:52 crc kubenswrapper[4878]: I1202 18:31:52.867205 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-fd8d8f689-k4m6l" Dec 02 18:31:52 crc kubenswrapper[4878]: I1202 18:31:52.909855 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7b867f79f6-k2zm6" podStartSLOduration=3.172365084 podStartE2EDuration="10.909826296s" podCreationTimestamp="2025-12-02 18:31:42 +0000 UTC" firstStartedPulling="2025-12-02 18:31:44.06610917 +0000 UTC m=+1013.755728041" lastFinishedPulling="2025-12-02 18:31:51.803570382 +0000 UTC m=+1021.493189253" observedRunningTime="2025-12-02 18:31:52.898386015 +0000 UTC m=+1022.588004946" watchObservedRunningTime="2025-12-02 18:31:52.909826296 +0000 UTC m=+1022.599445217" Dec 02 18:31:52 crc kubenswrapper[4878]: I1202 18:31:52.935040 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-fd8d8f689-k4m6l" podStartSLOduration=2.280292981 podStartE2EDuration="9.935012081s" podCreationTimestamp="2025-12-02 18:31:43 +0000 UTC" firstStartedPulling="2025-12-02 18:31:44.220620948 +0000 UTC m=+1013.910239849" lastFinishedPulling="2025-12-02 
18:31:51.875340068 +0000 UTC m=+1021.564958949" observedRunningTime="2025-12-02 18:31:52.932161643 +0000 UTC m=+1022.621780534" watchObservedRunningTime="2025-12-02 18:31:52.935012081 +0000 UTC m=+1022.624631012" Dec 02 18:31:53 crc kubenswrapper[4878]: I1202 18:31:53.742560 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:31:53 crc kubenswrapper[4878]: I1202 18:31:53.742640 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:32:03 crc kubenswrapper[4878]: I1202 18:32:03.413206 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-fd8d8f689-k4m6l" Dec 02 18:32:22 crc kubenswrapper[4878]: I1202 18:32:22.921482 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7b867f79f6-k2zm6" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.732031 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-2qjj9"] Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.736572 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.736974 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-p7f4t"] Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.737972 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p7f4t" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.742324 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.742375 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.742419 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.743184 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e692d5eeca0be5391ffb074305f4ee4fcb35693cb015b4c1a01e012767df5a57"} pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.743257 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://e692d5eeca0be5391ffb074305f4ee4fcb35693cb015b4c1a01e012767df5a57" gracePeriod=600 Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.743688 4878 reflector.go:368] Caches populated for *v1.Secret from 
object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.743851 4878 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.743894 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.749505 4878 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-xlr4c" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.756921 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb4be508-3a9c-48d3-a995-124bf91a4128-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-p7f4t\" (UID: \"bb4be508-3a9c-48d3-a995-124bf91a4128\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p7f4t" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.756962 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp4q5\" (UniqueName: \"kubernetes.io/projected/d7581b57-41bb-4843-89cb-00b1966ccd8e-kube-api-access-qp4q5\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.756989 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d7581b57-41bb-4843-89cb-00b1966ccd8e-frr-conf\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.757022 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/d7581b57-41bb-4843-89cb-00b1966ccd8e-reloader\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.757075 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d7581b57-41bb-4843-89cb-00b1966ccd8e-metrics\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.757094 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7581b57-41bb-4843-89cb-00b1966ccd8e-metrics-certs\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.757224 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx2vz\" (UniqueName: \"kubernetes.io/projected/bb4be508-3a9c-48d3-a995-124bf91a4128-kube-api-access-fx2vz\") pod \"frr-k8s-webhook-server-7fcb986d4-p7f4t\" (UID: \"bb4be508-3a9c-48d3-a995-124bf91a4128\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p7f4t" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.757398 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d7581b57-41bb-4843-89cb-00b1966ccd8e-frr-startup\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.757534 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/d7581b57-41bb-4843-89cb-00b1966ccd8e-frr-sockets\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.779431 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-p7f4t"] Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.863414 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d7581b57-41bb-4843-89cb-00b1966ccd8e-frr-startup\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.863524 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d7581b57-41bb-4843-89cb-00b1966ccd8e-frr-sockets\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.863606 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp4q5\" (UniqueName: \"kubernetes.io/projected/d7581b57-41bb-4843-89cb-00b1966ccd8e-kube-api-access-qp4q5\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.863629 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb4be508-3a9c-48d3-a995-124bf91a4128-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-p7f4t\" (UID: \"bb4be508-3a9c-48d3-a995-124bf91a4128\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p7f4t" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.863651 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d7581b57-41bb-4843-89cb-00b1966ccd8e-frr-conf\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.863668 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d7581b57-41bb-4843-89cb-00b1966ccd8e-reloader\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.863707 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d7581b57-41bb-4843-89cb-00b1966ccd8e-metrics\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.863726 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7581b57-41bb-4843-89cb-00b1966ccd8e-metrics-certs\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.863754 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx2vz\" (UniqueName: \"kubernetes.io/projected/bb4be508-3a9c-48d3-a995-124bf91a4128-kube-api-access-fx2vz\") pod \"frr-k8s-webhook-server-7fcb986d4-p7f4t\" (UID: \"bb4be508-3a9c-48d3-a995-124bf91a4128\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p7f4t" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.864372 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d7581b57-41bb-4843-89cb-00b1966ccd8e-frr-sockets\") pod \"frr-k8s-2qjj9\" (UID: 
\"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.864443 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d7581b57-41bb-4843-89cb-00b1966ccd8e-frr-startup\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.864694 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d7581b57-41bb-4843-89cb-00b1966ccd8e-reloader\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: E1202 18:32:23.864774 4878 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 02 18:32:23 crc kubenswrapper[4878]: E1202 18:32:23.864821 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb4be508-3a9c-48d3-a995-124bf91a4128-cert podName:bb4be508-3a9c-48d3-a995-124bf91a4128 nodeName:}" failed. No retries permitted until 2025-12-02 18:32:24.364807227 +0000 UTC m=+1054.054426108 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bb4be508-3a9c-48d3-a995-124bf91a4128-cert") pod "frr-k8s-webhook-server-7fcb986d4-p7f4t" (UID: "bb4be508-3a9c-48d3-a995-124bf91a4128") : secret "frr-k8s-webhook-server-cert" not found Dec 02 18:32:23 crc kubenswrapper[4878]: E1202 18:32:23.866065 4878 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 02 18:32:23 crc kubenswrapper[4878]: E1202 18:32:23.866101 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7581b57-41bb-4843-89cb-00b1966ccd8e-metrics-certs podName:d7581b57-41bb-4843-89cb-00b1966ccd8e nodeName:}" failed. No retries permitted until 2025-12-02 18:32:24.366091483 +0000 UTC m=+1054.055710364 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7581b57-41bb-4843-89cb-00b1966ccd8e-metrics-certs") pod "frr-k8s-2qjj9" (UID: "d7581b57-41bb-4843-89cb-00b1966ccd8e") : secret "frr-k8s-certs-secret" not found Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.874931 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d7581b57-41bb-4843-89cb-00b1966ccd8e-metrics\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.875533 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d7581b57-41bb-4843-89cb-00b1966ccd8e-frr-conf\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.911118 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx2vz\" (UniqueName: 
\"kubernetes.io/projected/bb4be508-3a9c-48d3-a995-124bf91a4128-kube-api-access-fx2vz\") pod \"frr-k8s-webhook-server-7fcb986d4-p7f4t\" (UID: \"bb4be508-3a9c-48d3-a995-124bf91a4128\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p7f4t" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.913287 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-4q4nt"] Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.914824 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-4q4nt" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.923758 4878 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-tzncj" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.924034 4878 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.924153 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.927939 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp4q5\" (UniqueName: \"kubernetes.io/projected/d7581b57-41bb-4843-89cb-00b1966ccd8e-kube-api-access-qp4q5\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.930459 4878 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 02 18:32:23 crc kubenswrapper[4878]: I1202 18:32:23.962318 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-jmg7l"] Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.001028 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-jmg7l"] Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 
18:32:24.001174 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-jmg7l" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.007712 4878 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.093473 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/73bda1d1-d063-44f4-8b13-20af22c61540-metrics-certs\") pod \"speaker-4q4nt\" (UID: \"73bda1d1-d063-44f4-8b13-20af22c61540\") " pod="metallb-system/speaker-4q4nt" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.093576 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/73bda1d1-d063-44f4-8b13-20af22c61540-memberlist\") pod \"speaker-4q4nt\" (UID: \"73bda1d1-d063-44f4-8b13-20af22c61540\") " pod="metallb-system/speaker-4q4nt" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.093811 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f49a3efc-d73c-4b26-b668-8abf574eb6a9-metrics-certs\") pod \"controller-f8648f98b-jmg7l\" (UID: \"f49a3efc-d73c-4b26-b668-8abf574eb6a9\") " pod="metallb-system/controller-f8648f98b-jmg7l" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.093867 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/73bda1d1-d063-44f4-8b13-20af22c61540-metallb-excludel2\") pod \"speaker-4q4nt\" (UID: \"73bda1d1-d063-44f4-8b13-20af22c61540\") " pod="metallb-system/speaker-4q4nt" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.093893 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-jmg5w\" (UniqueName: \"kubernetes.io/projected/f49a3efc-d73c-4b26-b668-8abf574eb6a9-kube-api-access-jmg5w\") pod \"controller-f8648f98b-jmg7l\" (UID: \"f49a3efc-d73c-4b26-b668-8abf574eb6a9\") " pod="metallb-system/controller-f8648f98b-jmg7l" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.094037 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdxtk\" (UniqueName: \"kubernetes.io/projected/73bda1d1-d063-44f4-8b13-20af22c61540-kube-api-access-wdxtk\") pod \"speaker-4q4nt\" (UID: \"73bda1d1-d063-44f4-8b13-20af22c61540\") " pod="metallb-system/speaker-4q4nt" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.094153 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f49a3efc-d73c-4b26-b668-8abf574eb6a9-cert\") pod \"controller-f8648f98b-jmg7l\" (UID: \"f49a3efc-d73c-4b26-b668-8abf574eb6a9\") " pod="metallb-system/controller-f8648f98b-jmg7l" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.177460 4878 generic.go:334] "Generic (PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" containerID="e692d5eeca0be5391ffb074305f4ee4fcb35693cb015b4c1a01e012767df5a57" exitCode=0 Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.177563 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"e692d5eeca0be5391ffb074305f4ee4fcb35693cb015b4c1a01e012767df5a57"} Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.177613 4878 scope.go:117] "RemoveContainer" containerID="b46b425128f8cb8574d53391fe3090841c533ef0911e243412874ecbe8a5c8b9" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.196223 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wdxtk\" (UniqueName: \"kubernetes.io/projected/73bda1d1-d063-44f4-8b13-20af22c61540-kube-api-access-wdxtk\") pod \"speaker-4q4nt\" (UID: \"73bda1d1-d063-44f4-8b13-20af22c61540\") " pod="metallb-system/speaker-4q4nt" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.196358 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f49a3efc-d73c-4b26-b668-8abf574eb6a9-cert\") pod \"controller-f8648f98b-jmg7l\" (UID: \"f49a3efc-d73c-4b26-b668-8abf574eb6a9\") " pod="metallb-system/controller-f8648f98b-jmg7l" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.196445 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/73bda1d1-d063-44f4-8b13-20af22c61540-metrics-certs\") pod \"speaker-4q4nt\" (UID: \"73bda1d1-d063-44f4-8b13-20af22c61540\") " pod="metallb-system/speaker-4q4nt" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.196478 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/73bda1d1-d063-44f4-8b13-20af22c61540-memberlist\") pod \"speaker-4q4nt\" (UID: \"73bda1d1-d063-44f4-8b13-20af22c61540\") " pod="metallb-system/speaker-4q4nt" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.196549 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f49a3efc-d73c-4b26-b668-8abf574eb6a9-metrics-certs\") pod \"controller-f8648f98b-jmg7l\" (UID: \"f49a3efc-d73c-4b26-b668-8abf574eb6a9\") " pod="metallb-system/controller-f8648f98b-jmg7l" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.196575 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/73bda1d1-d063-44f4-8b13-20af22c61540-metallb-excludel2\") pod \"speaker-4q4nt\" (UID: 
\"73bda1d1-d063-44f4-8b13-20af22c61540\") " pod="metallb-system/speaker-4q4nt" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.196600 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmg5w\" (UniqueName: \"kubernetes.io/projected/f49a3efc-d73c-4b26-b668-8abf574eb6a9-kube-api-access-jmg5w\") pod \"controller-f8648f98b-jmg7l\" (UID: \"f49a3efc-d73c-4b26-b668-8abf574eb6a9\") " pod="metallb-system/controller-f8648f98b-jmg7l" Dec 02 18:32:24 crc kubenswrapper[4878]: E1202 18:32:24.198415 4878 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 02 18:32:24 crc kubenswrapper[4878]: E1202 18:32:24.198492 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f49a3efc-d73c-4b26-b668-8abf574eb6a9-metrics-certs podName:f49a3efc-d73c-4b26-b668-8abf574eb6a9 nodeName:}" failed. No retries permitted until 2025-12-02 18:32:24.698471404 +0000 UTC m=+1054.388090285 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f49a3efc-d73c-4b26-b668-8abf574eb6a9-metrics-certs") pod "controller-f8648f98b-jmg7l" (UID: "f49a3efc-d73c-4b26-b668-8abf574eb6a9") : secret "controller-certs-secret" not found Dec 02 18:32:24 crc kubenswrapper[4878]: E1202 18:32:24.198519 4878 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 02 18:32:24 crc kubenswrapper[4878]: E1202 18:32:24.198600 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73bda1d1-d063-44f4-8b13-20af22c61540-memberlist podName:73bda1d1-d063-44f4-8b13-20af22c61540 nodeName:}" failed. No retries permitted until 2025-12-02 18:32:24.698579527 +0000 UTC m=+1054.388198408 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/73bda1d1-d063-44f4-8b13-20af22c61540-memberlist") pod "speaker-4q4nt" (UID: "73bda1d1-d063-44f4-8b13-20af22c61540") : secret "metallb-memberlist" not found Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.199745 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/73bda1d1-d063-44f4-8b13-20af22c61540-metallb-excludel2\") pod \"speaker-4q4nt\" (UID: \"73bda1d1-d063-44f4-8b13-20af22c61540\") " pod="metallb-system/speaker-4q4nt" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.202455 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/73bda1d1-d063-44f4-8b13-20af22c61540-metrics-certs\") pod \"speaker-4q4nt\" (UID: \"73bda1d1-d063-44f4-8b13-20af22c61540\") " pod="metallb-system/speaker-4q4nt" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.208100 4878 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.211785 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f49a3efc-d73c-4b26-b668-8abf574eb6a9-cert\") pod \"controller-f8648f98b-jmg7l\" (UID: \"f49a3efc-d73c-4b26-b668-8abf574eb6a9\") " pod="metallb-system/controller-f8648f98b-jmg7l" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.222812 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdxtk\" (UniqueName: \"kubernetes.io/projected/73bda1d1-d063-44f4-8b13-20af22c61540-kube-api-access-wdxtk\") pod \"speaker-4q4nt\" (UID: \"73bda1d1-d063-44f4-8b13-20af22c61540\") " pod="metallb-system/speaker-4q4nt" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.239599 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jmg5w\" (UniqueName: \"kubernetes.io/projected/f49a3efc-d73c-4b26-b668-8abf574eb6a9-kube-api-access-jmg5w\") pod \"controller-f8648f98b-jmg7l\" (UID: \"f49a3efc-d73c-4b26-b668-8abf574eb6a9\") " pod="metallb-system/controller-f8648f98b-jmg7l" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.399937 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb4be508-3a9c-48d3-a995-124bf91a4128-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-p7f4t\" (UID: \"bb4be508-3a9c-48d3-a995-124bf91a4128\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p7f4t" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.400025 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7581b57-41bb-4843-89cb-00b1966ccd8e-metrics-certs\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.404527 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7581b57-41bb-4843-89cb-00b1966ccd8e-metrics-certs\") pod \"frr-k8s-2qjj9\" (UID: \"d7581b57-41bb-4843-89cb-00b1966ccd8e\") " pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.404803 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb4be508-3a9c-48d3-a995-124bf91a4128-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-p7f4t\" (UID: \"bb4be508-3a9c-48d3-a995-124bf91a4128\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p7f4t" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.664894 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-2qjj9" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.683761 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p7f4t" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.705647 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f49a3efc-d73c-4b26-b668-8abf574eb6a9-metrics-certs\") pod \"controller-f8648f98b-jmg7l\" (UID: \"f49a3efc-d73c-4b26-b668-8abf574eb6a9\") " pod="metallb-system/controller-f8648f98b-jmg7l" Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.705826 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/73bda1d1-d063-44f4-8b13-20af22c61540-memberlist\") pod \"speaker-4q4nt\" (UID: \"73bda1d1-d063-44f4-8b13-20af22c61540\") " pod="metallb-system/speaker-4q4nt" Dec 02 18:32:24 crc kubenswrapper[4878]: E1202 18:32:24.705958 4878 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 02 18:32:24 crc kubenswrapper[4878]: E1202 18:32:24.706028 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73bda1d1-d063-44f4-8b13-20af22c61540-memberlist podName:73bda1d1-d063-44f4-8b13-20af22c61540 nodeName:}" failed. No retries permitted until 2025-12-02 18:32:25.706010373 +0000 UTC m=+1055.395629264 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/73bda1d1-d063-44f4-8b13-20af22c61540-memberlist") pod "speaker-4q4nt" (UID: "73bda1d1-d063-44f4-8b13-20af22c61540") : secret "metallb-memberlist" not found
Dec 02 18:32:24 crc kubenswrapper[4878]: I1202 18:32:24.712383 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f49a3efc-d73c-4b26-b668-8abf574eb6a9-metrics-certs\") pod \"controller-f8648f98b-jmg7l\" (UID: \"f49a3efc-d73c-4b26-b668-8abf574eb6a9\") " pod="metallb-system/controller-f8648f98b-jmg7l"
Dec 02 18:32:25 crc kubenswrapper[4878]: I1202 18:32:25.011259 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-jmg7l"
Dec 02 18:32:25 crc kubenswrapper[4878]: I1202 18:32:25.186020 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2qjj9" event={"ID":"d7581b57-41bb-4843-89cb-00b1966ccd8e","Type":"ContainerStarted","Data":"58293a8484dd5aaf8990f29c88d2db5c8f89469720ea74756be25b60a525d153"}
Dec 02 18:32:25 crc kubenswrapper[4878]: I1202 18:32:25.194783 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"26923c15a81965f0afaf8fe206a0c93db8beb9097433b36b1189c363d7056d26"}
Dec 02 18:32:25 crc kubenswrapper[4878]: I1202 18:32:25.298303 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-p7f4t"]
Dec 02 18:32:25 crc kubenswrapper[4878]: I1202 18:32:25.510807 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-jmg7l"]
Dec 02 18:32:25 crc kubenswrapper[4878]: I1202 18:32:25.728734 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/73bda1d1-d063-44f4-8b13-20af22c61540-memberlist\") pod \"speaker-4q4nt\" (UID: \"73bda1d1-d063-44f4-8b13-20af22c61540\") " pod="metallb-system/speaker-4q4nt"
Dec 02 18:32:25 crc kubenswrapper[4878]: E1202 18:32:25.729004 4878 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Dec 02 18:32:25 crc kubenswrapper[4878]: E1202 18:32:25.729132 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73bda1d1-d063-44f4-8b13-20af22c61540-memberlist podName:73bda1d1-d063-44f4-8b13-20af22c61540 nodeName:}" failed. No retries permitted until 2025-12-02 18:32:27.729100313 +0000 UTC m=+1057.418719244 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/73bda1d1-d063-44f4-8b13-20af22c61540-memberlist") pod "speaker-4q4nt" (UID: "73bda1d1-d063-44f4-8b13-20af22c61540") : secret "metallb-memberlist" not found
Dec 02 18:32:26 crc kubenswrapper[4878]: I1202 18:32:26.206124 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p7f4t" event={"ID":"bb4be508-3a9c-48d3-a995-124bf91a4128","Type":"ContainerStarted","Data":"5154f71dec65ec24457f13397c1de269f4ab37fd6c78d2cafb48dcc7b978d168"}
Dec 02 18:32:26 crc kubenswrapper[4878]: I1202 18:32:26.214361 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-jmg7l" event={"ID":"f49a3efc-d73c-4b26-b668-8abf574eb6a9","Type":"ContainerStarted","Data":"df4b5f4387f37b5084a1429a76200bf494efbafc7b395b9746345bd6d087576a"}
Dec 02 18:32:26 crc kubenswrapper[4878]: I1202 18:32:26.214416 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-jmg7l" event={"ID":"f49a3efc-d73c-4b26-b668-8abf574eb6a9","Type":"ContainerStarted","Data":"f63bf2d0f3b7a1600b3f3e92960beb8274dac5bb74254241eb25f185ae84dd1e"}
Dec 02 18:32:26 crc kubenswrapper[4878]: I1202 18:32:26.214427 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-jmg7l" event={"ID":"f49a3efc-d73c-4b26-b668-8abf574eb6a9","Type":"ContainerStarted","Data":"4297905c41c89bb311ff1ac9d8f16470fb3bfc20ca907614843358b4265e89a1"}
Dec 02 18:32:26 crc kubenswrapper[4878]: I1202 18:32:26.214543 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-jmg7l"
Dec 02 18:32:26 crc kubenswrapper[4878]: I1202 18:32:26.243367 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-jmg7l" podStartSLOduration=3.243344728 podStartE2EDuration="3.243344728s" podCreationTimestamp="2025-12-02 18:32:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:32:26.232217648 +0000 UTC m=+1055.921836529" watchObservedRunningTime="2025-12-02 18:32:26.243344728 +0000 UTC m=+1055.932963609"
Dec 02 18:32:27 crc kubenswrapper[4878]: I1202 18:32:27.822973 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/73bda1d1-d063-44f4-8b13-20af22c61540-memberlist\") pod \"speaker-4q4nt\" (UID: \"73bda1d1-d063-44f4-8b13-20af22c61540\") " pod="metallb-system/speaker-4q4nt"
Dec 02 18:32:27 crc kubenswrapper[4878]: I1202 18:32:27.833701 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/73bda1d1-d063-44f4-8b13-20af22c61540-memberlist\") pod \"speaker-4q4nt\" (UID: \"73bda1d1-d063-44f4-8b13-20af22c61540\") " pod="metallb-system/speaker-4q4nt"
Dec 02 18:32:27 crc kubenswrapper[4878]: I1202 18:32:27.978516 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-4q4nt"
Dec 02 18:32:28 crc kubenswrapper[4878]: I1202 18:32:28.240314 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4q4nt" event={"ID":"73bda1d1-d063-44f4-8b13-20af22c61540","Type":"ContainerStarted","Data":"2f6c6c7f0006286a8757458ddc53a46e1947d953f12308225b78e034c8eab29a"}
Dec 02 18:32:29 crc kubenswrapper[4878]: I1202 18:32:29.267231 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4q4nt" event={"ID":"73bda1d1-d063-44f4-8b13-20af22c61540","Type":"ContainerStarted","Data":"40dabf3c4621d292a5f8ceb9ad4f0ce6d3ae69d5034f0b7cc24ecf1aa73c0f45"}
Dec 02 18:32:29 crc kubenswrapper[4878]: I1202 18:32:29.268059 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4q4nt" event={"ID":"73bda1d1-d063-44f4-8b13-20af22c61540","Type":"ContainerStarted","Data":"01554ab7010b9e8ee758d400d4ad92871ca03b092790b66474d590cd5fddbf10"}
Dec 02 18:32:29 crc kubenswrapper[4878]: I1202 18:32:29.268499 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-4q4nt"
Dec 02 18:32:29 crc kubenswrapper[4878]: I1202 18:32:29.328007 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-4q4nt" podStartSLOduration=6.327989526 podStartE2EDuration="6.327989526s" podCreationTimestamp="2025-12-02 18:32:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:32:29.314221442 +0000 UTC m=+1059.003840333" watchObservedRunningTime="2025-12-02 18:32:29.327989526 +0000 UTC m=+1059.017608407"
Dec 02 18:32:35 crc kubenswrapper[4878]: I1202 18:32:35.015022 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-jmg7l"
Dec 02 18:32:35 crc kubenswrapper[4878]: I1202 18:32:35.394155 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p7f4t" event={"ID":"bb4be508-3a9c-48d3-a995-124bf91a4128","Type":"ContainerStarted","Data":"d33b568497b99d7eaa25970c728518f033320ca710a26db1c884e021ae66a69d"}
Dec 02 18:32:35 crc kubenswrapper[4878]: I1202 18:32:35.394706 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p7f4t"
Dec 02 18:32:35 crc kubenswrapper[4878]: I1202 18:32:35.396716 4878 generic.go:334] "Generic (PLEG): container finished" podID="d7581b57-41bb-4843-89cb-00b1966ccd8e" containerID="a2938f8665ed5d1632c91af41e137c778b0a4962524e45354a2f88f8111af8e9" exitCode=0
Dec 02 18:32:35 crc kubenswrapper[4878]: I1202 18:32:35.396823 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2qjj9" event={"ID":"d7581b57-41bb-4843-89cb-00b1966ccd8e","Type":"ContainerDied","Data":"a2938f8665ed5d1632c91af41e137c778b0a4962524e45354a2f88f8111af8e9"}
Dec 02 18:32:35 crc kubenswrapper[4878]: I1202 18:32:35.419750 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p7f4t" podStartSLOduration=3.2717845309999998 podStartE2EDuration="12.419721909s" podCreationTimestamp="2025-12-02 18:32:23 +0000 UTC" firstStartedPulling="2025-12-02 18:32:25.29884117 +0000 UTC m=+1054.988460051" lastFinishedPulling="2025-12-02 18:32:34.446778548 +0000 UTC m=+1064.136397429" observedRunningTime="2025-12-02 18:32:35.418562657 +0000 UTC m=+1065.108181598" watchObservedRunningTime="2025-12-02 18:32:35.419721909 +0000 UTC m=+1065.109340790"
Dec 02 18:32:36 crc kubenswrapper[4878]: I1202 18:32:36.413700 4878 generic.go:334] "Generic (PLEG): container finished" podID="d7581b57-41bb-4843-89cb-00b1966ccd8e" containerID="08dd2c475c19eb9b4ed195e53358b32dca0730f9b4e974ec7f6c7434097d7cd4" exitCode=0
Dec 02 18:32:36 crc kubenswrapper[4878]: I1202 18:32:36.413842 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2qjj9" event={"ID":"d7581b57-41bb-4843-89cb-00b1966ccd8e","Type":"ContainerDied","Data":"08dd2c475c19eb9b4ed195e53358b32dca0730f9b4e974ec7f6c7434097d7cd4"}
Dec 02 18:32:37 crc kubenswrapper[4878]: I1202 18:32:37.425097 4878 generic.go:334] "Generic (PLEG): container finished" podID="d7581b57-41bb-4843-89cb-00b1966ccd8e" containerID="e2976ec34352416a2dec225de01643ed16e5d68e477d4c02859a73024c33ad99" exitCode=0
Dec 02 18:32:37 crc kubenswrapper[4878]: I1202 18:32:37.425161 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2qjj9" event={"ID":"d7581b57-41bb-4843-89cb-00b1966ccd8e","Type":"ContainerDied","Data":"e2976ec34352416a2dec225de01643ed16e5d68e477d4c02859a73024c33ad99"}
Dec 02 18:32:38 crc kubenswrapper[4878]: I1202 18:32:38.438069 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2qjj9" event={"ID":"d7581b57-41bb-4843-89cb-00b1966ccd8e","Type":"ContainerStarted","Data":"318f3f14ab042e89e7388b312708546128b72a8b4aed264ec87d8a2e487d2ae3"}
Dec 02 18:32:38 crc kubenswrapper[4878]: I1202 18:32:38.438431 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2qjj9" event={"ID":"d7581b57-41bb-4843-89cb-00b1966ccd8e","Type":"ContainerStarted","Data":"35a4cf038ebb0f189e37572cfa11902cb225911ca551537a4372c7e2ae231282"}
Dec 02 18:32:38 crc kubenswrapper[4878]: I1202 18:32:38.438440 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2qjj9" event={"ID":"d7581b57-41bb-4843-89cb-00b1966ccd8e","Type":"ContainerStarted","Data":"e43d2d99c2b9f1d987a3bf7ef86415200dde0cbd01e1bb04a05ac156e2070667"}
Dec 02 18:32:38 crc kubenswrapper[4878]: I1202 18:32:38.438449 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2qjj9" event={"ID":"d7581b57-41bb-4843-89cb-00b1966ccd8e","Type":"ContainerStarted","Data":"a090465777b95761b086694847910f8c2db84811c8ffba54d7205dcb1de31297"}
Dec 02 18:32:39 crc kubenswrapper[4878]: I1202 18:32:39.456920 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2qjj9" event={"ID":"d7581b57-41bb-4843-89cb-00b1966ccd8e","Type":"ContainerStarted","Data":"f73b7490f2c47284e97c14af85cc1d093550df6ec3c91e8d999f32ec87dc3261"}
Dec 02 18:32:39 crc kubenswrapper[4878]: I1202 18:32:39.457541 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-2qjj9"
Dec 02 18:32:39 crc kubenswrapper[4878]: I1202 18:32:39.457565 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2qjj9" event={"ID":"d7581b57-41bb-4843-89cb-00b1966ccd8e","Type":"ContainerStarted","Data":"4a6e00d01500d8ed79b3e129ef40a6679a35fe4e8ebb4820e40d9d47aed64d6e"}
Dec 02 18:32:39 crc kubenswrapper[4878]: I1202 18:32:39.500999 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-2qjj9" podStartSLOduration=6.944359399 podStartE2EDuration="16.500983038s" podCreationTimestamp="2025-12-02 18:32:23 +0000 UTC" firstStartedPulling="2025-12-02 18:32:24.912291757 +0000 UTC m=+1054.601910638" lastFinishedPulling="2025-12-02 18:32:34.468915386 +0000 UTC m=+1064.158534277" observedRunningTime="2025-12-02 18:32:39.49819602 +0000 UTC m=+1069.187814901" watchObservedRunningTime="2025-12-02 18:32:39.500983038 +0000 UTC m=+1069.190601919"
Dec 02 18:32:39 crc kubenswrapper[4878]: I1202 18:32:39.665550 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-2qjj9"
Dec 02 18:32:39 crc kubenswrapper[4878]: I1202 18:32:39.707069 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-2qjj9"
Dec 02 18:32:44 crc kubenswrapper[4878]: I1202 18:32:44.689162 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-p7f4t"
Dec 02 18:32:47 crc kubenswrapper[4878]: I1202 18:32:47.983830 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-4q4nt"
Dec 02 18:32:51 crc kubenswrapper[4878]: I1202 18:32:51.013184 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-vdknx"]
Dec 02 18:32:51 crc kubenswrapper[4878]: I1202 18:32:51.017109 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vdknx"
Dec 02 18:32:51 crc kubenswrapper[4878]: I1202 18:32:51.019497 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Dec 02 18:32:51 crc kubenswrapper[4878]: I1202 18:32:51.019585 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Dec 02 18:32:51 crc kubenswrapper[4878]: I1202 18:32:51.019928 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-zzshz"
Dec 02 18:32:51 crc kubenswrapper[4878]: I1202 18:32:51.023564 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vdknx"]
Dec 02 18:32:51 crc kubenswrapper[4878]: I1202 18:32:51.138912 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-558ct\" (UniqueName: \"kubernetes.io/projected/027ef7d7-f3b0-4556-af30-bd222528dbc1-kube-api-access-558ct\") pod \"openstack-operator-index-vdknx\" (UID: \"027ef7d7-f3b0-4556-af30-bd222528dbc1\") " pod="openstack-operators/openstack-operator-index-vdknx"
Dec 02 18:32:51 crc kubenswrapper[4878]: I1202 18:32:51.240838 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-558ct\" (UniqueName: \"kubernetes.io/projected/027ef7d7-f3b0-4556-af30-bd222528dbc1-kube-api-access-558ct\") pod \"openstack-operator-index-vdknx\" (UID: \"027ef7d7-f3b0-4556-af30-bd222528dbc1\") " pod="openstack-operators/openstack-operator-index-vdknx"
Dec 02 18:32:51 crc kubenswrapper[4878]: I1202 18:32:51.263645 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-558ct\" (UniqueName: \"kubernetes.io/projected/027ef7d7-f3b0-4556-af30-bd222528dbc1-kube-api-access-558ct\") pod \"openstack-operator-index-vdknx\" (UID: \"027ef7d7-f3b0-4556-af30-bd222528dbc1\") " pod="openstack-operators/openstack-operator-index-vdknx"
Dec 02 18:32:51 crc kubenswrapper[4878]: I1202 18:32:51.345218 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vdknx"
Dec 02 18:32:51 crc kubenswrapper[4878]: I1202 18:32:51.851802 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vdknx"]
Dec 02 18:32:51 crc kubenswrapper[4878]: W1202 18:32:51.855522 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod027ef7d7_f3b0_4556_af30_bd222528dbc1.slice/crio-7f156c5b387e800aeb930db8506ffdfdb44d2024633d6ee8b36fa6b148465247 WatchSource:0}: Error finding container 7f156c5b387e800aeb930db8506ffdfdb44d2024633d6ee8b36fa6b148465247: Status 404 returned error can't find the container with id 7f156c5b387e800aeb930db8506ffdfdb44d2024633d6ee8b36fa6b148465247
Dec 02 18:32:52 crc kubenswrapper[4878]: I1202 18:32:52.586434 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vdknx" event={"ID":"027ef7d7-f3b0-4556-af30-bd222528dbc1","Type":"ContainerStarted","Data":"7f156c5b387e800aeb930db8506ffdfdb44d2024633d6ee8b36fa6b148465247"}
Dec 02 18:32:54 crc kubenswrapper[4878]: I1202 18:32:54.361309 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-vdknx"]
Dec 02 18:32:54 crc kubenswrapper[4878]: I1202 18:32:54.669891 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-2qjj9"
Dec 02 18:32:54 crc kubenswrapper[4878]: I1202 18:32:54.974596 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-dfbvm"]
Dec 02 18:32:54 crc kubenswrapper[4878]: I1202 18:32:54.976597 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dfbvm"
Dec 02 18:32:54 crc kubenswrapper[4878]: I1202 18:32:54.989600 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dfbvm"]
Dec 02 18:32:55 crc kubenswrapper[4878]: I1202 18:32:55.121001 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dczv9\" (UniqueName: \"kubernetes.io/projected/73939207-5e4c-4ef0-ba00-efa6b403e4c7-kube-api-access-dczv9\") pod \"openstack-operator-index-dfbvm\" (UID: \"73939207-5e4c-4ef0-ba00-efa6b403e4c7\") " pod="openstack-operators/openstack-operator-index-dfbvm"
Dec 02 18:32:55 crc kubenswrapper[4878]: I1202 18:32:55.222916 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dczv9\" (UniqueName: \"kubernetes.io/projected/73939207-5e4c-4ef0-ba00-efa6b403e4c7-kube-api-access-dczv9\") pod \"openstack-operator-index-dfbvm\" (UID: \"73939207-5e4c-4ef0-ba00-efa6b403e4c7\") " pod="openstack-operators/openstack-operator-index-dfbvm"
Dec 02 18:32:55 crc kubenswrapper[4878]: I1202 18:32:55.244532 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dczv9\" (UniqueName: \"kubernetes.io/projected/73939207-5e4c-4ef0-ba00-efa6b403e4c7-kube-api-access-dczv9\") pod \"openstack-operator-index-dfbvm\" (UID: \"73939207-5e4c-4ef0-ba00-efa6b403e4c7\") " pod="openstack-operators/openstack-operator-index-dfbvm"
Dec 02 18:32:55 crc kubenswrapper[4878]: I1202 18:32:55.305029 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dfbvm"
Dec 02 18:32:56 crc kubenswrapper[4878]: I1202 18:32:56.206734 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dfbvm"]
Dec 02 18:32:56 crc kubenswrapper[4878]: I1202 18:32:56.618203 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vdknx" event={"ID":"027ef7d7-f3b0-4556-af30-bd222528dbc1","Type":"ContainerStarted","Data":"ea6e0f420f2fdc0c8b7cff6c9e740a13a791d94406943f351dc55188c80c221e"}
Dec 02 18:32:56 crc kubenswrapper[4878]: I1202 18:32:56.618319 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-vdknx" podUID="027ef7d7-f3b0-4556-af30-bd222528dbc1" containerName="registry-server" containerID="cri-o://ea6e0f420f2fdc0c8b7cff6c9e740a13a791d94406943f351dc55188c80c221e" gracePeriod=2
Dec 02 18:32:56 crc kubenswrapper[4878]: I1202 18:32:56.620100 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dfbvm" event={"ID":"73939207-5e4c-4ef0-ba00-efa6b403e4c7","Type":"ContainerStarted","Data":"17d685d2837e4e43d2ad621cbedacf7ad7f4741619621a97c7d29870c7f4b66c"}
Dec 02 18:32:56 crc kubenswrapper[4878]: I1202 18:32:56.620130 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dfbvm" event={"ID":"73939207-5e4c-4ef0-ba00-efa6b403e4c7","Type":"ContainerStarted","Data":"002227b99fbe344dc3e9c972bb451489f6fcfa0e947d7c9c01a5f37f3197dbd6"}
Dec 02 18:32:56 crc kubenswrapper[4878]: I1202 18:32:56.638939 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-vdknx" podStartSLOduration=2.711517535 podStartE2EDuration="6.638917782s" podCreationTimestamp="2025-12-02 18:32:50 +0000 UTC" firstStartedPulling="2025-12-02 18:32:51.864050004 +0000 UTC m=+1081.553668915" lastFinishedPulling="2025-12-02 18:32:55.791450281 +0000 UTC m=+1085.481069162" observedRunningTime="2025-12-02 18:32:56.633995304 +0000 UTC m=+1086.323614205" watchObservedRunningTime="2025-12-02 18:32:56.638917782 +0000 UTC m=+1086.328536663"
Dec 02 18:32:56 crc kubenswrapper[4878]: I1202 18:32:56.661077 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-dfbvm" podStartSLOduration=2.591188181 podStartE2EDuration="2.66105052s" podCreationTimestamp="2025-12-02 18:32:54 +0000 UTC" firstStartedPulling="2025-12-02 18:32:56.219336038 +0000 UTC m=+1085.908954919" lastFinishedPulling="2025-12-02 18:32:56.289198387 +0000 UTC m=+1085.978817258" observedRunningTime="2025-12-02 18:32:56.652498291 +0000 UTC m=+1086.342117182" watchObservedRunningTime="2025-12-02 18:32:56.66105052 +0000 UTC m=+1086.350669401"
Dec 02 18:32:57 crc kubenswrapper[4878]: I1202 18:32:57.087776 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vdknx"
Dec 02 18:32:57 crc kubenswrapper[4878]: I1202 18:32:57.167204 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-558ct\" (UniqueName: \"kubernetes.io/projected/027ef7d7-f3b0-4556-af30-bd222528dbc1-kube-api-access-558ct\") pod \"027ef7d7-f3b0-4556-af30-bd222528dbc1\" (UID: \"027ef7d7-f3b0-4556-af30-bd222528dbc1\") "
Dec 02 18:32:57 crc kubenswrapper[4878]: I1202 18:32:57.172648 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/027ef7d7-f3b0-4556-af30-bd222528dbc1-kube-api-access-558ct" (OuterVolumeSpecName: "kube-api-access-558ct") pod "027ef7d7-f3b0-4556-af30-bd222528dbc1" (UID: "027ef7d7-f3b0-4556-af30-bd222528dbc1"). InnerVolumeSpecName "kube-api-access-558ct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:32:57 crc kubenswrapper[4878]: I1202 18:32:57.269490 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-558ct\" (UniqueName: \"kubernetes.io/projected/027ef7d7-f3b0-4556-af30-bd222528dbc1-kube-api-access-558ct\") on node \"crc\" DevicePath \"\""
Dec 02 18:32:57 crc kubenswrapper[4878]: I1202 18:32:57.631122 4878 generic.go:334] "Generic (PLEG): container finished" podID="027ef7d7-f3b0-4556-af30-bd222528dbc1" containerID="ea6e0f420f2fdc0c8b7cff6c9e740a13a791d94406943f351dc55188c80c221e" exitCode=0
Dec 02 18:32:57 crc kubenswrapper[4878]: I1202 18:32:57.631204 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vdknx"
Dec 02 18:32:57 crc kubenswrapper[4878]: I1202 18:32:57.631302 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vdknx" event={"ID":"027ef7d7-f3b0-4556-af30-bd222528dbc1","Type":"ContainerDied","Data":"ea6e0f420f2fdc0c8b7cff6c9e740a13a791d94406943f351dc55188c80c221e"}
Dec 02 18:32:57 crc kubenswrapper[4878]: I1202 18:32:57.631363 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vdknx" event={"ID":"027ef7d7-f3b0-4556-af30-bd222528dbc1","Type":"ContainerDied","Data":"7f156c5b387e800aeb930db8506ffdfdb44d2024633d6ee8b36fa6b148465247"}
Dec 02 18:32:57 crc kubenswrapper[4878]: I1202 18:32:57.631397 4878 scope.go:117] "RemoveContainer" containerID="ea6e0f420f2fdc0c8b7cff6c9e740a13a791d94406943f351dc55188c80c221e"
Dec 02 18:32:57 crc kubenswrapper[4878]: I1202 18:32:57.662162 4878 scope.go:117] "RemoveContainer" containerID="ea6e0f420f2fdc0c8b7cff6c9e740a13a791d94406943f351dc55188c80c221e"
Dec 02 18:32:57 crc kubenswrapper[4878]: E1202 18:32:57.662974 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea6e0f420f2fdc0c8b7cff6c9e740a13a791d94406943f351dc55188c80c221e\": container with ID starting with ea6e0f420f2fdc0c8b7cff6c9e740a13a791d94406943f351dc55188c80c221e not found: ID does not exist" containerID="ea6e0f420f2fdc0c8b7cff6c9e740a13a791d94406943f351dc55188c80c221e"
Dec 02 18:32:57 crc kubenswrapper[4878]: I1202 18:32:57.663051 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea6e0f420f2fdc0c8b7cff6c9e740a13a791d94406943f351dc55188c80c221e"} err="failed to get container status \"ea6e0f420f2fdc0c8b7cff6c9e740a13a791d94406943f351dc55188c80c221e\": rpc error: code = NotFound desc = could not find container \"ea6e0f420f2fdc0c8b7cff6c9e740a13a791d94406943f351dc55188c80c221e\": container with ID starting with ea6e0f420f2fdc0c8b7cff6c9e740a13a791d94406943f351dc55188c80c221e not found: ID does not exist"
Dec 02 18:32:57 crc kubenswrapper[4878]: I1202 18:32:57.682556 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-vdknx"]
Dec 02 18:32:57 crc kubenswrapper[4878]: I1202 18:32:57.690345 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-vdknx"]
Dec 02 18:32:58 crc kubenswrapper[4878]: I1202 18:32:58.957904 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="027ef7d7-f3b0-4556-af30-bd222528dbc1" path="/var/lib/kubelet/pods/027ef7d7-f3b0-4556-af30-bd222528dbc1/volumes"
Dec 02 18:33:05 crc kubenswrapper[4878]: I1202 18:33:05.305702 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-dfbvm"
Dec 02 18:33:05 crc kubenswrapper[4878]: I1202 18:33:05.306522 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-dfbvm"
Dec 02 18:33:05 crc kubenswrapper[4878]: I1202 18:33:05.348605 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-dfbvm"
Dec 02 18:33:05 crc kubenswrapper[4878]: I1202 18:33:05.769759 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-dfbvm"
Dec 02 18:33:06 crc kubenswrapper[4878]: I1202 18:33:06.815965 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf"]
Dec 02 18:33:06 crc kubenswrapper[4878]: E1202 18:33:06.816508 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="027ef7d7-f3b0-4556-af30-bd222528dbc1" containerName="registry-server"
Dec 02 18:33:06 crc kubenswrapper[4878]: I1202 18:33:06.816530 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="027ef7d7-f3b0-4556-af30-bd222528dbc1" containerName="registry-server"
Dec 02 18:33:06 crc kubenswrapper[4878]: I1202 18:33:06.816797 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="027ef7d7-f3b0-4556-af30-bd222528dbc1" containerName="registry-server"
Dec 02 18:33:06 crc kubenswrapper[4878]: I1202 18:33:06.818769 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf"
Dec 02 18:33:06 crc kubenswrapper[4878]: I1202 18:33:06.822994 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-h67d4"
Dec 02 18:33:06 crc kubenswrapper[4878]: I1202 18:33:06.836534 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf"]
Dec 02 18:33:06 crc kubenswrapper[4878]: I1202 18:33:06.893115 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36734ca4-54d4-4d24-a3ba-bbd876acbe2e-util\") pod \"a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf\" (UID: \"36734ca4-54d4-4d24-a3ba-bbd876acbe2e\") " pod="openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf"
Dec 02 18:33:06 crc kubenswrapper[4878]: I1202 18:33:06.893174 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36734ca4-54d4-4d24-a3ba-bbd876acbe2e-bundle\") pod \"a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf\" (UID: \"36734ca4-54d4-4d24-a3ba-bbd876acbe2e\") " pod="openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf"
Dec 02 18:33:06 crc kubenswrapper[4878]: I1202 18:33:06.893252 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6zzp\" (UniqueName: \"kubernetes.io/projected/36734ca4-54d4-4d24-a3ba-bbd876acbe2e-kube-api-access-k6zzp\") pod \"a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf\" (UID: \"36734ca4-54d4-4d24-a3ba-bbd876acbe2e\") " pod="openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf"
Dec 02 18:33:07 crc kubenswrapper[4878]: I1202 18:33:07.010886 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36734ca4-54d4-4d24-a3ba-bbd876acbe2e-util\") pod \"a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf\" (UID: \"36734ca4-54d4-4d24-a3ba-bbd876acbe2e\") " pod="openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf"
Dec 02 18:33:07 crc kubenswrapper[4878]: I1202 18:33:07.011097 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36734ca4-54d4-4d24-a3ba-bbd876acbe2e-bundle\") pod \"a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf\" (UID: \"36734ca4-54d4-4d24-a3ba-bbd876acbe2e\") " pod="openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf"
Dec 02 18:33:07 crc kubenswrapper[4878]: I1202 18:33:07.011267 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6zzp\" (UniqueName: \"kubernetes.io/projected/36734ca4-54d4-4d24-a3ba-bbd876acbe2e-kube-api-access-k6zzp\") pod \"a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf\" (UID: \"36734ca4-54d4-4d24-a3ba-bbd876acbe2e\") " pod="openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf"
Dec 02 18:33:07 crc kubenswrapper[4878]: I1202 18:33:07.011904 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36734ca4-54d4-4d24-a3ba-bbd876acbe2e-bundle\") pod \"a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf\" (UID: \"36734ca4-54d4-4d24-a3ba-bbd876acbe2e\") " pod="openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf"
Dec 02 18:33:07 crc kubenswrapper[4878]: I1202 18:33:07.018486 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36734ca4-54d4-4d24-a3ba-bbd876acbe2e-util\") pod \"a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf\" (UID: \"36734ca4-54d4-4d24-a3ba-bbd876acbe2e\") " pod="openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf"
Dec 02 18:33:07 crc kubenswrapper[4878]: I1202 18:33:07.035366 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6zzp\" (UniqueName: \"kubernetes.io/projected/36734ca4-54d4-4d24-a3ba-bbd876acbe2e-kube-api-access-k6zzp\") pod \"a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf\" (UID: \"36734ca4-54d4-4d24-a3ba-bbd876acbe2e\") " pod="openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf"
Dec 02 18:33:07 crc kubenswrapper[4878]: I1202 18:33:07.144821 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf"
Dec 02 18:33:07 crc kubenswrapper[4878]: I1202 18:33:07.595355 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf"]
Dec 02 18:33:07 crc kubenswrapper[4878]: I1202 18:33:07.732620 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf" event={"ID":"36734ca4-54d4-4d24-a3ba-bbd876acbe2e","Type":"ContainerStarted","Data":"0f6881f884f9904d87d0379c8f47f0e5224f7cdb2d320ba8a947ae1c634a8ad3"}
Dec 02 18:33:08 crc kubenswrapper[4878]: I1202 18:33:08.749497 4878 generic.go:334] "Generic (PLEG): container finished" podID="36734ca4-54d4-4d24-a3ba-bbd876acbe2e" containerID="3cb5ab0444f66ea3f1ed4dc66d4a50804b2a5d4d800819baf970431750e5e14a" exitCode=0
Dec 02 18:33:08 crc kubenswrapper[4878]: I1202 18:33:08.749585 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf" event={"ID":"36734ca4-54d4-4d24-a3ba-bbd876acbe2e","Type":"ContainerDied","Data":"3cb5ab0444f66ea3f1ed4dc66d4a50804b2a5d4d800819baf970431750e5e14a"}
Dec 02 18:33:09 crc kubenswrapper[4878]: I1202 18:33:09.765211 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf" event={"ID":"36734ca4-54d4-4d24-a3ba-bbd876acbe2e","Type":"ContainerStarted","Data":"d1c08edf21bce4338296536412b439062ce69cf93c2c06f8657b5ebdd2ccce30"}
Dec 02 18:33:10 crc kubenswrapper[4878]: I1202 18:33:10.777703 4878 generic.go:334] "Generic (PLEG): container finished" podID="36734ca4-54d4-4d24-a3ba-bbd876acbe2e" containerID="d1c08edf21bce4338296536412b439062ce69cf93c2c06f8657b5ebdd2ccce30" exitCode=0
Dec 02 18:33:10 crc kubenswrapper[4878]: I1202 18:33:10.777755 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf" event={"ID":"36734ca4-54d4-4d24-a3ba-bbd876acbe2e","Type":"ContainerDied","Data":"d1c08edf21bce4338296536412b439062ce69cf93c2c06f8657b5ebdd2ccce30"}
Dec 02 18:33:11 crc kubenswrapper[4878]: I1202 18:33:11.789756 4878 generic.go:334] "Generic (PLEG): container finished" podID="36734ca4-54d4-4d24-a3ba-bbd876acbe2e" containerID="ac9836df55febae9b11394d54eb1bac10a6ff669b99da65d850b9f8115733e42" exitCode=0
Dec 02 18:33:11 crc kubenswrapper[4878]: I1202 18:33:11.789829 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf" event={"ID":"36734ca4-54d4-4d24-a3ba-bbd876acbe2e","Type":"ContainerDied","Data":"ac9836df55febae9b11394d54eb1bac10a6ff669b99da65d850b9f8115733e42"}
Dec 02 18:33:13 crc kubenswrapper[4878]: I1202 18:33:13.171796 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf"
Dec 02 18:33:13 crc kubenswrapper[4878]: I1202 18:33:13.235083 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36734ca4-54d4-4d24-a3ba-bbd876acbe2e-util\") pod \"36734ca4-54d4-4d24-a3ba-bbd876acbe2e\" (UID: \"36734ca4-54d4-4d24-a3ba-bbd876acbe2e\") "
Dec 02 18:33:13 crc kubenswrapper[4878]: I1202 18:33:13.235370 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6zzp\" (UniqueName: \"kubernetes.io/projected/36734ca4-54d4-4d24-a3ba-bbd876acbe2e-kube-api-access-k6zzp\") pod \"36734ca4-54d4-4d24-a3ba-bbd876acbe2e\" (UID: \"36734ca4-54d4-4d24-a3ba-bbd876acbe2e\") "
Dec 02 18:33:13 crc kubenswrapper[4878]: I1202 18:33:13.235434 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36734ca4-54d4-4d24-a3ba-bbd876acbe2e-bundle\") pod \"36734ca4-54d4-4d24-a3ba-bbd876acbe2e\" (UID: \"36734ca4-54d4-4d24-a3ba-bbd876acbe2e\") "
Dec 02 18:33:13 crc kubenswrapper[4878]: I1202 18:33:13.236934 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36734ca4-54d4-4d24-a3ba-bbd876acbe2e-bundle" (OuterVolumeSpecName: "bundle") pod "36734ca4-54d4-4d24-a3ba-bbd876acbe2e" (UID: "36734ca4-54d4-4d24-a3ba-bbd876acbe2e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 18:33:13 crc kubenswrapper[4878]: I1202 18:33:13.242354 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36734ca4-54d4-4d24-a3ba-bbd876acbe2e-kube-api-access-k6zzp" (OuterVolumeSpecName: "kube-api-access-k6zzp") pod "36734ca4-54d4-4d24-a3ba-bbd876acbe2e" (UID: "36734ca4-54d4-4d24-a3ba-bbd876acbe2e"). InnerVolumeSpecName "kube-api-access-k6zzp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:33:13 crc kubenswrapper[4878]: I1202 18:33:13.248199 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36734ca4-54d4-4d24-a3ba-bbd876acbe2e-util" (OuterVolumeSpecName: "util") pod "36734ca4-54d4-4d24-a3ba-bbd876acbe2e" (UID: "36734ca4-54d4-4d24-a3ba-bbd876acbe2e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 18:33:13 crc kubenswrapper[4878]: I1202 18:33:13.337759 4878 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36734ca4-54d4-4d24-a3ba-bbd876acbe2e-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 18:33:13 crc kubenswrapper[4878]: I1202 18:33:13.337844 4878 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36734ca4-54d4-4d24-a3ba-bbd876acbe2e-util\") on node \"crc\" DevicePath \"\""
Dec 02 18:33:13 crc kubenswrapper[4878]: I1202 18:33:13.337867 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6zzp\" (UniqueName: \"kubernetes.io/projected/36734ca4-54d4-4d24-a3ba-bbd876acbe2e-kube-api-access-k6zzp\") on node \"crc\" DevicePath \"\""
Dec 02 18:33:13 crc kubenswrapper[4878]: I1202 18:33:13.813790 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf" event={"ID":"36734ca4-54d4-4d24-a3ba-bbd876acbe2e","Type":"ContainerDied","Data":"0f6881f884f9904d87d0379c8f47f0e5224f7cdb2d320ba8a947ae1c634a8ad3"}
Dec 02 18:33:13 crc kubenswrapper[4878]: I1202 18:33:13.814564 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f6881f884f9904d87d0379c8f47f0e5224f7cdb2d320ba8a947ae1c634a8ad3"
Dec 02 18:33:13 crc kubenswrapper[4878]: I1202 18:33:13.813858 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf" Dec 02 18:33:18 crc kubenswrapper[4878]: I1202 18:33:18.880108 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6db445db9f-m4wkz"] Dec 02 18:33:18 crc kubenswrapper[4878]: E1202 18:33:18.881251 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36734ca4-54d4-4d24-a3ba-bbd876acbe2e" containerName="pull" Dec 02 18:33:18 crc kubenswrapper[4878]: I1202 18:33:18.881273 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="36734ca4-54d4-4d24-a3ba-bbd876acbe2e" containerName="pull" Dec 02 18:33:18 crc kubenswrapper[4878]: E1202 18:33:18.881322 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36734ca4-54d4-4d24-a3ba-bbd876acbe2e" containerName="extract" Dec 02 18:33:18 crc kubenswrapper[4878]: I1202 18:33:18.881331 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="36734ca4-54d4-4d24-a3ba-bbd876acbe2e" containerName="extract" Dec 02 18:33:18 crc kubenswrapper[4878]: E1202 18:33:18.881353 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36734ca4-54d4-4d24-a3ba-bbd876acbe2e" containerName="util" Dec 02 18:33:18 crc kubenswrapper[4878]: I1202 18:33:18.881361 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="36734ca4-54d4-4d24-a3ba-bbd876acbe2e" containerName="util" Dec 02 18:33:18 crc kubenswrapper[4878]: I1202 18:33:18.881618 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="36734ca4-54d4-4d24-a3ba-bbd876acbe2e" containerName="extract" Dec 02 18:33:18 crc kubenswrapper[4878]: I1202 18:33:18.882536 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6db445db9f-m4wkz" Dec 02 18:33:18 crc kubenswrapper[4878]: I1202 18:33:18.884740 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-8p9lx" Dec 02 18:33:18 crc kubenswrapper[4878]: I1202 18:33:18.920255 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6db445db9f-m4wkz"] Dec 02 18:33:18 crc kubenswrapper[4878]: I1202 18:33:18.958082 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q65rs\" (UniqueName: \"kubernetes.io/projected/ce506db4-fc2d-45a6-b9c1-23d22cc536cc-kube-api-access-q65rs\") pod \"openstack-operator-controller-operator-6db445db9f-m4wkz\" (UID: \"ce506db4-fc2d-45a6-b9c1-23d22cc536cc\") " pod="openstack-operators/openstack-operator-controller-operator-6db445db9f-m4wkz" Dec 02 18:33:19 crc kubenswrapper[4878]: I1202 18:33:19.060245 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q65rs\" (UniqueName: \"kubernetes.io/projected/ce506db4-fc2d-45a6-b9c1-23d22cc536cc-kube-api-access-q65rs\") pod \"openstack-operator-controller-operator-6db445db9f-m4wkz\" (UID: \"ce506db4-fc2d-45a6-b9c1-23d22cc536cc\") " pod="openstack-operators/openstack-operator-controller-operator-6db445db9f-m4wkz" Dec 02 18:33:19 crc kubenswrapper[4878]: I1202 18:33:19.086875 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q65rs\" (UniqueName: \"kubernetes.io/projected/ce506db4-fc2d-45a6-b9c1-23d22cc536cc-kube-api-access-q65rs\") pod \"openstack-operator-controller-operator-6db445db9f-m4wkz\" (UID: \"ce506db4-fc2d-45a6-b9c1-23d22cc536cc\") " pod="openstack-operators/openstack-operator-controller-operator-6db445db9f-m4wkz" Dec 02 18:33:19 crc kubenswrapper[4878]: I1202 18:33:19.220882 4878 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6db445db9f-m4wkz" Dec 02 18:33:19 crc kubenswrapper[4878]: I1202 18:33:19.684893 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6db445db9f-m4wkz"] Dec 02 18:33:19 crc kubenswrapper[4878]: I1202 18:33:19.883475 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6db445db9f-m4wkz" event={"ID":"ce506db4-fc2d-45a6-b9c1-23d22cc536cc","Type":"ContainerStarted","Data":"6ebb4d1e0d393426437b223284ac48a139cb5d903610507544ccd6e731808440"} Dec 02 18:33:27 crc kubenswrapper[4878]: I1202 18:33:27.168666 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6db445db9f-m4wkz" event={"ID":"ce506db4-fc2d-45a6-b9c1-23d22cc536cc","Type":"ContainerStarted","Data":"76ee11f6dc4bd60b7d5fee74dd77571a499c031274da1a5c41d37e2ffd7723f7"} Dec 02 18:33:27 crc kubenswrapper[4878]: I1202 18:33:27.169343 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6db445db9f-m4wkz" Dec 02 18:33:27 crc kubenswrapper[4878]: I1202 18:33:27.205430 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6db445db9f-m4wkz" podStartSLOduration=2.773301126 podStartE2EDuration="9.205409199s" podCreationTimestamp="2025-12-02 18:33:18 +0000 UTC" firstStartedPulling="2025-12-02 18:33:19.685653268 +0000 UTC m=+1109.375272159" lastFinishedPulling="2025-12-02 18:33:26.117761341 +0000 UTC m=+1115.807380232" observedRunningTime="2025-12-02 18:33:27.194401028 +0000 UTC m=+1116.884019919" watchObservedRunningTime="2025-12-02 18:33:27.205409199 +0000 UTC m=+1116.895028090" Dec 02 18:33:39 crc kubenswrapper[4878]: I1202 18:33:39.228062 
4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6db445db9f-m4wkz" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.402962 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgj5c"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.404993 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgj5c" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.409786 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-k6lzd" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.414619 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-ds92l"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.417937 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-ds92l" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.420523 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-xpmbq" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.435558 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzlgm\" (UniqueName: \"kubernetes.io/projected/0e516f0b-2b62-4d60-b1bd-07404ffcdea9-kube-api-access-qzlgm\") pod \"cinder-operator-controller-manager-859b6ccc6-ds92l\" (UID: \"0e516f0b-2b62-4d60-b1bd-07404ffcdea9\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-ds92l" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.435708 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6j6g\" (UniqueName: \"kubernetes.io/projected/81eba8a0-84f6-4456-9484-dfa84dda8e10-kube-api-access-s6j6g\") pod \"barbican-operator-controller-manager-7d9dfd778-rgj5c\" (UID: \"81eba8a0-84f6-4456-9484-dfa84dda8e10\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgj5c" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.444604 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgj5c"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.460898 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-kwqck"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.462473 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-kwqck" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.465921 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-s8279" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.470563 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-ds92l"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.537507 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-kwqck"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.539119 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb4ng\" (UniqueName: \"kubernetes.io/projected/77e2f2de-8d3f-437b-8f32-7b76ea70ccda-kube-api-access-sb4ng\") pod \"designate-operator-controller-manager-78b4bc895b-kwqck\" (UID: \"77e2f2de-8d3f-437b-8f32-7b76ea70ccda\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-kwqck" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.539208 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzlgm\" (UniqueName: \"kubernetes.io/projected/0e516f0b-2b62-4d60-b1bd-07404ffcdea9-kube-api-access-qzlgm\") pod \"cinder-operator-controller-manager-859b6ccc6-ds92l\" (UID: \"0e516f0b-2b62-4d60-b1bd-07404ffcdea9\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-ds92l" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.539289 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6j6g\" (UniqueName: \"kubernetes.io/projected/81eba8a0-84f6-4456-9484-dfa84dda8e10-kube-api-access-s6j6g\") pod \"barbican-operator-controller-manager-7d9dfd778-rgj5c\" (UID: 
\"81eba8a0-84f6-4456-9484-dfa84dda8e10\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgj5c" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.546753 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-4gsxd"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.548642 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4gsxd" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.553979 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-fx64f"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.554662 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-zmvpt" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.555571 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-fx64f" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.559300 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-bdjkt" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.586747 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzlgm\" (UniqueName: \"kubernetes.io/projected/0e516f0b-2b62-4d60-b1bd-07404ffcdea9-kube-api-access-qzlgm\") pod \"cinder-operator-controller-manager-859b6ccc6-ds92l\" (UID: \"0e516f0b-2b62-4d60-b1bd-07404ffcdea9\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-ds92l" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.588347 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6j6g\" (UniqueName: \"kubernetes.io/projected/81eba8a0-84f6-4456-9484-dfa84dda8e10-kube-api-access-s6j6g\") pod \"barbican-operator-controller-manager-7d9dfd778-rgj5c\" (UID: \"81eba8a0-84f6-4456-9484-dfa84dda8e10\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgj5c" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.588400 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vc67f"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.589887 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vc67f" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.600144 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-qmrpw" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.600306 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-4gsxd"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.608569 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-fx64f"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.620307 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vc67f"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.633219 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.635011 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.639764 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-rqzqn"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.641220 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdxf4\" (UniqueName: \"kubernetes.io/projected/d7143587-e348-48b5-9164-a4d477b4a259-kube-api-access-sdxf4\") pod \"horizon-operator-controller-manager-68c6d99b8f-vc67f\" (UID: \"d7143587-e348-48b5-9164-a4d477b4a259\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vc67f" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.641276 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhfkw\" (UniqueName: \"kubernetes.io/projected/a128e6b1-604f-4d2d-9b31-1567ade115df-kube-api-access-qhfkw\") pod \"heat-operator-controller-manager-5f64f6f8bb-fx64f\" (UID: \"a128e6b1-604f-4d2d-9b31-1567ade115df\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-fx64f" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.641333 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb4ng\" (UniqueName: \"kubernetes.io/projected/77e2f2de-8d3f-437b-8f32-7b76ea70ccda-kube-api-access-sb4ng\") pod \"designate-operator-controller-manager-78b4bc895b-kwqck\" (UID: \"77e2f2de-8d3f-437b-8f32-7b76ea70ccda\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-kwqck" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.641367 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng5rx\" (UniqueName: 
\"kubernetes.io/projected/fc38a188-1850-41eb-a958-fd1fe01270c7-kube-api-access-ng5rx\") pod \"glance-operator-controller-manager-77987cd8cd-4gsxd\" (UID: \"fc38a188-1850-41eb-a958-fd1fe01270c7\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4gsxd" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.641989 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rqzqn" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.645779 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-wg9pm" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.652447 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-cmcg7" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.652661 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.668472 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.717447 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-952z9"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.718992 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-952z9" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.719825 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb4ng\" (UniqueName: \"kubernetes.io/projected/77e2f2de-8d3f-437b-8f32-7b76ea70ccda-kube-api-access-sb4ng\") pod \"designate-operator-controller-manager-78b4bc895b-kwqck\" (UID: \"77e2f2de-8d3f-437b-8f32-7b76ea70ccda\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-kwqck" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.733685 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-768jk" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.735813 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgj5c" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.749263 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-kmkzd"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.750495 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3028ad1d-cba5-4197-964f-6405fb1cc1c3-cert\") pod \"infra-operator-controller-manager-57548d458d-cmzpg\" (UID: \"3028ad1d-cba5-4197-964f-6405fb1cc1c3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.750529 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng5rx\" (UniqueName: \"kubernetes.io/projected/fc38a188-1850-41eb-a958-fd1fe01270c7-kube-api-access-ng5rx\") pod \"glance-operator-controller-manager-77987cd8cd-4gsxd\" (UID: \"fc38a188-1850-41eb-a958-fd1fe01270c7\") 
" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4gsxd" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.750565 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nssf7\" (UniqueName: \"kubernetes.io/projected/3028ad1d-cba5-4197-964f-6405fb1cc1c3-kube-api-access-nssf7\") pod \"infra-operator-controller-manager-57548d458d-cmzpg\" (UID: \"3028ad1d-cba5-4197-964f-6405fb1cc1c3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.750607 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj8pf\" (UniqueName: \"kubernetes.io/projected/492dff42-bb87-4c30-8f81-02406308904c-kube-api-access-jj8pf\") pod \"mariadb-operator-controller-manager-56bbcc9d85-952z9\" (UID: \"492dff42-bb87-4c30-8f81-02406308904c\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-952z9" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.750636 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwp24\" (UniqueName: \"kubernetes.io/projected/7dfef70a-0da9-4ad6-9fda-1cac674c9ddb-kube-api-access-gwp24\") pod \"ironic-operator-controller-manager-6c548fd776-rqzqn\" (UID: \"7dfef70a-0da9-4ad6-9fda-1cac674c9ddb\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rqzqn" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.750661 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdxf4\" (UniqueName: \"kubernetes.io/projected/d7143587-e348-48b5-9164-a4d477b4a259-kube-api-access-sdxf4\") pod \"horizon-operator-controller-manager-68c6d99b8f-vc67f\" (UID: \"d7143587-e348-48b5-9164-a4d477b4a259\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vc67f" Dec 02 18:33:58 crc 
kubenswrapper[4878]: I1202 18:33:58.750685 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhfkw\" (UniqueName: \"kubernetes.io/projected/a128e6b1-604f-4d2d-9b31-1567ade115df-kube-api-access-qhfkw\") pod \"heat-operator-controller-manager-5f64f6f8bb-fx64f\" (UID: \"a128e6b1-604f-4d2d-9b31-1567ade115df\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-fx64f" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.751552 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-ds92l" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.753785 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-kmkzd" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.762630 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-lgtnl" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.777774 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-rqzqn"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.786166 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng5rx\" (UniqueName: \"kubernetes.io/projected/fc38a188-1850-41eb-a958-fd1fe01270c7-kube-api-access-ng5rx\") pod \"glance-operator-controller-manager-77987cd8cd-4gsxd\" (UID: \"fc38a188-1850-41eb-a958-fd1fe01270c7\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4gsxd" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.809747 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-4c7fq"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.811229 4878 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4c7fq" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.824355 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhfkw\" (UniqueName: \"kubernetes.io/projected/a128e6b1-604f-4d2d-9b31-1567ade115df-kube-api-access-qhfkw\") pod \"heat-operator-controller-manager-5f64f6f8bb-fx64f\" (UID: \"a128e6b1-604f-4d2d-9b31-1567ade115df\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-fx64f" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.826760 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-2ch6x" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.831492 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdxf4\" (UniqueName: \"kubernetes.io/projected/d7143587-e348-48b5-9164-a4d477b4a259-kube-api-access-sdxf4\") pod \"horizon-operator-controller-manager-68c6d99b8f-vc67f\" (UID: \"d7143587-e348-48b5-9164-a4d477b4a259\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vc67f" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.854758 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nssf7\" (UniqueName: \"kubernetes.io/projected/3028ad1d-cba5-4197-964f-6405fb1cc1c3-kube-api-access-nssf7\") pod \"infra-operator-controller-manager-57548d458d-cmzpg\" (UID: \"3028ad1d-cba5-4197-964f-6405fb1cc1c3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.856212 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9drv\" (UniqueName: \"kubernetes.io/projected/5cee25c6-1e94-400c-afd8-c1e75f31e619-kube-api-access-r9drv\") pod 
\"manila-operator-controller-manager-7c79b5df47-4c7fq\" (UID: \"5cee25c6-1e94-400c-afd8-c1e75f31e619\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4c7fq" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.856463 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj8pf\" (UniqueName: \"kubernetes.io/projected/492dff42-bb87-4c30-8f81-02406308904c-kube-api-access-jj8pf\") pod \"mariadb-operator-controller-manager-56bbcc9d85-952z9\" (UID: \"492dff42-bb87-4c30-8f81-02406308904c\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-952z9" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.857364 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwp24\" (UniqueName: \"kubernetes.io/projected/7dfef70a-0da9-4ad6-9fda-1cac674c9ddb-kube-api-access-gwp24\") pod \"ironic-operator-controller-manager-6c548fd776-rqzqn\" (UID: \"7dfef70a-0da9-4ad6-9fda-1cac674c9ddb\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rqzqn" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.857467 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-kwqck" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.857733 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3028ad1d-cba5-4197-964f-6405fb1cc1c3-cert\") pod \"infra-operator-controller-manager-57548d458d-cmzpg\" (UID: \"3028ad1d-cba5-4197-964f-6405fb1cc1c3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.857854 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vzx9\" (UniqueName: \"kubernetes.io/projected/e51c89b2-70ae-4c1d-81b0-8aba6e211dd0-kube-api-access-2vzx9\") pod \"keystone-operator-controller-manager-7765d96ddf-kmkzd\" (UID: \"e51c89b2-70ae-4c1d-81b0-8aba6e211dd0\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-kmkzd" Dec 02 18:33:58 crc kubenswrapper[4878]: E1202 18:33:58.859324 4878 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 18:33:58 crc kubenswrapper[4878]: E1202 18:33:58.859501 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3028ad1d-cba5-4197-964f-6405fb1cc1c3-cert podName:3028ad1d-cba5-4197-964f-6405fb1cc1c3 nodeName:}" failed. No retries permitted until 2025-12-02 18:33:59.359482352 +0000 UTC m=+1149.049101233 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3028ad1d-cba5-4197-964f-6405fb1cc1c3-cert") pod "infra-operator-controller-manager-57548d458d-cmzpg" (UID: "3028ad1d-cba5-4197-964f-6405fb1cc1c3") : secret "infra-operator-webhook-server-cert" not found Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.867897 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-952z9"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.890362 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-kmkzd"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.912621 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj8pf\" (UniqueName: \"kubernetes.io/projected/492dff42-bb87-4c30-8f81-02406308904c-kube-api-access-jj8pf\") pod \"mariadb-operator-controller-manager-56bbcc9d85-952z9\" (UID: \"492dff42-bb87-4c30-8f81-02406308904c\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-952z9" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.912804 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-4c7fq"] Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.932963 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nssf7\" (UniqueName: \"kubernetes.io/projected/3028ad1d-cba5-4197-964f-6405fb1cc1c3-kube-api-access-nssf7\") pod \"infra-operator-controller-manager-57548d458d-cmzpg\" (UID: \"3028ad1d-cba5-4197-964f-6405fb1cc1c3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.943585 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4gsxd" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.975462 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vzx9\" (UniqueName: \"kubernetes.io/projected/e51c89b2-70ae-4c1d-81b0-8aba6e211dd0-kube-api-access-2vzx9\") pod \"keystone-operator-controller-manager-7765d96ddf-kmkzd\" (UID: \"e51c89b2-70ae-4c1d-81b0-8aba6e211dd0\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-kmkzd" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.975592 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9drv\" (UniqueName: \"kubernetes.io/projected/5cee25c6-1e94-400c-afd8-c1e75f31e619-kube-api-access-r9drv\") pod \"manila-operator-controller-manager-7c79b5df47-4c7fq\" (UID: \"5cee25c6-1e94-400c-afd8-c1e75f31e619\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4c7fq" Dec 02 18:33:58 crc kubenswrapper[4878]: I1202 18:33:58.978169 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-fx64f" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.014590 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwp24\" (UniqueName: \"kubernetes.io/projected/7dfef70a-0da9-4ad6-9fda-1cac674c9ddb-kube-api-access-gwp24\") pod \"ironic-operator-controller-manager-6c548fd776-rqzqn\" (UID: \"7dfef70a-0da9-4ad6-9fda-1cac674c9ddb\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rqzqn" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.026392 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vzx9\" (UniqueName: \"kubernetes.io/projected/e51c89b2-70ae-4c1d-81b0-8aba6e211dd0-kube-api-access-2vzx9\") pod \"keystone-operator-controller-manager-7765d96ddf-kmkzd\" (UID: \"e51c89b2-70ae-4c1d-81b0-8aba6e211dd0\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-kmkzd" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.042146 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9drv\" (UniqueName: \"kubernetes.io/projected/5cee25c6-1e94-400c-afd8-c1e75f31e619-kube-api-access-r9drv\") pod \"manila-operator-controller-manager-7c79b5df47-4c7fq\" (UID: \"5cee25c6-1e94-400c-afd8-c1e75f31e619\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4c7fq" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.059278 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-952z9" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.080988 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vc67f" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.087103 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f58x6"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.093809 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f58x6" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.100849 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-t5m4g" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.102228 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-kmkzd" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.142326 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f58x6"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.158143 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4c7fq" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.167295 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-9stj8"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.168994 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9stj8" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.177129 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-xzpbx"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.189124 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdnmt\" (UniqueName: \"kubernetes.io/projected/ed599489-c1f6-440f-aaf3-339f424cbcdf-kube-api-access-zdnmt\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-f58x6\" (UID: \"ed599489-c1f6-440f-aaf3-339f424cbcdf\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f58x6" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.210752 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-9stj8"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.210787 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-xzpbx"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.210865 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xzpbx" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.211475 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-55v8f" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.219670 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.219821 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-j2bqc" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.221427 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.224094 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.224404 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-hjjbk" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.232419 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-w99w9"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.233970 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-w99w9" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.236626 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-bbdqr" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.255996 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-w99w9"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.266767 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.281341 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-6qznx"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.288559 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rqzqn" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.296087 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-6qznx" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.302256 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdnmt\" (UniqueName: \"kubernetes.io/projected/ed599489-c1f6-440f-aaf3-339f424cbcdf-kube-api-access-zdnmt\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-f58x6\" (UID: \"ed599489-c1f6-440f-aaf3-339f424cbcdf\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f58x6" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.302443 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-75tc6" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.302536 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrfgm\" (UniqueName: \"kubernetes.io/projected/56f1dd36-8cc9-4026-8976-8816940217a4-kube-api-access-mrfgm\") pod \"ovn-operator-controller-manager-b6456fdb6-w99w9\" (UID: \"56f1dd36-8cc9-4026-8976-8816940217a4\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-w99w9" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.302595 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/095b515f-0144-4f7b-b3ab-9ca3440921db-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk\" (UID: \"095b515f-0144-4f7b-b3ab-9ca3440921db\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.302614 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt55q\" (UniqueName: \"kubernetes.io/projected/095b515f-0144-4f7b-b3ab-9ca3440921db-kube-api-access-zt55q\") 
pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk\" (UID: \"095b515f-0144-4f7b-b3ab-9ca3440921db\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.302720 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjkzs\" (UniqueName: \"kubernetes.io/projected/84d9833f-9760-47ea-ba43-f385b24a3e57-kube-api-access-qjkzs\") pod \"nova-operator-controller-manager-697bc559fc-xzpbx\" (UID: \"84d9833f-9760-47ea-ba43-f385b24a3e57\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xzpbx" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.302772 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdpj4\" (UniqueName: \"kubernetes.io/projected/46cfeb81-3e48-4f16-ae55-aabe49810afb-kube-api-access-qdpj4\") pod \"octavia-operator-controller-manager-998648c74-9stj8\" (UID: \"46cfeb81-3e48-4f16-ae55-aabe49810afb\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-9stj8" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.334148 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j4dz5"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.336418 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j4dz5" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.347706 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-b4tw4" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.349291 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdnmt\" (UniqueName: \"kubernetes.io/projected/ed599489-c1f6-440f-aaf3-339f424cbcdf-kube-api-access-zdnmt\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-f58x6\" (UID: \"ed599489-c1f6-440f-aaf3-339f424cbcdf\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f58x6" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.365359 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-fhrfj"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.366996 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fhrfj" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.383012 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-xfg4k" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.383368 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-546f978c55-mvlfw"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.385016 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-546f978c55-mvlfw" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.388402 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-sxphs" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.402366 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-6qznx"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.437952 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j4dz5"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.439103 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjkzs\" (UniqueName: \"kubernetes.io/projected/84d9833f-9760-47ea-ba43-f385b24a3e57-kube-api-access-qjkzs\") pod \"nova-operator-controller-manager-697bc559fc-xzpbx\" (UID: \"84d9833f-9760-47ea-ba43-f385b24a3e57\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xzpbx" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.439194 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdpj4\" (UniqueName: \"kubernetes.io/projected/46cfeb81-3e48-4f16-ae55-aabe49810afb-kube-api-access-qdpj4\") pod \"octavia-operator-controller-manager-998648c74-9stj8\" (UID: \"46cfeb81-3e48-4f16-ae55-aabe49810afb\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-9stj8" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.439348 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmsh4\" (UniqueName: \"kubernetes.io/projected/5a683e44-012a-41ec-98db-36bcd5646959-kube-api-access-mmsh4\") pod \"swift-operator-controller-manager-5f8c65bbfc-j4dz5\" (UID: 
\"5a683e44-012a-41ec-98db-36bcd5646959\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j4dz5" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.439429 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3028ad1d-cba5-4197-964f-6405fb1cc1c3-cert\") pod \"infra-operator-controller-manager-57548d458d-cmzpg\" (UID: \"3028ad1d-cba5-4197-964f-6405fb1cc1c3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.439472 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxxf4\" (UniqueName: \"kubernetes.io/projected/572eebdd-2dc7-4327-a339-4f92e3971d59-kube-api-access-rxxf4\") pod \"test-operator-controller-manager-5854674fcc-fhrfj\" (UID: \"572eebdd-2dc7-4327-a339-4f92e3971d59\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-fhrfj" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.439492 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn7jh\" (UniqueName: \"kubernetes.io/projected/98571e6d-dae6-4e83-8d08-e44e8609188f-kube-api-access-hn7jh\") pod \"placement-operator-controller-manager-78f8948974-6qznx\" (UID: \"98571e6d-dae6-4e83-8d08-e44e8609188f\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-6qznx" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.439516 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzphd\" (UniqueName: \"kubernetes.io/projected/44a363d4-d9a3-44df-8a8f-902cb14a0443-kube-api-access-hzphd\") pod \"telemetry-operator-controller-manager-546f978c55-mvlfw\" (UID: \"44a363d4-d9a3-44df-8a8f-902cb14a0443\") " pod="openstack-operators/telemetry-operator-controller-manager-546f978c55-mvlfw" Dec 02 18:33:59 
crc kubenswrapper[4878]: I1202 18:33:59.439685 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrfgm\" (UniqueName: \"kubernetes.io/projected/56f1dd36-8cc9-4026-8976-8816940217a4-kube-api-access-mrfgm\") pod \"ovn-operator-controller-manager-b6456fdb6-w99w9\" (UID: \"56f1dd36-8cc9-4026-8976-8816940217a4\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-w99w9" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.439733 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/095b515f-0144-4f7b-b3ab-9ca3440921db-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk\" (UID: \"095b515f-0144-4f7b-b3ab-9ca3440921db\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.439753 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt55q\" (UniqueName: \"kubernetes.io/projected/095b515f-0144-4f7b-b3ab-9ca3440921db-kube-api-access-zt55q\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk\" (UID: \"095b515f-0144-4f7b-b3ab-9ca3440921db\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" Dec 02 18:33:59 crc kubenswrapper[4878]: E1202 18:33:59.440491 4878 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 18:33:59 crc kubenswrapper[4878]: E1202 18:33:59.440532 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/095b515f-0144-4f7b-b3ab-9ca3440921db-cert podName:095b515f-0144-4f7b-b3ab-9ca3440921db nodeName:}" failed. No retries permitted until 2025-12-02 18:33:59.940516095 +0000 UTC m=+1149.630134976 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/095b515f-0144-4f7b-b3ab-9ca3440921db-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" (UID: "095b515f-0144-4f7b-b3ab-9ca3440921db") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 18:33:59 crc kubenswrapper[4878]: E1202 18:33:59.440556 4878 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 18:33:59 crc kubenswrapper[4878]: E1202 18:33:59.440588 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3028ad1d-cba5-4197-964f-6405fb1cc1c3-cert podName:3028ad1d-cba5-4197-964f-6405fb1cc1c3 nodeName:}" failed. No retries permitted until 2025-12-02 18:34:00.440577697 +0000 UTC m=+1150.130196578 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3028ad1d-cba5-4197-964f-6405fb1cc1c3-cert") pod "infra-operator-controller-manager-57548d458d-cmzpg" (UID: "3028ad1d-cba5-4197-964f-6405fb1cc1c3") : secret "infra-operator-webhook-server-cert" not found Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.506947 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-fhrfj"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.514362 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f58x6" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.520117 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt55q\" (UniqueName: \"kubernetes.io/projected/095b515f-0144-4f7b-b3ab-9ca3440921db-kube-api-access-zt55q\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk\" (UID: \"095b515f-0144-4f7b-b3ab-9ca3440921db\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.521426 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrfgm\" (UniqueName: \"kubernetes.io/projected/56f1dd36-8cc9-4026-8976-8816940217a4-kube-api-access-mrfgm\") pod \"ovn-operator-controller-manager-b6456fdb6-w99w9\" (UID: \"56f1dd36-8cc9-4026-8976-8816940217a4\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-w99w9" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.521877 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjkzs\" (UniqueName: \"kubernetes.io/projected/84d9833f-9760-47ea-ba43-f385b24a3e57-kube-api-access-qjkzs\") pod \"nova-operator-controller-manager-697bc559fc-xzpbx\" (UID: \"84d9833f-9760-47ea-ba43-f385b24a3e57\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xzpbx" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.522190 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdpj4\" (UniqueName: \"kubernetes.io/projected/46cfeb81-3e48-4f16-ae55-aabe49810afb-kube-api-access-qdpj4\") pod \"octavia-operator-controller-manager-998648c74-9stj8\" (UID: \"46cfeb81-3e48-4f16-ae55-aabe49810afb\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-9stj8" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.548354 4878 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmsh4\" (UniqueName: \"kubernetes.io/projected/5a683e44-012a-41ec-98db-36bcd5646959-kube-api-access-mmsh4\") pod \"swift-operator-controller-manager-5f8c65bbfc-j4dz5\" (UID: \"5a683e44-012a-41ec-98db-36bcd5646959\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j4dz5" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.548452 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxxf4\" (UniqueName: \"kubernetes.io/projected/572eebdd-2dc7-4327-a339-4f92e3971d59-kube-api-access-rxxf4\") pod \"test-operator-controller-manager-5854674fcc-fhrfj\" (UID: \"572eebdd-2dc7-4327-a339-4f92e3971d59\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-fhrfj" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.548492 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn7jh\" (UniqueName: \"kubernetes.io/projected/98571e6d-dae6-4e83-8d08-e44e8609188f-kube-api-access-hn7jh\") pod \"placement-operator-controller-manager-78f8948974-6qznx\" (UID: \"98571e6d-dae6-4e83-8d08-e44e8609188f\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-6qznx" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.548522 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzphd\" (UniqueName: \"kubernetes.io/projected/44a363d4-d9a3-44df-8a8f-902cb14a0443-kube-api-access-hzphd\") pod \"telemetry-operator-controller-manager-546f978c55-mvlfw\" (UID: \"44a363d4-d9a3-44df-8a8f-902cb14a0443\") " pod="openstack-operators/telemetry-operator-controller-manager-546f978c55-mvlfw" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.550628 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-546f978c55-mvlfw"] Dec 02 18:33:59 crc 
kubenswrapper[4878]: I1202 18:33:59.562401 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-hndnh"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.572986 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9stj8" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.576420 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-hndnh" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.579608 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-2qcjt" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.580217 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-hndnh"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.603083 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmsh4\" (UniqueName: \"kubernetes.io/projected/5a683e44-012a-41ec-98db-36bcd5646959-kube-api-access-mmsh4\") pod \"swift-operator-controller-manager-5f8c65bbfc-j4dz5\" (UID: \"5a683e44-012a-41ec-98db-36bcd5646959\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j4dz5" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.603567 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xzpbx" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.654609 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpz9f\" (UniqueName: \"kubernetes.io/projected/bfa1dc17-f042-46f8-8bc8-3f8d9e135073-kube-api-access-hpz9f\") pod \"watcher-operator-controller-manager-769dc69bc-hndnh\" (UID: \"bfa1dc17-f042-46f8-8bc8-3f8d9e135073\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-hndnh" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.658028 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.660517 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn7jh\" (UniqueName: \"kubernetes.io/projected/98571e6d-dae6-4e83-8d08-e44e8609188f-kube-api-access-hn7jh\") pod \"placement-operator-controller-manager-78f8948974-6qznx\" (UID: \"98571e6d-dae6-4e83-8d08-e44e8609188f\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-6qznx" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.686785 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.688370 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxxf4\" (UniqueName: \"kubernetes.io/projected/572eebdd-2dc7-4327-a339-4f92e3971d59-kube-api-access-rxxf4\") pod \"test-operator-controller-manager-5854674fcc-fhrfj\" (UID: \"572eebdd-2dc7-4327-a339-4f92e3971d59\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-fhrfj" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.698748 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzphd\" (UniqueName: \"kubernetes.io/projected/44a363d4-d9a3-44df-8a8f-902cb14a0443-kube-api-access-hzphd\") pod \"telemetry-operator-controller-manager-546f978c55-mvlfw\" (UID: \"44a363d4-d9a3-44df-8a8f-902cb14a0443\") " pod="openstack-operators/telemetry-operator-controller-manager-546f978c55-mvlfw" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.699864 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-w99w9" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.704894 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.705105 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-tdb2k" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.705152 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.739919 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-6qznx" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.747487 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.758594 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-webhook-certs\") pod \"openstack-operator-controller-manager-bb64db99c-xtzmk\" (UID: \"e18472ba-dc06-4e34-99a7-974d9af72c0a\") " pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.758720 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zsb2\" (UniqueName: \"kubernetes.io/projected/e18472ba-dc06-4e34-99a7-974d9af72c0a-kube-api-access-5zsb2\") pod \"openstack-operator-controller-manager-bb64db99c-xtzmk\" (UID: \"e18472ba-dc06-4e34-99a7-974d9af72c0a\") " pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.758818 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpz9f\" (UniqueName: \"kubernetes.io/projected/bfa1dc17-f042-46f8-8bc8-3f8d9e135073-kube-api-access-hpz9f\") pod \"watcher-operator-controller-manager-769dc69bc-hndnh\" (UID: \"bfa1dc17-f042-46f8-8bc8-3f8d9e135073\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-hndnh" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.758865 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-metrics-certs\") pod 
\"openstack-operator-controller-manager-bb64db99c-xtzmk\" (UID: \"e18472ba-dc06-4e34-99a7-974d9af72c0a\") " pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.759281 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j4dz5" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.802087 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fhrfj" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.821100 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpz9f\" (UniqueName: \"kubernetes.io/projected/bfa1dc17-f042-46f8-8bc8-3f8d9e135073-kube-api-access-hpz9f\") pod \"watcher-operator-controller-manager-769dc69bc-hndnh\" (UID: \"bfa1dc17-f042-46f8-8bc8-3f8d9e135073\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-hndnh" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.875845 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-metrics-certs\") pod \"openstack-operator-controller-manager-bb64db99c-xtzmk\" (UID: \"e18472ba-dc06-4e34-99a7-974d9af72c0a\") " pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.875924 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-webhook-certs\") pod \"openstack-operator-controller-manager-bb64db99c-xtzmk\" (UID: \"e18472ba-dc06-4e34-99a7-974d9af72c0a\") " pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:33:59 crc kubenswrapper[4878]: 
I1202 18:33:59.875993 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zsb2\" (UniqueName: \"kubernetes.io/projected/e18472ba-dc06-4e34-99a7-974d9af72c0a-kube-api-access-5zsb2\") pod \"openstack-operator-controller-manager-bb64db99c-xtzmk\" (UID: \"e18472ba-dc06-4e34-99a7-974d9af72c0a\") " pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:33:59 crc kubenswrapper[4878]: E1202 18:33:59.876473 4878 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 18:33:59 crc kubenswrapper[4878]: E1202 18:33:59.876539 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-metrics-certs podName:e18472ba-dc06-4e34-99a7-974d9af72c0a nodeName:}" failed. No retries permitted until 2025-12-02 18:34:00.376519279 +0000 UTC m=+1150.066138160 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-metrics-certs") pod "openstack-operator-controller-manager-bb64db99c-xtzmk" (UID: "e18472ba-dc06-4e34-99a7-974d9af72c0a") : secret "metrics-server-cert" not found Dec 02 18:33:59 crc kubenswrapper[4878]: E1202 18:33:59.876778 4878 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 18:33:59 crc kubenswrapper[4878]: E1202 18:33:59.876810 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-webhook-certs podName:e18472ba-dc06-4e34-99a7-974d9af72c0a nodeName:}" failed. No retries permitted until 2025-12-02 18:34:00.376802768 +0000 UTC m=+1150.066421639 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-webhook-certs") pod "openstack-operator-controller-manager-bb64db99c-xtzmk" (UID: "e18472ba-dc06-4e34-99a7-974d9af72c0a") : secret "webhook-server-cert" not found Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.893161 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-546f978c55-mvlfw" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.896909 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-hndnh" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.908139 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zsb2\" (UniqueName: \"kubernetes.io/projected/e18472ba-dc06-4e34-99a7-974d9af72c0a-kube-api-access-5zsb2\") pod \"openstack-operator-controller-manager-bb64db99c-xtzmk\" (UID: \"e18472ba-dc06-4e34-99a7-974d9af72c0a\") " pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.949313 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-24n6c"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.950962 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-24n6c" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.956088 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-zmn4s" Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.968879 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-24n6c"] Dec 02 18:33:59 crc kubenswrapper[4878]: I1202 18:33:59.978925 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/095b515f-0144-4f7b-b3ab-9ca3440921db-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk\" (UID: \"095b515f-0144-4f7b-b3ab-9ca3440921db\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" Dec 02 18:33:59 crc kubenswrapper[4878]: E1202 18:33:59.979100 4878 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 18:33:59 crc kubenswrapper[4878]: E1202 18:33:59.979156 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/095b515f-0144-4f7b-b3ab-9ca3440921db-cert podName:095b515f-0144-4f7b-b3ab-9ca3440921db nodeName:}" failed. No retries permitted until 2025-12-02 18:34:00.979140233 +0000 UTC m=+1150.668759114 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/095b515f-0144-4f7b-b3ab-9ca3440921db-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" (UID: "095b515f-0144-4f7b-b3ab-9ca3440921db") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 18:34:00 crc kubenswrapper[4878]: I1202 18:33:59.998468 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgj5c"] Dec 02 18:34:00 crc kubenswrapper[4878]: I1202 18:34:00.110676 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4rt8\" (UniqueName: \"kubernetes.io/projected/5691d664-31ad-44c8-ab51-11bcf8f9d4c2-kube-api-access-w4rt8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-24n6c\" (UID: \"5691d664-31ad-44c8-ab51-11bcf8f9d4c2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-24n6c" Dec 02 18:34:00 crc kubenswrapper[4878]: I1202 18:34:00.138791 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-ds92l"] Dec 02 18:34:00 crc kubenswrapper[4878]: I1202 18:34:00.159331 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-kwqck"] Dec 02 18:34:00 crc kubenswrapper[4878]: I1202 18:34:00.214187 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4rt8\" (UniqueName: \"kubernetes.io/projected/5691d664-31ad-44c8-ab51-11bcf8f9d4c2-kube-api-access-w4rt8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-24n6c\" (UID: \"5691d664-31ad-44c8-ab51-11bcf8f9d4c2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-24n6c" Dec 02 18:34:00 crc kubenswrapper[4878]: I1202 18:34:00.249015 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-w4rt8\" (UniqueName: \"kubernetes.io/projected/5691d664-31ad-44c8-ab51-11bcf8f9d4c2-kube-api-access-w4rt8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-24n6c\" (UID: \"5691d664-31ad-44c8-ab51-11bcf8f9d4c2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-24n6c" Dec 02 18:34:00 crc kubenswrapper[4878]: I1202 18:34:00.281873 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-24n6c" Dec 02 18:34:00 crc kubenswrapper[4878]: I1202 18:34:00.415928 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-webhook-certs\") pod \"openstack-operator-controller-manager-bb64db99c-xtzmk\" (UID: \"e18472ba-dc06-4e34-99a7-974d9af72c0a\") " pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:34:00 crc kubenswrapper[4878]: I1202 18:34:00.416072 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-metrics-certs\") pod \"openstack-operator-controller-manager-bb64db99c-xtzmk\" (UID: \"e18472ba-dc06-4e34-99a7-974d9af72c0a\") " pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:34:00 crc kubenswrapper[4878]: E1202 18:34:00.416227 4878 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 18:34:00 crc kubenswrapper[4878]: E1202 18:34:00.416287 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-metrics-certs podName:e18472ba-dc06-4e34-99a7-974d9af72c0a nodeName:}" failed. No retries permitted until 2025-12-02 18:34:01.416271363 +0000 UTC m=+1151.105890244 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-metrics-certs") pod "openstack-operator-controller-manager-bb64db99c-xtzmk" (UID: "e18472ba-dc06-4e34-99a7-974d9af72c0a") : secret "metrics-server-cert" not found Dec 02 18:34:00 crc kubenswrapper[4878]: E1202 18:34:00.416698 4878 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 18:34:00 crc kubenswrapper[4878]: E1202 18:34:00.416747 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-webhook-certs podName:e18472ba-dc06-4e34-99a7-974d9af72c0a nodeName:}" failed. No retries permitted until 2025-12-02 18:34:01.416737537 +0000 UTC m=+1151.106356418 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-webhook-certs") pod "openstack-operator-controller-manager-bb64db99c-xtzmk" (UID: "e18472ba-dc06-4e34-99a7-974d9af72c0a") : secret "webhook-server-cert" not found Dec 02 18:34:00 crc kubenswrapper[4878]: I1202 18:34:00.519348 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3028ad1d-cba5-4197-964f-6405fb1cc1c3-cert\") pod \"infra-operator-controller-manager-57548d458d-cmzpg\" (UID: \"3028ad1d-cba5-4197-964f-6405fb1cc1c3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg" Dec 02 18:34:00 crc kubenswrapper[4878]: E1202 18:34:00.519549 4878 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 18:34:00 crc kubenswrapper[4878]: E1202 18:34:00.519597 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3028ad1d-cba5-4197-964f-6405fb1cc1c3-cert 
podName:3028ad1d-cba5-4197-964f-6405fb1cc1c3 nodeName:}" failed. No retries permitted until 2025-12-02 18:34:02.519582758 +0000 UTC m=+1152.209201639 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3028ad1d-cba5-4197-964f-6405fb1cc1c3-cert") pod "infra-operator-controller-manager-57548d458d-cmzpg" (UID: "3028ad1d-cba5-4197-964f-6405fb1cc1c3") : secret "infra-operator-webhook-server-cert" not found Dec 02 18:34:00 crc kubenswrapper[4878]: I1202 18:34:00.841909 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-4gsxd"] Dec 02 18:34:00 crc kubenswrapper[4878]: I1202 18:34:00.872448 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-fx64f"] Dec 02 18:34:00 crc kubenswrapper[4878]: I1202 18:34:00.950441 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgj5c" event={"ID":"81eba8a0-84f6-4456-9484-dfa84dda8e10","Type":"ContainerStarted","Data":"32616ea74b9078ecbd3b906a150ba321912dbdf0f5ad9a56a88c5905aa45a46f"} Dec 02 18:34:00 crc kubenswrapper[4878]: I1202 18:34:00.950476 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4gsxd" event={"ID":"fc38a188-1850-41eb-a958-fd1fe01270c7","Type":"ContainerStarted","Data":"22e8a5893363ef414078b46cc6ce31621e80e399f986bee5dd901cd5b24a958a"} Dec 02 18:34:00 crc kubenswrapper[4878]: I1202 18:34:00.950490 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-kwqck" event={"ID":"77e2f2de-8d3f-437b-8f32-7b76ea70ccda","Type":"ContainerStarted","Data":"10d82827ee6a154cc111dc932ea6fd894ab7a62f9d8ea50543e7808fd8bb531a"} Dec 02 18:34:00 crc kubenswrapper[4878]: I1202 18:34:00.950499 4878 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-fx64f" event={"ID":"a128e6b1-604f-4d2d-9b31-1567ade115df","Type":"ContainerStarted","Data":"d29c24a59c4f1a0c8034f482ccc67f8a19771a4e97321d3c523446324ae1ce21"} Dec 02 18:34:00 crc kubenswrapper[4878]: I1202 18:34:00.950509 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-ds92l" event={"ID":"0e516f0b-2b62-4d60-b1bd-07404ffcdea9","Type":"ContainerStarted","Data":"3790f08546f26acefb40a7d022d2cd08057b846bd26a8675283b16d2ce12c616"} Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.029940 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/095b515f-0144-4f7b-b3ab-9ca3440921db-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk\" (UID: \"095b515f-0144-4f7b-b3ab-9ca3440921db\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" Dec 02 18:34:01 crc kubenswrapper[4878]: E1202 18:34:01.030918 4878 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 18:34:01 crc kubenswrapper[4878]: E1202 18:34:01.030988 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/095b515f-0144-4f7b-b3ab-9ca3440921db-cert podName:095b515f-0144-4f7b-b3ab-9ca3440921db nodeName:}" failed. No retries permitted until 2025-12-02 18:34:03.030969671 +0000 UTC m=+1152.720588552 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/095b515f-0144-4f7b-b3ab-9ca3440921db-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" (UID: "095b515f-0144-4f7b-b3ab-9ca3440921db") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.293930 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-4c7fq"] Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.351686 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vc67f"] Dec 02 18:34:01 crc kubenswrapper[4878]: W1202 18:34:01.384293 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod492dff42_bb87_4c30_8f81_02406308904c.slice/crio-1922a7bb9657b48272ed47363a8a88d692d2f26b96a2147316a1cc8ffd254ea3 WatchSource:0}: Error finding container 1922a7bb9657b48272ed47363a8a88d692d2f26b96a2147316a1cc8ffd254ea3: Status 404 returned error can't find the container with id 1922a7bb9657b48272ed47363a8a88d692d2f26b96a2147316a1cc8ffd254ea3 Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.394765 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-952z9"] Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.409219 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f58x6"] Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.422161 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-rqzqn"] Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.463286 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-metrics-certs\") pod \"openstack-operator-controller-manager-bb64db99c-xtzmk\" (UID: \"e18472ba-dc06-4e34-99a7-974d9af72c0a\") " pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.463386 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-webhook-certs\") pod \"openstack-operator-controller-manager-bb64db99c-xtzmk\" (UID: \"e18472ba-dc06-4e34-99a7-974d9af72c0a\") " pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:34:01 crc kubenswrapper[4878]: E1202 18:34:01.463540 4878 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 18:34:01 crc kubenswrapper[4878]: E1202 18:34:01.463609 4878 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 18:34:01 crc kubenswrapper[4878]: E1202 18:34:01.463619 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-metrics-certs podName:e18472ba-dc06-4e34-99a7-974d9af72c0a nodeName:}" failed. No retries permitted until 2025-12-02 18:34:03.46359973 +0000 UTC m=+1153.153218611 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-metrics-certs") pod "openstack-operator-controller-manager-bb64db99c-xtzmk" (UID: "e18472ba-dc06-4e34-99a7-974d9af72c0a") : secret "metrics-server-cert" not found Dec 02 18:34:01 crc kubenswrapper[4878]: E1202 18:34:01.463662 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-webhook-certs podName:e18472ba-dc06-4e34-99a7-974d9af72c0a nodeName:}" failed. No retries permitted until 2025-12-02 18:34:03.463646752 +0000 UTC m=+1153.153265633 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-webhook-certs") pod "openstack-operator-controller-manager-bb64db99c-xtzmk" (UID: "e18472ba-dc06-4e34-99a7-974d9af72c0a") : secret "webhook-server-cert" not found Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.675461 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-kmkzd"] Dec 02 18:34:01 crc kubenswrapper[4878]: W1202 18:34:01.708251 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46cfeb81_3e48_4f16_ae55_aabe49810afb.slice/crio-6b3df0b9ad9564871473d24e01e8c3dde3b9fe9f034106ff00f48a30da56e71d WatchSource:0}: Error finding container 6b3df0b9ad9564871473d24e01e8c3dde3b9fe9f034106ff00f48a30da56e71d: Status 404 returned error can't find the container with id 6b3df0b9ad9564871473d24e01e8c3dde3b9fe9f034106ff00f48a30da56e71d Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.712313 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-9stj8"] Dec 02 18:34:01 crc kubenswrapper[4878]: W1202 18:34:01.715368 4878 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84d9833f_9760_47ea_ba43_f385b24a3e57.slice/crio-ede3abab0cd9805754eea20074147b37a33f067041e27825a740bf5758d661dc WatchSource:0}: Error finding container ede3abab0cd9805754eea20074147b37a33f067041e27825a740bf5758d661dc: Status 404 returned error can't find the container with id ede3abab0cd9805754eea20074147b37a33f067041e27825a740bf5758d661dc Dec 02 18:34:01 crc kubenswrapper[4878]: W1202 18:34:01.731067 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod572eebdd_2dc7_4327_a339_4f92e3971d59.slice/crio-b9c4246923fec56da0d5e2c28279daa6a36f2ed8b61f58f60faa3941f4029ad4 WatchSource:0}: Error finding container b9c4246923fec56da0d5e2c28279daa6a36f2ed8b61f58f60faa3941f4029ad4: Status 404 returned error can't find the container with id b9c4246923fec56da0d5e2c28279daa6a36f2ed8b61f58f60faa3941f4029ad4 Dec 02 18:34:01 crc kubenswrapper[4878]: W1202 18:34:01.734428 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a683e44_012a_41ec_98db_36bcd5646959.slice/crio-dd3c6986528cf5c9540c81a2f17d3b5bf1bd7feb486e0d5ff880dad16f5d21ef WatchSource:0}: Error finding container dd3c6986528cf5c9540c81a2f17d3b5bf1bd7feb486e0d5ff880dad16f5d21ef: Status 404 returned error can't find the container with id dd3c6986528cf5c9540c81a2f17d3b5bf1bd7feb486e0d5ff880dad16f5d21ef Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.741607 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-fhrfj"] Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.756146 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-xzpbx"] Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.765012 4878 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j4dz5"] Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.960501 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9stj8" event={"ID":"46cfeb81-3e48-4f16-ae55-aabe49810afb","Type":"ContainerStarted","Data":"6b3df0b9ad9564871473d24e01e8c3dde3b9fe9f034106ff00f48a30da56e71d"} Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.962822 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j4dz5" event={"ID":"5a683e44-012a-41ec-98db-36bcd5646959","Type":"ContainerStarted","Data":"dd3c6986528cf5c9540c81a2f17d3b5bf1bd7feb486e0d5ff880dad16f5d21ef"} Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.964955 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f58x6" event={"ID":"ed599489-c1f6-440f-aaf3-339f424cbcdf","Type":"ContainerStarted","Data":"767125a1c38fe85bbc0ee03850e9972acc0d3fd25d0f12a05c356e8bcf614d22"} Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.966824 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4c7fq" event={"ID":"5cee25c6-1e94-400c-afd8-c1e75f31e619","Type":"ContainerStarted","Data":"668aa8596bafd5a435366284e5b32cf9c90518d982731f3b3c5a00ff6261ccfa"} Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.968539 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vc67f" event={"ID":"d7143587-e348-48b5-9164-a4d477b4a259","Type":"ContainerStarted","Data":"cd0122b848e929f67718105e4f419d7b2f22a9df29beedfdad9df149083d26a6"} Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.969793 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-fhrfj" event={"ID":"572eebdd-2dc7-4327-a339-4f92e3971d59","Type":"ContainerStarted","Data":"b9c4246923fec56da0d5e2c28279daa6a36f2ed8b61f58f60faa3941f4029ad4"} Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.971458 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-kmkzd" event={"ID":"e51c89b2-70ae-4c1d-81b0-8aba6e211dd0","Type":"ContainerStarted","Data":"000119f7dfe92edd1cf83f54ed51cd392b0b5a1a790598db908c7d2ef97d36c0"} Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.973302 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-952z9" event={"ID":"492dff42-bb87-4c30-8f81-02406308904c","Type":"ContainerStarted","Data":"1922a7bb9657b48272ed47363a8a88d692d2f26b96a2147316a1cc8ffd254ea3"} Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.974421 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xzpbx" event={"ID":"84d9833f-9760-47ea-ba43-f385b24a3e57","Type":"ContainerStarted","Data":"ede3abab0cd9805754eea20074147b37a33f067041e27825a740bf5758d661dc"} Dec 02 18:34:01 crc kubenswrapper[4878]: I1202 18:34:01.976731 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rqzqn" event={"ID":"7dfef70a-0da9-4ad6-9fda-1cac674c9ddb","Type":"ContainerStarted","Data":"a1201c4a96929c7c06383ef2edca4507a5f9262e0302d1c7663bc7bc499cd56d"} Dec 02 18:34:02 crc kubenswrapper[4878]: I1202 18:34:02.031067 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-6qznx"] Dec 02 18:34:02 crc kubenswrapper[4878]: I1202 18:34:02.044313 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-24n6c"] Dec 02 18:34:02 crc kubenswrapper[4878]: I1202 18:34:02.055392 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-w99w9"] Dec 02 18:34:02 crc kubenswrapper[4878]: I1202 18:34:02.064912 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-546f978c55-mvlfw"] Dec 02 18:34:02 crc kubenswrapper[4878]: I1202 18:34:02.069217 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-hndnh"] Dec 02 18:34:02 crc kubenswrapper[4878]: W1202 18:34:02.123793 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44a363d4_d9a3_44df_8a8f_902cb14a0443.slice/crio-52e61e8c6cc5c1160d3332a9975fb4b493b7c91e4186bd6d619eb1bff5098cea WatchSource:0}: Error finding container 52e61e8c6cc5c1160d3332a9975fb4b493b7c91e4186bd6d619eb1bff5098cea: Status 404 returned error can't find the container with id 52e61e8c6cc5c1160d3332a9975fb4b493b7c91e4186bd6d619eb1bff5098cea Dec 02 18:34:02 crc kubenswrapper[4878]: E1202 18:34:02.129928 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.193:5001/openstack-k8s-operators/telemetry-operator:bfa7e58a045e67b62f0c6dc59ae775e3e34147ba,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: 
{{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hzphd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-546f978c55-mvlfw_openstack-operators(44a363d4-d9a3-44df-8a8f-902cb14a0443): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 18:34:02 crc kubenswrapper[4878]: E1202 18:34:02.132471 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hzphd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-546f978c55-mvlfw_openstack-operators(44a363d4-d9a3-44df-8a8f-902cb14a0443): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 18:34:02 crc kubenswrapper[4878]: E1202 18:34:02.133744 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-546f978c55-mvlfw" 
podUID="44a363d4-d9a3-44df-8a8f-902cb14a0443" Dec 02 18:34:02 crc kubenswrapper[4878]: I1202 18:34:02.602454 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3028ad1d-cba5-4197-964f-6405fb1cc1c3-cert\") pod \"infra-operator-controller-manager-57548d458d-cmzpg\" (UID: \"3028ad1d-cba5-4197-964f-6405fb1cc1c3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg" Dec 02 18:34:02 crc kubenswrapper[4878]: E1202 18:34:02.602940 4878 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 18:34:02 crc kubenswrapper[4878]: E1202 18:34:02.602988 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3028ad1d-cba5-4197-964f-6405fb1cc1c3-cert podName:3028ad1d-cba5-4197-964f-6405fb1cc1c3 nodeName:}" failed. No retries permitted until 2025-12-02 18:34:06.602971343 +0000 UTC m=+1156.292590224 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3028ad1d-cba5-4197-964f-6405fb1cc1c3-cert") pod "infra-operator-controller-manager-57548d458d-cmzpg" (UID: "3028ad1d-cba5-4197-964f-6405fb1cc1c3") : secret "infra-operator-webhook-server-cert" not found Dec 02 18:34:03 crc kubenswrapper[4878]: I1202 18:34:03.003319 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-24n6c" event={"ID":"5691d664-31ad-44c8-ab51-11bcf8f9d4c2","Type":"ContainerStarted","Data":"eaba79ba1d87990a03cccd9ecb1df2aa8039b73c135a39eaaef09b8c308263c3"} Dec 02 18:34:03 crc kubenswrapper[4878]: I1202 18:34:03.007095 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-hndnh" event={"ID":"bfa1dc17-f042-46f8-8bc8-3f8d9e135073","Type":"ContainerStarted","Data":"2fc925a86746984e7c4a4f93cdf0cd38ce99186acf35453d8d297dfb627b53af"} Dec 02 18:34:03 crc kubenswrapper[4878]: I1202 18:34:03.010058 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-w99w9" event={"ID":"56f1dd36-8cc9-4026-8976-8816940217a4","Type":"ContainerStarted","Data":"8d3bf2466bd9dd664da99fd958f7dbe897202e7995b19e5c40ef2b0aef730075"} Dec 02 18:34:03 crc kubenswrapper[4878]: I1202 18:34:03.015112 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-546f978c55-mvlfw" event={"ID":"44a363d4-d9a3-44df-8a8f-902cb14a0443","Type":"ContainerStarted","Data":"52e61e8c6cc5c1160d3332a9975fb4b493b7c91e4186bd6d619eb1bff5098cea"} Dec 02 18:34:03 crc kubenswrapper[4878]: E1202 18:34:03.020181 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"38.102.83.193:5001/openstack-k8s-operators/telemetry-operator:bfa7e58a045e67b62f0c6dc59ae775e3e34147ba\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-546f978c55-mvlfw" podUID="44a363d4-d9a3-44df-8a8f-902cb14a0443" Dec 02 18:34:03 crc kubenswrapper[4878]: I1202 18:34:03.025223 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-6qznx" event={"ID":"98571e6d-dae6-4e83-8d08-e44e8609188f","Type":"ContainerStarted","Data":"3d4813fb759c8e13e934535a78a8de0812aaa405a1e631272915835027a432eb"} Dec 02 18:34:03 crc kubenswrapper[4878]: I1202 18:34:03.119809 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/095b515f-0144-4f7b-b3ab-9ca3440921db-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk\" (UID: \"095b515f-0144-4f7b-b3ab-9ca3440921db\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" Dec 02 18:34:03 crc kubenswrapper[4878]: E1202 18:34:03.120251 4878 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 18:34:03 crc kubenswrapper[4878]: E1202 18:34:03.120324 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/095b515f-0144-4f7b-b3ab-9ca3440921db-cert podName:095b515f-0144-4f7b-b3ab-9ca3440921db nodeName:}" failed. No retries permitted until 2025-12-02 18:34:07.120299541 +0000 UTC m=+1156.809918422 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/095b515f-0144-4f7b-b3ab-9ca3440921db-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" (UID: "095b515f-0144-4f7b-b3ab-9ca3440921db") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 18:34:03 crc kubenswrapper[4878]: I1202 18:34:03.477679 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-metrics-certs\") pod \"openstack-operator-controller-manager-bb64db99c-xtzmk\" (UID: \"e18472ba-dc06-4e34-99a7-974d9af72c0a\") " pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:34:03 crc kubenswrapper[4878]: I1202 18:34:03.477768 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-webhook-certs\") pod \"openstack-operator-controller-manager-bb64db99c-xtzmk\" (UID: \"e18472ba-dc06-4e34-99a7-974d9af72c0a\") " pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:34:03 crc kubenswrapper[4878]: E1202 18:34:03.477914 4878 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 18:34:03 crc kubenswrapper[4878]: E1202 18:34:03.477958 4878 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 18:34:03 crc kubenswrapper[4878]: E1202 18:34:03.478042 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-metrics-certs podName:e18472ba-dc06-4e34-99a7-974d9af72c0a nodeName:}" failed. No retries permitted until 2025-12-02 18:34:07.478013367 +0000 UTC m=+1157.167632248 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-metrics-certs") pod "openstack-operator-controller-manager-bb64db99c-xtzmk" (UID: "e18472ba-dc06-4e34-99a7-974d9af72c0a") : secret "metrics-server-cert" not found Dec 02 18:34:03 crc kubenswrapper[4878]: E1202 18:34:03.478065 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-webhook-certs podName:e18472ba-dc06-4e34-99a7-974d9af72c0a nodeName:}" failed. No retries permitted until 2025-12-02 18:34:07.478057529 +0000 UTC m=+1157.167676410 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-webhook-certs") pod "openstack-operator-controller-manager-bb64db99c-xtzmk" (UID: "e18472ba-dc06-4e34-99a7-974d9af72c0a") : secret "webhook-server-cert" not found Dec 02 18:34:04 crc kubenswrapper[4878]: E1202 18:34:04.106207 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.193:5001/openstack-k8s-operators/telemetry-operator:bfa7e58a045e67b62f0c6dc59ae775e3e34147ba\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-546f978c55-mvlfw" podUID="44a363d4-d9a3-44df-8a8f-902cb14a0443" Dec 02 18:34:06 crc kubenswrapper[4878]: I1202 18:34:06.691451 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3028ad1d-cba5-4197-964f-6405fb1cc1c3-cert\") pod \"infra-operator-controller-manager-57548d458d-cmzpg\" (UID: \"3028ad1d-cba5-4197-964f-6405fb1cc1c3\") " 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg" Dec 02 18:34:06 crc kubenswrapper[4878]: E1202 18:34:06.691520 4878 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 18:34:06 crc kubenswrapper[4878]: E1202 18:34:06.692409 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3028ad1d-cba5-4197-964f-6405fb1cc1c3-cert podName:3028ad1d-cba5-4197-964f-6405fb1cc1c3 nodeName:}" failed. No retries permitted until 2025-12-02 18:34:14.692364386 +0000 UTC m=+1164.381983267 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3028ad1d-cba5-4197-964f-6405fb1cc1c3-cert") pod "infra-operator-controller-manager-57548d458d-cmzpg" (UID: "3028ad1d-cba5-4197-964f-6405fb1cc1c3") : secret "infra-operator-webhook-server-cert" not found Dec 02 18:34:07 crc kubenswrapper[4878]: I1202 18:34:07.201884 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/095b515f-0144-4f7b-b3ab-9ca3440921db-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk\" (UID: \"095b515f-0144-4f7b-b3ab-9ca3440921db\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" Dec 02 18:34:07 crc kubenswrapper[4878]: E1202 18:34:07.202167 4878 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 18:34:07 crc kubenswrapper[4878]: E1202 18:34:07.202304 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/095b515f-0144-4f7b-b3ab-9ca3440921db-cert podName:095b515f-0144-4f7b-b3ab-9ca3440921db nodeName:}" failed. No retries permitted until 2025-12-02 18:34:15.202276393 +0000 UTC m=+1164.891895284 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/095b515f-0144-4f7b-b3ab-9ca3440921db-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" (UID: "095b515f-0144-4f7b-b3ab-9ca3440921db") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 18:34:07 crc kubenswrapper[4878]: I1202 18:34:07.515022 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-metrics-certs\") pod \"openstack-operator-controller-manager-bb64db99c-xtzmk\" (UID: \"e18472ba-dc06-4e34-99a7-974d9af72c0a\") " pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:34:07 crc kubenswrapper[4878]: I1202 18:34:07.515135 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-webhook-certs\") pod \"openstack-operator-controller-manager-bb64db99c-xtzmk\" (UID: \"e18472ba-dc06-4e34-99a7-974d9af72c0a\") " pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:34:07 crc kubenswrapper[4878]: E1202 18:34:07.515247 4878 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 18:34:07 crc kubenswrapper[4878]: E1202 18:34:07.515324 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-metrics-certs podName:e18472ba-dc06-4e34-99a7-974d9af72c0a nodeName:}" failed. No retries permitted until 2025-12-02 18:34:15.515308603 +0000 UTC m=+1165.204927484 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-metrics-certs") pod "openstack-operator-controller-manager-bb64db99c-xtzmk" (UID: "e18472ba-dc06-4e34-99a7-974d9af72c0a") : secret "metrics-server-cert" not found Dec 02 18:34:07 crc kubenswrapper[4878]: E1202 18:34:07.515423 4878 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 18:34:07 crc kubenswrapper[4878]: E1202 18:34:07.515525 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-webhook-certs podName:e18472ba-dc06-4e34-99a7-974d9af72c0a nodeName:}" failed. No retries permitted until 2025-12-02 18:34:15.515502199 +0000 UTC m=+1165.205121080 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-webhook-certs") pod "openstack-operator-controller-manager-bb64db99c-xtzmk" (UID: "e18472ba-dc06-4e34-99a7-974d9af72c0a") : secret "webhook-server-cert" not found Dec 02 18:34:14 crc kubenswrapper[4878]: I1202 18:34:14.789947 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3028ad1d-cba5-4197-964f-6405fb1cc1c3-cert\") pod \"infra-operator-controller-manager-57548d458d-cmzpg\" (UID: \"3028ad1d-cba5-4197-964f-6405fb1cc1c3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg" Dec 02 18:34:14 crc kubenswrapper[4878]: I1202 18:34:14.797507 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3028ad1d-cba5-4197-964f-6405fb1cc1c3-cert\") pod \"infra-operator-controller-manager-57548d458d-cmzpg\" (UID: \"3028ad1d-cba5-4197-964f-6405fb1cc1c3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg" Dec 02 18:34:15 crc 
kubenswrapper[4878]: I1202 18:34:15.082556 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg" Dec 02 18:34:15 crc kubenswrapper[4878]: I1202 18:34:15.302851 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/095b515f-0144-4f7b-b3ab-9ca3440921db-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk\" (UID: \"095b515f-0144-4f7b-b3ab-9ca3440921db\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" Dec 02 18:34:15 crc kubenswrapper[4878]: I1202 18:34:15.306976 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/095b515f-0144-4f7b-b3ab-9ca3440921db-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk\" (UID: \"095b515f-0144-4f7b-b3ab-9ca3440921db\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" Dec 02 18:34:15 crc kubenswrapper[4878]: I1202 18:34:15.550014 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" Dec 02 18:34:15 crc kubenswrapper[4878]: I1202 18:34:15.606480 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-metrics-certs\") pod \"openstack-operator-controller-manager-bb64db99c-xtzmk\" (UID: \"e18472ba-dc06-4e34-99a7-974d9af72c0a\") " pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:34:15 crc kubenswrapper[4878]: I1202 18:34:15.606596 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-webhook-certs\") pod \"openstack-operator-controller-manager-bb64db99c-xtzmk\" (UID: \"e18472ba-dc06-4e34-99a7-974d9af72c0a\") " pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:34:15 crc kubenswrapper[4878]: I1202 18:34:15.612560 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-webhook-certs\") pod \"openstack-operator-controller-manager-bb64db99c-xtzmk\" (UID: \"e18472ba-dc06-4e34-99a7-974d9af72c0a\") " pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:34:15 crc kubenswrapper[4878]: I1202 18:34:15.613602 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e18472ba-dc06-4e34-99a7-974d9af72c0a-metrics-certs\") pod \"openstack-operator-controller-manager-bb64db99c-xtzmk\" (UID: \"e18472ba-dc06-4e34-99a7-974d9af72c0a\") " pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:34:15 crc kubenswrapper[4878]: I1202 18:34:15.747321 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:34:19 crc kubenswrapper[4878]: E1202 18:34:19.739003 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530" Dec 02 18:34:19 crc kubenswrapper[4878]: E1202 18:34:19.739711 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gwp24,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-rqzqn_openstack-operators(7dfef70a-0da9-4ad6-9fda-1cac674c9ddb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 18:34:24 crc kubenswrapper[4878]: E1202 18:34:24.832302 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 02 18:34:24 crc kubenswrapper[4878]: E1202 18:34:24.833228 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hn7jh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-6qznx_openstack-operators(98571e6d-dae6-4e83-8d08-e44e8609188f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 18:34:25 crc kubenswrapper[4878]: E1202 18:34:25.442092 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 02 18:34:25 crc kubenswrapper[4878]: E1202 18:34:25.442369 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zdnmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-f58x6_openstack-operators(ed599489-c1f6-440f-aaf3-339f424cbcdf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 18:34:25 crc kubenswrapper[4878]: E1202 18:34:25.905959 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d" Dec 02 18:34:25 crc kubenswrapper[4878]: E1202 18:34:25.906450 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mmsh4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-j4dz5_openstack-operators(5a683e44-012a-41ec-98db-36bcd5646959): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 18:34:26 crc kubenswrapper[4878]: E1202 18:34:26.338632 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809" Dec 02 18:34:26 crc kubenswrapper[4878]: E1202 18:34:26.338857 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ng5rx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-4gsxd_openstack-operators(fc38a188-1850-41eb-a958-fd1fe01270c7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 18:34:29 crc kubenswrapper[4878]: E1202 18:34:29.033896 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 02 18:34:29 crc kubenswrapper[4878]: E1202 18:34:29.034566 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sdxf4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-vc67f_openstack-operators(d7143587-e348-48b5-9164-a4d477b4a259): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 18:34:29 crc kubenswrapper[4878]: E1202 18:34:29.609553 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 02 18:34:29 crc kubenswrapper[4878]: E1202 18:34:29.610142 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jj8pf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-952z9_openstack-operators(492dff42-bb87-4c30-8f81-02406308904c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 18:34:34 crc kubenswrapper[4878]: E1202 18:34:34.462532 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 02 18:34:34 crc kubenswrapper[4878]: E1202 18:34:34.463688 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qhfkw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-fx64f_openstack-operators(a128e6b1-604f-4d2d-9b31-1567ade115df): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 18:34:35 crc kubenswrapper[4878]: E1202 18:34:35.104977 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 02 18:34:35 crc kubenswrapper[4878]: E1202 18:34:35.105221 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2vzx9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-kmkzd_openstack-operators(e51c89b2-70ae-4c1d-81b0-8aba6e211dd0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 18:34:36 crc kubenswrapper[4878]: E1202 18:34:36.082711 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 02 18:34:36 crc kubenswrapper[4878]: E1202 18:34:36.083231 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w4rt8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-24n6c_openstack-operators(5691d664-31ad-44c8-ab51-11bcf8f9d4c2): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 18:34:36 crc kubenswrapper[4878]: E1202 18:34:36.084430 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-24n6c" podUID="5691d664-31ad-44c8-ab51-11bcf8f9d4c2" Dec 02 18:34:36 crc kubenswrapper[4878]: E1202 18:34:36.465575 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-24n6c" podUID="5691d664-31ad-44c8-ab51-11bcf8f9d4c2" Dec 02 18:34:37 crc kubenswrapper[4878]: E1202 18:34:37.059381 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 02 18:34:37 crc kubenswrapper[4878]: E1202 18:34:37.060207 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qjkzs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-xzpbx_openstack-operators(84d9833f-9760-47ea-ba43-f385b24a3e57): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 18:34:37 crc kubenswrapper[4878]: I1202 18:34:37.605625 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk"] Dec 02 18:34:38 crc kubenswrapper[4878]: I1202 18:34:38.123905 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk"] Dec 02 18:34:38 crc kubenswrapper[4878]: I1202 18:34:38.170221 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg"] Dec 02 18:34:38 crc kubenswrapper[4878]: W1202 18:34:38.242033 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode18472ba_dc06_4e34_99a7_974d9af72c0a.slice/crio-c6d6a439c3c73329ab13b7d5da6650c1225b4c35989fce62572900a32a8adea3 WatchSource:0}: Error finding container 
c6d6a439c3c73329ab13b7d5da6650c1225b4c35989fce62572900a32a8adea3: Status 404 returned error can't find the container with id c6d6a439c3c73329ab13b7d5da6650c1225b4c35989fce62572900a32a8adea3 Dec 02 18:34:38 crc kubenswrapper[4878]: W1202 18:34:38.247378 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3028ad1d_cba5_4197_964f_6405fb1cc1c3.slice/crio-ac60a0c80c8eeefb9c3867db7268ac1a8b6831f2450931214501568c7689d19d WatchSource:0}: Error finding container ac60a0c80c8eeefb9c3867db7268ac1a8b6831f2450931214501568c7689d19d: Status 404 returned error can't find the container with id ac60a0c80c8eeefb9c3867db7268ac1a8b6831f2450931214501568c7689d19d Dec 02 18:34:38 crc kubenswrapper[4878]: I1202 18:34:38.481034 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" event={"ID":"095b515f-0144-4f7b-b3ab-9ca3440921db","Type":"ContainerStarted","Data":"7bd52551aec0b73c629eaa03240c82b6028d16265b7049dec5a05a4911f365b7"} Dec 02 18:34:38 crc kubenswrapper[4878]: I1202 18:34:38.532224 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-w99w9" event={"ID":"56f1dd36-8cc9-4026-8976-8816940217a4","Type":"ContainerStarted","Data":"86a913208253da3a3544b08b8df74ebab7d6acaaafe8b70164501860a1f05377"} Dec 02 18:34:38 crc kubenswrapper[4878]: I1202 18:34:38.534850 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" event={"ID":"e18472ba-dc06-4e34-99a7-974d9af72c0a","Type":"ContainerStarted","Data":"c6d6a439c3c73329ab13b7d5da6650c1225b4c35989fce62572900a32a8adea3"} Dec 02 18:34:38 crc kubenswrapper[4878]: I1202 18:34:38.536945 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg" 
event={"ID":"3028ad1d-cba5-4197-964f-6405fb1cc1c3","Type":"ContainerStarted","Data":"ac60a0c80c8eeefb9c3867db7268ac1a8b6831f2450931214501568c7689d19d"} Dec 02 18:34:39 crc kubenswrapper[4878]: I1202 18:34:39.548575 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4c7fq" event={"ID":"5cee25c6-1e94-400c-afd8-c1e75f31e619","Type":"ContainerStarted","Data":"8cf35e3cea9ba05deb847ad795c1634ef7e9138e613d5644bc05f548d7349737"} Dec 02 18:34:39 crc kubenswrapper[4878]: I1202 18:34:39.551053 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fhrfj" event={"ID":"572eebdd-2dc7-4327-a339-4f92e3971d59","Type":"ContainerStarted","Data":"12dd5a058ca57a8bc4fae4a10c30e77b8ac8ebdf76cdcf196f1a9e3e14cb231b"} Dec 02 18:34:39 crc kubenswrapper[4878]: I1202 18:34:39.553021 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9stj8" event={"ID":"46cfeb81-3e48-4f16-ae55-aabe49810afb","Type":"ContainerStarted","Data":"0ffe78f26066a734dfb93b03de12a2e83fbf6b80dac299e9adb0ff220fb7cb07"} Dec 02 18:34:41 crc kubenswrapper[4878]: I1202 18:34:41.576041 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgj5c" event={"ID":"81eba8a0-84f6-4456-9484-dfa84dda8e10","Type":"ContainerStarted","Data":"9b4f25c36747b0af231bba399432dfd2c8cf9d6b71709a44c64b6aa063ded417"} Dec 02 18:34:42 crc kubenswrapper[4878]: I1202 18:34:42.592180 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-kwqck" event={"ID":"77e2f2de-8d3f-437b-8f32-7b76ea70ccda","Type":"ContainerStarted","Data":"230c1137a9bcbd1f1a5d931c0619187ebf1b73abfb6c51081ba80336a1fdc08b"} Dec 02 18:34:42 crc kubenswrapper[4878]: I1202 18:34:42.595949 4878 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-hndnh" event={"ID":"bfa1dc17-f042-46f8-8bc8-3f8d9e135073","Type":"ContainerStarted","Data":"802dd44c2b04471137fbf9cf2e72e3eb34edcaf9c134dfd86c0156b0cb08e469"} Dec 02 18:34:42 crc kubenswrapper[4878]: I1202 18:34:42.598051 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" event={"ID":"e18472ba-dc06-4e34-99a7-974d9af72c0a","Type":"ContainerStarted","Data":"5f1cb87293009ae69b04e9c86988f9debe1461ec2b151cb34b9dedd5b4d5793f"} Dec 02 18:34:42 crc kubenswrapper[4878]: I1202 18:34:42.598215 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:34:42 crc kubenswrapper[4878]: I1202 18:34:42.601768 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-ds92l" event={"ID":"0e516f0b-2b62-4d60-b1bd-07404ffcdea9","Type":"ContainerStarted","Data":"a50529b684a891cf124fe625c399a4704e6a8723c6c4f07c6e0c8879a42a08f4"} Dec 02 18:34:46 crc kubenswrapper[4878]: I1202 18:34:46.654391 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-546f978c55-mvlfw" event={"ID":"44a363d4-d9a3-44df-8a8f-902cb14a0443","Type":"ContainerStarted","Data":"809d20d2e03bd50975703910e7384a12fde6a3e761f4b78cd881b2422ccbea46"} Dec 02 18:34:47 crc kubenswrapper[4878]: E1202 18:34:47.560097 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-fx64f" podUID="a128e6b1-604f-4d2d-9b31-1567ade115df" Dec 02 18:34:47 crc kubenswrapper[4878]: E1202 18:34:47.679913 4878 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-6qznx" podUID="98571e6d-dae6-4e83-8d08-e44e8609188f" Dec 02 18:34:47 crc kubenswrapper[4878]: I1202 18:34:47.681582 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" event={"ID":"095b515f-0144-4f7b-b3ab-9ca3440921db","Type":"ContainerStarted","Data":"0df199d038fb3ae447417fa7f50ae234f54845575c9336b84c008506b535830a"} Dec 02 18:34:47 crc kubenswrapper[4878]: E1202 18:34:47.681701 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4gsxd" podUID="fc38a188-1850-41eb-a958-fd1fe01270c7" Dec 02 18:34:47 crc kubenswrapper[4878]: I1202 18:34:47.687154 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-fx64f" event={"ID":"a128e6b1-604f-4d2d-9b31-1567ade115df","Type":"ContainerStarted","Data":"23345028d377c0bfa2177d1afd8dc08eed6cde2b5dba1fa21f3c3ff620252852"} Dec 02 18:34:47 crc kubenswrapper[4878]: E1202 18:34:47.694665 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-kmkzd" podUID="e51c89b2-70ae-4c1d-81b0-8aba6e211dd0" Dec 02 18:34:47 crc kubenswrapper[4878]: I1202 18:34:47.707891 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg" 
event={"ID":"3028ad1d-cba5-4197-964f-6405fb1cc1c3","Type":"ContainerStarted","Data":"d92b4d46d5eecd784cf590a4ce5090ed6d0ce1f27fe8308e1f77d936cb02851b"} Dec 02 18:34:47 crc kubenswrapper[4878]: I1202 18:34:47.716658 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" podStartSLOduration=48.716629791 podStartE2EDuration="48.716629791s" podCreationTimestamp="2025-12-02 18:33:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:34:42.629590663 +0000 UTC m=+1192.319209544" watchObservedRunningTime="2025-12-02 18:34:47.716629791 +0000 UTC m=+1197.406248672" Dec 02 18:34:47 crc kubenswrapper[4878]: E1202 18:34:47.771009 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rqzqn" podUID="7dfef70a-0da9-4ad6-9fda-1cac674c9ddb" Dec 02 18:34:47 crc kubenswrapper[4878]: E1202 18:34:47.848266 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f58x6" podUID="ed599489-c1f6-440f-aaf3-339f424cbcdf" Dec 02 18:34:47 crc kubenswrapper[4878]: E1202 18:34:47.893871 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xzpbx" podUID="84d9833f-9760-47ea-ba43-f385b24a3e57" Dec 02 18:34:48 crc kubenswrapper[4878]: E1202 18:34:48.060423 4878 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vc67f" podUID="d7143587-e348-48b5-9164-a4d477b4a259" Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.716859 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9stj8" event={"ID":"46cfeb81-3e48-4f16-ae55-aabe49810afb","Type":"ContainerStarted","Data":"81ed90f85cb2a1ec2ef932303f7f3b9e266c929976872eeb88545c0750945b7f"} Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.718671 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9stj8" Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.720157 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-6qznx" event={"ID":"98571e6d-dae6-4e83-8d08-e44e8609188f","Type":"ContainerStarted","Data":"cc9fe5711d1d119b9c3c2d6020b06b509d28887e068abd245b97c16f2a21adae"} Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.721870 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f58x6" event={"ID":"ed599489-c1f6-440f-aaf3-339f424cbcdf","Type":"ContainerStarted","Data":"05ead65955ed868b4a01272364636497474ac1d5736f7daea221f451df0c151b"} Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.722691 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9stj8" Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.725378 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-kmkzd" 
event={"ID":"e51c89b2-70ae-4c1d-81b0-8aba6e211dd0","Type":"ContainerStarted","Data":"e12edcf263c7ba06fc2a09539d190c92b5be2e02adf9e21d791f0ceb90ab2e15"} Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.728529 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vc67f" event={"ID":"d7143587-e348-48b5-9164-a4d477b4a259","Type":"ContainerStarted","Data":"056620a7c0bf67b2ab9ae26af31fa0931ad577b518b1dfc976e44dd63d9ee5a9"} Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.732559 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fhrfj" event={"ID":"572eebdd-2dc7-4327-a339-4f92e3971d59","Type":"ContainerStarted","Data":"8281c568b3272c2b8b6ad226ceab7b15ce3831e000e0ec81ac0e5fedb6308bae"} Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.732798 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fhrfj" Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.736855 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fhrfj" Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.737545 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4gsxd" event={"ID":"fc38a188-1850-41eb-a958-fd1fe01270c7","Type":"ContainerStarted","Data":"ae0e1ecc767a71a583072051778a5ba71d303bd7611133c01963eba16693cd2d"} Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.742799 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9stj8" podStartSLOduration=5.470510963 podStartE2EDuration="50.742782351s" podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 
18:34:01.711112858 +0000 UTC m=+1151.400731739" lastFinishedPulling="2025-12-02 18:34:46.983384206 +0000 UTC m=+1196.673003127" observedRunningTime="2025-12-02 18:34:48.737123916 +0000 UTC m=+1198.426742797" watchObservedRunningTime="2025-12-02 18:34:48.742782351 +0000 UTC m=+1198.432401232" Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.744507 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-kwqck" event={"ID":"77e2f2de-8d3f-437b-8f32-7b76ea70ccda","Type":"ContainerStarted","Data":"803192b6f6c6ce78ff0b6104e4cc565ddb59cda3420eac7ea405a5b5333ba458"} Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.744541 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-kwqck" Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.747416 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-kwqck" Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.748787 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xzpbx" event={"ID":"84d9833f-9760-47ea-ba43-f385b24a3e57","Type":"ContainerStarted","Data":"f41750da0491265d964e99b4fa779ea22999155a304e9cde06aefcc51103fe75"} Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.753023 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rqzqn" event={"ID":"7dfef70a-0da9-4ad6-9fda-1cac674c9ddb","Type":"ContainerStarted","Data":"55c93de6b77575f29f2aa7882f0f7fc7f90e9e27fce8ddb36a71be9ab8573cc0"} Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.762787 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4c7fq" 
event={"ID":"5cee25c6-1e94-400c-afd8-c1e75f31e619","Type":"ContainerStarted","Data":"f23fa52b64c569200d3130dd68febeea9a12dd14ead382d4618455fdfc0d03b1"} Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.763952 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4c7fq" Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.774706 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4c7fq" Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.784544 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fhrfj" podStartSLOduration=5.568140283 podStartE2EDuration="50.784520617s" podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 18:34:01.739185069 +0000 UTC m=+1151.428803950" lastFinishedPulling="2025-12-02 18:34:46.955565373 +0000 UTC m=+1196.645184284" observedRunningTime="2025-12-02 18:34:48.774809325 +0000 UTC m=+1198.464428206" watchObservedRunningTime="2025-12-02 18:34:48.784520617 +0000 UTC m=+1198.474139498" Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.914616 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-kwqck" podStartSLOduration=4.242633065 podStartE2EDuration="50.914588461s" podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 18:34:00.316197488 +0000 UTC m=+1150.005816369" lastFinishedPulling="2025-12-02 18:34:46.988152874 +0000 UTC m=+1196.677771765" observedRunningTime="2025-12-02 18:34:48.874123907 +0000 UTC m=+1198.563742788" watchObservedRunningTime="2025-12-02 18:34:48.914588461 +0000 UTC m=+1198.604207332" Dec 02 18:34:48 crc kubenswrapper[4878]: I1202 18:34:48.956703 4878 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-4c7fq" podStartSLOduration=5.327846979 podStartE2EDuration="50.956682847s" podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 18:34:01.320368558 +0000 UTC m=+1151.009987439" lastFinishedPulling="2025-12-02 18:34:46.949204426 +0000 UTC m=+1196.638823307" observedRunningTime="2025-12-02 18:34:48.947331747 +0000 UTC m=+1198.636950648" watchObservedRunningTime="2025-12-02 18:34:48.956682847 +0000 UTC m=+1198.646301728" Dec 02 18:34:49 crc kubenswrapper[4878]: E1202 18:34:49.195116 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-952z9" podUID="492dff42-bb87-4c30-8f81-02406308904c" Dec 02 18:34:49 crc kubenswrapper[4878]: E1202 18:34:49.207854 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j4dz5" podUID="5a683e44-012a-41ec-98db-36bcd5646959" Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.798647 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-952z9" event={"ID":"492dff42-bb87-4c30-8f81-02406308904c","Type":"ContainerStarted","Data":"23e06723f51b5739aa5bb0b4e2e49802b422b7bca74e6d00dab95256bff6517a"} Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.831377 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-hndnh" 
event={"ID":"bfa1dc17-f042-46f8-8bc8-3f8d9e135073","Type":"ContainerStarted","Data":"9910c7b25928ba6a85a6a1e942ee6161420e819e5fa7378f68cda8aa55bbfbae"} Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.831752 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-hndnh" Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.834849 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-hndnh" Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.846181 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" event={"ID":"095b515f-0144-4f7b-b3ab-9ca3440921db","Type":"ContainerStarted","Data":"792c79b5f8f512a4bb6db21d4b3d7f19dd72d24284086c4caf746b401b3ab382"} Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.856593 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-ds92l" event={"ID":"0e516f0b-2b62-4d60-b1bd-07404ffcdea9","Type":"ContainerStarted","Data":"786a2c58e45c6c42279836e6e4221f2c6f0912ec139c0b2a3e7901532a02c00a"} Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.857379 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-ds92l" Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.858716 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-hndnh" podStartSLOduration=6.9705079340000005 podStartE2EDuration="51.858703318s" podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 18:34:02.120774776 +0000 UTC m=+1151.810393657" lastFinishedPulling="2025-12-02 18:34:47.00897015 +0000 UTC m=+1196.698589041" 
observedRunningTime="2025-12-02 18:34:49.847597083 +0000 UTC m=+1199.537215964" watchObservedRunningTime="2025-12-02 18:34:49.858703318 +0000 UTC m=+1199.548322199" Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.859279 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-546f978c55-mvlfw" event={"ID":"44a363d4-d9a3-44df-8a8f-902cb14a0443","Type":"ContainerStarted","Data":"4d379d802c6f1db22fffc139e63fe6e8b8cfb56186343ccfff5b6a3b224efe74"} Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.859456 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-546f978c55-mvlfw" Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.860794 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-ds92l" Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.881145 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-w99w9" event={"ID":"56f1dd36-8cc9-4026-8976-8816940217a4","Type":"ContainerStarted","Data":"f289b3618e9d75f31a039d08ea3ca624fa879d13e10764a972f9ce6944bcdef3"} Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.882289 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-w99w9" Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.888854 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-w99w9" Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.913636 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-ds92l" podStartSLOduration=5.066175842 podStartE2EDuration="51.913616901s" 
podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 18:34:00.163305496 +0000 UTC m=+1149.852924377" lastFinishedPulling="2025-12-02 18:34:47.010746555 +0000 UTC m=+1196.700365436" observedRunningTime="2025-12-02 18:34:49.913223318 +0000 UTC m=+1199.602842199" watchObservedRunningTime="2025-12-02 18:34:49.913616901 +0000 UTC m=+1199.603235782" Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.930015 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgj5c" event={"ID":"81eba8a0-84f6-4456-9484-dfa84dda8e10","Type":"ContainerStarted","Data":"686b4f556511ae1fa3f72582441fc3051bd26c41c67a81965b87e568ba6f4792"} Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.931275 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgj5c" Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.946936 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgj5c" Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.955262 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-546f978c55-mvlfw" podStartSLOduration=15.676635583 podStartE2EDuration="51.955231281s" podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 18:34:02.129798715 +0000 UTC m=+1151.819417586" lastFinishedPulling="2025-12-02 18:34:38.408394403 +0000 UTC m=+1188.098013284" observedRunningTime="2025-12-02 18:34:49.954991294 +0000 UTC m=+1199.644610175" watchObservedRunningTime="2025-12-02 18:34:49.955231281 +0000 UTC m=+1199.644850162" Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.985074 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4gsxd" event={"ID":"fc38a188-1850-41eb-a958-fd1fe01270c7","Type":"ContainerStarted","Data":"1c9392b96e6940c6b8b6d39243d83025340dd432177ac332bd2091d0b9c27205"} Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.986108 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4gsxd" Dec 02 18:34:49 crc kubenswrapper[4878]: I1202 18:34:49.991759 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" podStartSLOduration=43.518363413 podStartE2EDuration="51.991739514s" podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 18:34:37.679619886 +0000 UTC m=+1187.369238807" lastFinishedPulling="2025-12-02 18:34:46.152996027 +0000 UTC m=+1195.842614908" observedRunningTime="2025-12-02 18:34:49.985430179 +0000 UTC m=+1199.675049070" watchObservedRunningTime="2025-12-02 18:34:49.991739514 +0000 UTC m=+1199.681358395" Dec 02 18:34:50 crc kubenswrapper[4878]: I1202 18:34:50.016700 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4gsxd" podStartSLOduration=3.618039801 podStartE2EDuration="52.016683748s" podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 18:34:00.860926656 +0000 UTC m=+1150.550545537" lastFinishedPulling="2025-12-02 18:34:49.259570603 +0000 UTC m=+1198.949189484" observedRunningTime="2025-12-02 18:34:50.011120665 +0000 UTC m=+1199.700739546" watchObservedRunningTime="2025-12-02 18:34:50.016683748 +0000 UTC m=+1199.706302629" Dec 02 18:34:50 crc kubenswrapper[4878]: I1202 18:34:50.030019 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j4dz5" 
event={"ID":"5a683e44-012a-41ec-98db-36bcd5646959","Type":"ContainerStarted","Data":"9b9b91b6beebbbc89a0aca633ec38cbd6ece494b39212139a65843c08c3da749"} Dec 02 18:34:50 crc kubenswrapper[4878]: I1202 18:34:50.045129 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg" event={"ID":"3028ad1d-cba5-4197-964f-6405fb1cc1c3","Type":"ContainerStarted","Data":"02850efb33896d0bbc20033e8f207bc234a508d16edef0bfbd35c9c542a576e7"} Dec 02 18:34:50 crc kubenswrapper[4878]: I1202 18:34:50.045419 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg" Dec 02 18:34:50 crc kubenswrapper[4878]: I1202 18:34:50.063564 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rgj5c" podStartSLOduration=5.083429167 podStartE2EDuration="52.063544042s" podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 18:33:59.969915417 +0000 UTC m=+1149.659534288" lastFinishedPulling="2025-12-02 18:34:46.950030242 +0000 UTC m=+1196.639649163" observedRunningTime="2025-12-02 18:34:50.062592272 +0000 UTC m=+1199.752211153" watchObservedRunningTime="2025-12-02 18:34:50.063544042 +0000 UTC m=+1199.753162923" Dec 02 18:34:50 crc kubenswrapper[4878]: I1202 18:34:50.146920 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-w99w9" podStartSLOduration=7.281761049 podStartE2EDuration="52.146889867s" podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 18:34:02.083994955 +0000 UTC m=+1151.773613836" lastFinishedPulling="2025-12-02 18:34:46.949123753 +0000 UTC m=+1196.638742654" observedRunningTime="2025-12-02 18:34:50.088379212 +0000 UTC m=+1199.777998093" watchObservedRunningTime="2025-12-02 18:34:50.146889867 
+0000 UTC m=+1199.836508738" Dec 02 18:34:50 crc kubenswrapper[4878]: I1202 18:34:50.223167 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg" podStartSLOduration=43.556974201 podStartE2EDuration="52.223139153s" podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 18:34:38.249897146 +0000 UTC m=+1187.939516027" lastFinishedPulling="2025-12-02 18:34:46.916062088 +0000 UTC m=+1196.605680979" observedRunningTime="2025-12-02 18:34:50.116813514 +0000 UTC m=+1199.806432395" watchObservedRunningTime="2025-12-02 18:34:50.223139153 +0000 UTC m=+1199.912758034" Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.052573 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-kmkzd" event={"ID":"e51c89b2-70ae-4c1d-81b0-8aba6e211dd0","Type":"ContainerStarted","Data":"3dd4f89734414e39904aad4aaeaad872d4e5f28891c2ad200747bb01ea2f371b"} Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.052681 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-kmkzd" Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.054361 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xzpbx" event={"ID":"84d9833f-9760-47ea-ba43-f385b24a3e57","Type":"ContainerStarted","Data":"51002ff6066d37311cf7d4c1b5e3ec6750a1f80a7a0994f78a70d86e7432c7e3"} Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.054485 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xzpbx" Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.056420 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-24n6c" 
event={"ID":"5691d664-31ad-44c8-ab51-11bcf8f9d4c2","Type":"ContainerStarted","Data":"0ef6fecaec957ab35e699f709bbb44977a4b6345b6040d6877b5210addfc4ec8"} Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.058012 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rqzqn" event={"ID":"7dfef70a-0da9-4ad6-9fda-1cac674c9ddb","Type":"ContainerStarted","Data":"c0a10f8f79a8e1695b591f5af15a8166e2525a18d625662f8255539a1d235755"} Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.058889 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rqzqn" Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.060305 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-fx64f" event={"ID":"a128e6b1-604f-4d2d-9b31-1567ade115df","Type":"ContainerStarted","Data":"f398fbac39c6bb40560b22bba5e369159044fd2890c7894e6927674a2013899f"} Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.060765 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-fx64f" Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.062110 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j4dz5" event={"ID":"5a683e44-012a-41ec-98db-36bcd5646959","Type":"ContainerStarted","Data":"dff1a47ad8ce2749eae1767a088586719548e964ccf98908a2a6475ca6a67f21"} Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.062336 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j4dz5" Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.064282 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f58x6" event={"ID":"ed599489-c1f6-440f-aaf3-339f424cbcdf","Type":"ContainerStarted","Data":"e6ee0fb4e60e1c5426f277e8dd9d5e0e5ba893df18268129e93a57f0c37f1453"} Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.064750 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f58x6" Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.071416 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vc67f" event={"ID":"d7143587-e348-48b5-9164-a4d477b4a259","Type":"ContainerStarted","Data":"6d1230901a90fa13b173f1b9cedea499fa3a0f577c9c18300feed65ccfbf6b03"} Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.074828 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-kmkzd" podStartSLOduration=5.086043238 podStartE2EDuration="53.074810161s" podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 18:34:01.696324209 +0000 UTC m=+1151.385943090" lastFinishedPulling="2025-12-02 18:34:49.685091132 +0000 UTC m=+1199.374710013" observedRunningTime="2025-12-02 18:34:51.074616774 +0000 UTC m=+1200.764235665" watchObservedRunningTime="2025-12-02 18:34:51.074810161 +0000 UTC m=+1200.764429042" Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.075919 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-6qznx" event={"ID":"98571e6d-dae6-4e83-8d08-e44e8609188f","Type":"ContainerStarted","Data":"2a26deca0f87bc24e8ed8a7a2fda508b9ae5764ab6c522b39e2f832e92616f6b"} Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.076028 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-6qznx" Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.079186 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-952z9" event={"ID":"492dff42-bb87-4c30-8f81-02406308904c","Type":"ContainerStarted","Data":"a9c2f8a39e1c20ff96bee0a6b69844383f20f868b414a286efda4365b4573bf6"} Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.081552 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.084524 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-546f978c55-mvlfw" Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.106250 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-fx64f" podStartSLOduration=4.341251415 podStartE2EDuration="53.106217445s" podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 18:34:00.863598089 +0000 UTC m=+1150.553216970" lastFinishedPulling="2025-12-02 18:34:49.628564109 +0000 UTC m=+1199.318183000" observedRunningTime="2025-12-02 18:34:51.100028233 +0000 UTC m=+1200.789647114" watchObservedRunningTime="2025-12-02 18:34:51.106217445 +0000 UTC m=+1200.795836326" Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.129597 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vc67f" podStartSLOduration=4.786786636 podStartE2EDuration="53.12957731s" podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 18:34:01.342263617 +0000 UTC m=+1151.031882518" lastFinishedPulling="2025-12-02 18:34:49.685054311 +0000 
UTC m=+1199.374673192" observedRunningTime="2025-12-02 18:34:51.12152436 +0000 UTC m=+1200.811143241" watchObservedRunningTime="2025-12-02 18:34:51.12957731 +0000 UTC m=+1200.819196191" Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.160584 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j4dz5" podStartSLOduration=4.331367559 podStartE2EDuration="53.160566041s" podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 18:34:01.738985933 +0000 UTC m=+1151.428604814" lastFinishedPulling="2025-12-02 18:34:50.568184415 +0000 UTC m=+1200.257803296" observedRunningTime="2025-12-02 18:34:51.152383258 +0000 UTC m=+1200.842002139" watchObservedRunningTime="2025-12-02 18:34:51.160566041 +0000 UTC m=+1200.850184922" Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.174587 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xzpbx" podStartSLOduration=5.2075865 podStartE2EDuration="53.174566176s" podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 18:34:01.71791864 +0000 UTC m=+1151.407537521" lastFinishedPulling="2025-12-02 18:34:49.684898316 +0000 UTC m=+1199.374517197" observedRunningTime="2025-12-02 18:34:51.169470258 +0000 UTC m=+1200.859089149" watchObservedRunningTime="2025-12-02 18:34:51.174566176 +0000 UTC m=+1200.864185057" Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.219541 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rqzqn" podStartSLOduration=4.931169035 podStartE2EDuration="53.21952338s" podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 18:34:01.394858778 +0000 UTC m=+1151.084477659" lastFinishedPulling="2025-12-02 18:34:49.683213123 +0000 UTC m=+1199.372832004" 
observedRunningTime="2025-12-02 18:34:51.209474028 +0000 UTC m=+1200.899092909" watchObservedRunningTime="2025-12-02 18:34:51.21952338 +0000 UTC m=+1200.909142261" Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.233371 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-24n6c" podStartSLOduration=4.668680992 podStartE2EDuration="52.233351489s" podCreationTimestamp="2025-12-02 18:33:59 +0000 UTC" firstStartedPulling="2025-12-02 18:34:02.124296445 +0000 UTC m=+1151.813915326" lastFinishedPulling="2025-12-02 18:34:49.688966932 +0000 UTC m=+1199.378585823" observedRunningTime="2025-12-02 18:34:51.228943922 +0000 UTC m=+1200.918562803" watchObservedRunningTime="2025-12-02 18:34:51.233351489 +0000 UTC m=+1200.922970370" Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.263306 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f58x6" podStartSLOduration=4.961069771 podStartE2EDuration="53.263286077s" podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 18:34:01.385226369 +0000 UTC m=+1151.074845260" lastFinishedPulling="2025-12-02 18:34:49.687442685 +0000 UTC m=+1199.377061566" observedRunningTime="2025-12-02 18:34:51.255886038 +0000 UTC m=+1200.945504919" watchObservedRunningTime="2025-12-02 18:34:51.263286077 +0000 UTC m=+1200.952904958" Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.279099 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-6qznx" podStartSLOduration=5.7203612150000005 podStartE2EDuration="53.279078367s" podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 18:34:02.126397221 +0000 UTC m=+1151.816016102" lastFinishedPulling="2025-12-02 18:34:49.685114353 +0000 UTC m=+1199.374733254" 
observedRunningTime="2025-12-02 18:34:51.277192649 +0000 UTC m=+1200.966811530" watchObservedRunningTime="2025-12-02 18:34:51.279078367 +0000 UTC m=+1200.968697248" Dec 02 18:34:51 crc kubenswrapper[4878]: I1202 18:34:51.314799 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-952z9" podStartSLOduration=4.296000762 podStartE2EDuration="53.314784775s" podCreationTimestamp="2025-12-02 18:33:58 +0000 UTC" firstStartedPulling="2025-12-02 18:34:01.388083538 +0000 UTC m=+1151.077702419" lastFinishedPulling="2025-12-02 18:34:50.406867551 +0000 UTC m=+1200.096486432" observedRunningTime="2025-12-02 18:34:51.311374449 +0000 UTC m=+1201.000993320" watchObservedRunningTime="2025-12-02 18:34:51.314784775 +0000 UTC m=+1201.004403656" Dec 02 18:34:52 crc kubenswrapper[4878]: I1202 18:34:52.094298 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vc67f" Dec 02 18:34:52 crc kubenswrapper[4878]: I1202 18:34:52.095275 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-952z9" Dec 02 18:34:52 crc kubenswrapper[4878]: I1202 18:34:52.106207 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk" Dec 02 18:34:53 crc kubenswrapper[4878]: I1202 18:34:53.742667 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:34:53 crc kubenswrapper[4878]: I1202 18:34:53.743056 4878 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:34:55 crc kubenswrapper[4878]: I1202 18:34:55.090542 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-cmzpg" Dec 02 18:34:55 crc kubenswrapper[4878]: I1202 18:34:55.757592 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-bb64db99c-xtzmk" Dec 02 18:34:58 crc kubenswrapper[4878]: I1202 18:34:58.962599 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-4gsxd" Dec 02 18:34:58 crc kubenswrapper[4878]: I1202 18:34:58.997882 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-fx64f" Dec 02 18:34:59 crc kubenswrapper[4878]: I1202 18:34:59.098013 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-952z9" Dec 02 18:34:59 crc kubenswrapper[4878]: I1202 18:34:59.098970 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vc67f" Dec 02 18:34:59 crc kubenswrapper[4878]: I1202 18:34:59.106797 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-kmkzd" Dec 02 18:34:59 crc kubenswrapper[4878]: I1202 18:34:59.293663 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-rqzqn" 
Dec 02 18:34:59 crc kubenswrapper[4878]: I1202 18:34:59.519596 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-f58x6" Dec 02 18:34:59 crc kubenswrapper[4878]: I1202 18:34:59.607758 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-xzpbx" Dec 02 18:34:59 crc kubenswrapper[4878]: I1202 18:34:59.744488 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-6qznx" Dec 02 18:34:59 crc kubenswrapper[4878]: I1202 18:34:59.765558 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-j4dz5" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.044439 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-k8mqd"] Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.047401 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-k8mqd" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.052683 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.052763 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-j7p7k" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.052881 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.052884 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.073531 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-k8mqd"] Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.106013 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-k9l8c"] Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.109717 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-k9l8c" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.116029 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.122205 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-k9l8c"] Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.122929 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqnth\" (UniqueName: \"kubernetes.io/projected/f7553ad2-b8a5-4794-951e-cbb20d1f0426-kube-api-access-tqnth\") pod \"dnsmasq-dns-675f4bcbfc-k8mqd\" (UID: \"f7553ad2-b8a5-4794-951e-cbb20d1f0426\") " pod="openstack/dnsmasq-dns-675f4bcbfc-k8mqd" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.123017 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3eb57016-f541-49f8-92d4-4bc95a8a9396-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-k9l8c\" (UID: \"3eb57016-f541-49f8-92d4-4bc95a8a9396\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k9l8c" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.123038 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbvzs\" (UniqueName: \"kubernetes.io/projected/3eb57016-f541-49f8-92d4-4bc95a8a9396-kube-api-access-vbvzs\") pod \"dnsmasq-dns-78dd6ddcc-k9l8c\" (UID: \"3eb57016-f541-49f8-92d4-4bc95a8a9396\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k9l8c" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.123081 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb57016-f541-49f8-92d4-4bc95a8a9396-config\") pod \"dnsmasq-dns-78dd6ddcc-k9l8c\" (UID: \"3eb57016-f541-49f8-92d4-4bc95a8a9396\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-k9l8c" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.123122 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7553ad2-b8a5-4794-951e-cbb20d1f0426-config\") pod \"dnsmasq-dns-675f4bcbfc-k8mqd\" (UID: \"f7553ad2-b8a5-4794-951e-cbb20d1f0426\") " pod="openstack/dnsmasq-dns-675f4bcbfc-k8mqd" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.225090 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7553ad2-b8a5-4794-951e-cbb20d1f0426-config\") pod \"dnsmasq-dns-675f4bcbfc-k8mqd\" (UID: \"f7553ad2-b8a5-4794-951e-cbb20d1f0426\") " pod="openstack/dnsmasq-dns-675f4bcbfc-k8mqd" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.225206 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqnth\" (UniqueName: \"kubernetes.io/projected/f7553ad2-b8a5-4794-951e-cbb20d1f0426-kube-api-access-tqnth\") pod \"dnsmasq-dns-675f4bcbfc-k8mqd\" (UID: \"f7553ad2-b8a5-4794-951e-cbb20d1f0426\") " pod="openstack/dnsmasq-dns-675f4bcbfc-k8mqd" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.225296 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3eb57016-f541-49f8-92d4-4bc95a8a9396-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-k9l8c\" (UID: \"3eb57016-f541-49f8-92d4-4bc95a8a9396\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k9l8c" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.225323 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbvzs\" (UniqueName: \"kubernetes.io/projected/3eb57016-f541-49f8-92d4-4bc95a8a9396-kube-api-access-vbvzs\") pod \"dnsmasq-dns-78dd6ddcc-k9l8c\" (UID: \"3eb57016-f541-49f8-92d4-4bc95a8a9396\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k9l8c" Dec 02 
18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.225380 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb57016-f541-49f8-92d4-4bc95a8a9396-config\") pod \"dnsmasq-dns-78dd6ddcc-k9l8c\" (UID: \"3eb57016-f541-49f8-92d4-4bc95a8a9396\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k9l8c" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.226489 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7553ad2-b8a5-4794-951e-cbb20d1f0426-config\") pod \"dnsmasq-dns-675f4bcbfc-k8mqd\" (UID: \"f7553ad2-b8a5-4794-951e-cbb20d1f0426\") " pod="openstack/dnsmasq-dns-675f4bcbfc-k8mqd" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.226523 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb57016-f541-49f8-92d4-4bc95a8a9396-config\") pod \"dnsmasq-dns-78dd6ddcc-k9l8c\" (UID: \"3eb57016-f541-49f8-92d4-4bc95a8a9396\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k9l8c" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.226495 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3eb57016-f541-49f8-92d4-4bc95a8a9396-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-k9l8c\" (UID: \"3eb57016-f541-49f8-92d4-4bc95a8a9396\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k9l8c" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.246408 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbvzs\" (UniqueName: \"kubernetes.io/projected/3eb57016-f541-49f8-92d4-4bc95a8a9396-kube-api-access-vbvzs\") pod \"dnsmasq-dns-78dd6ddcc-k9l8c\" (UID: \"3eb57016-f541-49f8-92d4-4bc95a8a9396\") " pod="openstack/dnsmasq-dns-78dd6ddcc-k9l8c" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.258727 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tqnth\" (UniqueName: \"kubernetes.io/projected/f7553ad2-b8a5-4794-951e-cbb20d1f0426-kube-api-access-tqnth\") pod \"dnsmasq-dns-675f4bcbfc-k8mqd\" (UID: \"f7553ad2-b8a5-4794-951e-cbb20d1f0426\") " pod="openstack/dnsmasq-dns-675f4bcbfc-k8mqd" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.373783 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-k8mqd" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.447049 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-k9l8c" Dec 02 18:35:15 crc kubenswrapper[4878]: I1202 18:35:15.921875 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-k8mqd"] Dec 02 18:35:16 crc kubenswrapper[4878]: W1202 18:35:16.081797 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eb57016_f541_49f8_92d4_4bc95a8a9396.slice/crio-1795f95a2d887108c9e52034fabf8bd31c8d64b0c99a3f7a25ee0dd06ca653c4 WatchSource:0}: Error finding container 1795f95a2d887108c9e52034fabf8bd31c8d64b0c99a3f7a25ee0dd06ca653c4: Status 404 returned error can't find the container with id 1795f95a2d887108c9e52034fabf8bd31c8d64b0c99a3f7a25ee0dd06ca653c4 Dec 02 18:35:16 crc kubenswrapper[4878]: I1202 18:35:16.081848 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-k9l8c"] Dec 02 18:35:16 crc kubenswrapper[4878]: I1202 18:35:16.366543 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-k9l8c" event={"ID":"3eb57016-f541-49f8-92d4-4bc95a8a9396","Type":"ContainerStarted","Data":"1795f95a2d887108c9e52034fabf8bd31c8d64b0c99a3f7a25ee0dd06ca653c4"} Dec 02 18:35:16 crc kubenswrapper[4878]: I1202 18:35:16.368339 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-k8mqd" 
event={"ID":"f7553ad2-b8a5-4794-951e-cbb20d1f0426","Type":"ContainerStarted","Data":"13d16fc56c352239c23561700689d59aebed8f793cad2359aa718f7440fa946f"} Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.055557 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-k8mqd"] Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.076026 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-54gkd"] Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.082970 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-54gkd" Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.116184 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-54gkd"] Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.196249 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63448588-8af2-4397-981d-97ba5bf4170d-config\") pod \"dnsmasq-dns-5ccc8479f9-54gkd\" (UID: \"63448588-8af2-4397-981d-97ba5bf4170d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-54gkd" Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.196705 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63448588-8af2-4397-981d-97ba5bf4170d-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-54gkd\" (UID: \"63448588-8af2-4397-981d-97ba5bf4170d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-54gkd" Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.196749 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vc8c\" (UniqueName: \"kubernetes.io/projected/63448588-8af2-4397-981d-97ba5bf4170d-kube-api-access-8vc8c\") pod \"dnsmasq-dns-5ccc8479f9-54gkd\" (UID: \"63448588-8af2-4397-981d-97ba5bf4170d\") " 
pod="openstack/dnsmasq-dns-5ccc8479f9-54gkd" Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.301632 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63448588-8af2-4397-981d-97ba5bf4170d-config\") pod \"dnsmasq-dns-5ccc8479f9-54gkd\" (UID: \"63448588-8af2-4397-981d-97ba5bf4170d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-54gkd" Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.301809 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63448588-8af2-4397-981d-97ba5bf4170d-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-54gkd\" (UID: \"63448588-8af2-4397-981d-97ba5bf4170d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-54gkd" Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.301876 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vc8c\" (UniqueName: \"kubernetes.io/projected/63448588-8af2-4397-981d-97ba5bf4170d-kube-api-access-8vc8c\") pod \"dnsmasq-dns-5ccc8479f9-54gkd\" (UID: \"63448588-8af2-4397-981d-97ba5bf4170d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-54gkd" Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.302705 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63448588-8af2-4397-981d-97ba5bf4170d-config\") pod \"dnsmasq-dns-5ccc8479f9-54gkd\" (UID: \"63448588-8af2-4397-981d-97ba5bf4170d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-54gkd" Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.302744 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63448588-8af2-4397-981d-97ba5bf4170d-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-54gkd\" (UID: \"63448588-8af2-4397-981d-97ba5bf4170d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-54gkd" Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.331554 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vc8c\" (UniqueName: \"kubernetes.io/projected/63448588-8af2-4397-981d-97ba5bf4170d-kube-api-access-8vc8c\") pod \"dnsmasq-dns-5ccc8479f9-54gkd\" (UID: \"63448588-8af2-4397-981d-97ba5bf4170d\") " pod="openstack/dnsmasq-dns-5ccc8479f9-54gkd" Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.449625 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-54gkd" Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.492213 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-k9l8c"] Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.536007 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-j459b"] Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.537717 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-j459b" Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.561332 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-j459b"] Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.711021 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aab598f-4e89-4790-a425-ab7983be07c4-config\") pod \"dnsmasq-dns-57d769cc4f-j459b\" (UID: \"3aab598f-4e89-4790-a425-ab7983be07c4\") " pod="openstack/dnsmasq-dns-57d769cc4f-j459b" Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.711113 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfjcl\" (UniqueName: \"kubernetes.io/projected/3aab598f-4e89-4790-a425-ab7983be07c4-kube-api-access-bfjcl\") pod \"dnsmasq-dns-57d769cc4f-j459b\" (UID: \"3aab598f-4e89-4790-a425-ab7983be07c4\") " pod="openstack/dnsmasq-dns-57d769cc4f-j459b" 
Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.711162 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3aab598f-4e89-4790-a425-ab7983be07c4-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-j459b\" (UID: \"3aab598f-4e89-4790-a425-ab7983be07c4\") " pod="openstack/dnsmasq-dns-57d769cc4f-j459b" Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.812965 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aab598f-4e89-4790-a425-ab7983be07c4-config\") pod \"dnsmasq-dns-57d769cc4f-j459b\" (UID: \"3aab598f-4e89-4790-a425-ab7983be07c4\") " pod="openstack/dnsmasq-dns-57d769cc4f-j459b" Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.813130 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfjcl\" (UniqueName: \"kubernetes.io/projected/3aab598f-4e89-4790-a425-ab7983be07c4-kube-api-access-bfjcl\") pod \"dnsmasq-dns-57d769cc4f-j459b\" (UID: \"3aab598f-4e89-4790-a425-ab7983be07c4\") " pod="openstack/dnsmasq-dns-57d769cc4f-j459b" Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.813218 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3aab598f-4e89-4790-a425-ab7983be07c4-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-j459b\" (UID: \"3aab598f-4e89-4790-a425-ab7983be07c4\") " pod="openstack/dnsmasq-dns-57d769cc4f-j459b" Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.814373 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aab598f-4e89-4790-a425-ab7983be07c4-config\") pod \"dnsmasq-dns-57d769cc4f-j459b\" (UID: \"3aab598f-4e89-4790-a425-ab7983be07c4\") " pod="openstack/dnsmasq-dns-57d769cc4f-j459b" Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.814415 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3aab598f-4e89-4790-a425-ab7983be07c4-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-j459b\" (UID: \"3aab598f-4e89-4790-a425-ab7983be07c4\") " pod="openstack/dnsmasq-dns-57d769cc4f-j459b" Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.840966 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfjcl\" (UniqueName: \"kubernetes.io/projected/3aab598f-4e89-4790-a425-ab7983be07c4-kube-api-access-bfjcl\") pod \"dnsmasq-dns-57d769cc4f-j459b\" (UID: \"3aab598f-4e89-4790-a425-ab7983be07c4\") " pod="openstack/dnsmasq-dns-57d769cc4f-j459b" Dec 02 18:35:18 crc kubenswrapper[4878]: I1202 18:35:18.943156 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-j459b" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.078393 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-54gkd"] Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.256123 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.257965 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.263222 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.263470 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.263672 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lrgf5" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.263854 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.264025 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.264751 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.265139 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.275078 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.437568 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/168172d2-5cc8-492f-aa26-bd2a1351cdf2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.437637 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.437663 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/168172d2-5cc8-492f-aa26-bd2a1351cdf2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.437679 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.437698 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb6xb\" (UniqueName: \"kubernetes.io/projected/168172d2-5cc8-492f-aa26-bd2a1351cdf2-kube-api-access-zb6xb\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.437728 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.437752 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.437799 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.437820 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/168172d2-5cc8-492f-aa26-bd2a1351cdf2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.437891 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/168172d2-5cc8-492f-aa26-bd2a1351cdf2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.437908 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/168172d2-5cc8-492f-aa26-bd2a1351cdf2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.459641 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-54gkd" 
event={"ID":"63448588-8af2-4397-981d-97ba5bf4170d","Type":"ContainerStarted","Data":"3aff425cd3d4d897d7d028893fb97040958e8426df2ab532e5d6bf1f1b6a23a8"} Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.539652 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.539725 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/168172d2-5cc8-492f-aa26-bd2a1351cdf2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.539820 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/168172d2-5cc8-492f-aa26-bd2a1351cdf2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.539844 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/168172d2-5cc8-492f-aa26-bd2a1351cdf2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.539902 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/168172d2-5cc8-492f-aa26-bd2a1351cdf2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.539936 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.539985 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/168172d2-5cc8-492f-aa26-bd2a1351cdf2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.540006 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.540024 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb6xb\" (UniqueName: \"kubernetes.io/projected/168172d2-5cc8-492f-aa26-bd2a1351cdf2-kube-api-access-zb6xb\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.540045 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:35:19 crc 
kubenswrapper[4878]: I1202 18:35:19.540078 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.542341 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/168172d2-5cc8-492f-aa26-bd2a1351cdf2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.542820 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.542983 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/168172d2-5cc8-492f-aa26-bd2a1351cdf2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.543311 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.544432 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.544578 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/168172d2-5cc8-492f-aa26-bd2a1351cdf2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.547351 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.547378 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.548588 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/168172d2-5cc8-492f-aa26-bd2a1351cdf2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.569200 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.612968 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/168172d2-5cc8-492f-aa26-bd2a1351cdf2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.622221 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb6xb\" (UniqueName: \"kubernetes.io/projected/168172d2-5cc8-492f-aa26-bd2a1351cdf2-kube-api-access-zb6xb\") pod \"rabbitmq-cell1-server-0\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.652493 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-j459b"]
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.712270 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.714551 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.728464 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.728670 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.728795 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.728929 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jwm5r"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.729050 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.729154 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.729802 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.742666 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.847737 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.847812 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.847969 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8ce834c-073d-4062-b3ee-488fa79aae4f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.848023 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8ce834c-073d-4062-b3ee-488fa79aae4f-config-data\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.848073 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8ce834c-073d-4062-b3ee-488fa79aae4f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.848173 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8ce834c-073d-4062-b3ee-488fa79aae4f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.848205 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b47q\" (UniqueName: \"kubernetes.io/projected/b8ce834c-073d-4062-b3ee-488fa79aae4f-kube-api-access-5b47q\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.848231 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.848288 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8ce834c-073d-4062-b3ee-488fa79aae4f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.848312 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.848361 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.895803 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.954516 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8ce834c-073d-4062-b3ee-488fa79aae4f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.954579 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b47q\" (UniqueName: \"kubernetes.io/projected/b8ce834c-073d-4062-b3ee-488fa79aae4f-kube-api-access-5b47q\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.954611 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.954640 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8ce834c-073d-4062-b3ee-488fa79aae4f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.954680 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.954708 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.954762 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.954785 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.954872 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8ce834c-073d-4062-b3ee-488fa79aae4f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.954911 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8ce834c-073d-4062-b3ee-488fa79aae4f-config-data\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.954937 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8ce834c-073d-4062-b3ee-488fa79aae4f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.964946 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.967712 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.968029 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.969753 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8ce834c-073d-4062-b3ee-488fa79aae4f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.970284 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8ce834c-073d-4062-b3ee-488fa79aae4f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.970740 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8ce834c-073d-4062-b3ee-488fa79aae4f-config-data\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.979007 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8ce834c-073d-4062-b3ee-488fa79aae4f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.987204 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.987874 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:19 crc kubenswrapper[4878]: I1202 18:35:19.997835 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8ce834c-073d-4062-b3ee-488fa79aae4f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:20 crc kubenswrapper[4878]: I1202 18:35:20.006362 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b47q\" (UniqueName: \"kubernetes.io/projected/b8ce834c-073d-4062-b3ee-488fa79aae4f-kube-api-access-5b47q\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:20 crc kubenswrapper[4878]: I1202 18:35:20.162164 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " pod="openstack/rabbitmq-server-0"
Dec 02 18:35:20 crc kubenswrapper[4878]: I1202 18:35:20.367346 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 02 18:35:20 crc kubenswrapper[4878]: I1202 18:35:20.517649 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-j459b" event={"ID":"3aab598f-4e89-4790-a425-ab7983be07c4","Type":"ContainerStarted","Data":"055c534b6dfd85cdd507b06e1da65688365dae2dbac6c54820c1babfc239f8cd"}
Dec 02 18:35:20 crc kubenswrapper[4878]: I1202 18:35:20.702480 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 02 18:35:20 crc kubenswrapper[4878]: W1202 18:35:20.780122 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod168172d2_5cc8_492f_aa26_bd2a1351cdf2.slice/crio-faa17f71d92af21878e9be5bce43baf7e7eba2dee4260cb098a2db0d271c5caf WatchSource:0}: Error finding container faa17f71d92af21878e9be5bce43baf7e7eba2dee4260cb098a2db0d271c5caf: Status 404 returned error can't find the container with id faa17f71d92af21878e9be5bce43baf7e7eba2dee4260cb098a2db0d271c5caf
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.095387 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.161604 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.163454 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.167192 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.168292 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.168504 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.168663 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.172078 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-pqbpn"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.172834 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.320499 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.320655 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c436c198-1049-416f-9ab7-33261ff55ab4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.320700 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxc6w\" (UniqueName: \"kubernetes.io/projected/c436c198-1049-416f-9ab7-33261ff55ab4-kube-api-access-dxc6w\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.320739 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c436c198-1049-416f-9ab7-33261ff55ab4-kolla-config\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.320774 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c436c198-1049-416f-9ab7-33261ff55ab4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.320796 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c436c198-1049-416f-9ab7-33261ff55ab4-config-data-default\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.321037 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c436c198-1049-416f-9ab7-33261ff55ab4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.321112 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c436c198-1049-416f-9ab7-33261ff55ab4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.423538 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c436c198-1049-416f-9ab7-33261ff55ab4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.423616 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxc6w\" (UniqueName: \"kubernetes.io/projected/c436c198-1049-416f-9ab7-33261ff55ab4-kube-api-access-dxc6w\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.423670 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c436c198-1049-416f-9ab7-33261ff55ab4-kolla-config\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.423710 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c436c198-1049-416f-9ab7-33261ff55ab4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.423739 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c436c198-1049-416f-9ab7-33261ff55ab4-config-data-default\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.423776 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c436c198-1049-416f-9ab7-33261ff55ab4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.423801 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c436c198-1049-416f-9ab7-33261ff55ab4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.423841 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.424288 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.426719 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c436c198-1049-416f-9ab7-33261ff55ab4-kolla-config\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.426760 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c436c198-1049-416f-9ab7-33261ff55ab4-config-data-default\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.428365 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c436c198-1049-416f-9ab7-33261ff55ab4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.429044 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c436c198-1049-416f-9ab7-33261ff55ab4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.430959 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c436c198-1049-416f-9ab7-33261ff55ab4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.440941 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c436c198-1049-416f-9ab7-33261ff55ab4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.445846 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxc6w\" (UniqueName: \"kubernetes.io/projected/c436c198-1049-416f-9ab7-33261ff55ab4-kube-api-access-dxc6w\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.454509 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"c436c198-1049-416f-9ab7-33261ff55ab4\") " pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.495231 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.544313 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b8ce834c-073d-4062-b3ee-488fa79aae4f","Type":"ContainerStarted","Data":"c2c233dc442dc246c673af6338f994d8785c44fdaa7da2cbc5b33b1babdd85d6"}
Dec 02 18:35:21 crc kubenswrapper[4878]: I1202 18:35:21.551405 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"168172d2-5cc8-492f-aa26-bd2a1351cdf2","Type":"ContainerStarted","Data":"faa17f71d92af21878e9be5bce43baf7e7eba2dee4260cb098a2db0d271c5caf"}
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.179773 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 02 18:35:22 crc kubenswrapper[4878]: W1202 18:35:22.200849 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc436c198_1049_416f_9ab7_33261ff55ab4.slice/crio-643b771d01c7e12d4fe124539c499cb2c046d78560af2b7fdab718346b87385e WatchSource:0}: Error finding container 643b771d01c7e12d4fe124539c499cb2c046d78560af2b7fdab718346b87385e: Status 404 returned error can't find the container with id 643b771d01c7e12d4fe124539c499cb2c046d78560af2b7fdab718346b87385e
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.240304 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.242141 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.245887 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.246122 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-r6dkn"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.248489 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.249008 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.258923 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.351934 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9cae31f5-acb4-423b-8a14-4136afb73062-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.352000 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cae31f5-acb4-423b-8a14-4136afb73062-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.352028 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5ng8\" (UniqueName: \"kubernetes.io/projected/9cae31f5-acb4-423b-8a14-4136afb73062-kube-api-access-x5ng8\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.352108 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9cae31f5-acb4-423b-8a14-4136afb73062-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.352157 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cae31f5-acb4-423b-8a14-4136afb73062-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.352204 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9cae31f5-acb4-423b-8a14-4136afb73062-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.352260 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cae31f5-acb4-423b-8a14-4136afb73062-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.352295 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.455037 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9cae31f5-acb4-423b-8a14-4136afb73062-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.455093 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cae31f5-acb4-423b-8a14-4136afb73062-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.455108 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5ng8\" (UniqueName: \"kubernetes.io/projected/9cae31f5-acb4-423b-8a14-4136afb73062-kube-api-access-x5ng8\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.455141 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9cae31f5-acb4-423b-8a14-4136afb73062-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.455182 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cae31f5-acb4-423b-8a14-4136afb73062-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.455460 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9cae31f5-acb4-423b-8a14-4136afb73062-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.455518 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cae31f5-acb4-423b-8a14-4136afb73062-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.456137 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9cae31f5-acb4-423b-8a14-4136afb73062-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0"
Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.456683 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9cae31f5-acb4-423b-8a14-4136afb73062-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " 
pod="openstack/openstack-cell1-galera-0" Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.456849 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9cae31f5-acb4-423b-8a14-4136afb73062-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0" Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.456780 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0" Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.457121 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.457913 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cae31f5-acb4-423b-8a14-4136afb73062-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0" Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.463155 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cae31f5-acb4-423b-8a14-4136afb73062-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0" Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.477008 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5ng8\" (UniqueName: \"kubernetes.io/projected/9cae31f5-acb4-423b-8a14-4136afb73062-kube-api-access-x5ng8\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0" Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.477253 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cae31f5-acb4-423b-8a14-4136afb73062-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0" Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.513853 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9cae31f5-acb4-423b-8a14-4136afb73062\") " pod="openstack/openstack-cell1-galera-0" Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.587293 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.601960 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c436c198-1049-416f-9ab7-33261ff55ab4","Type":"ContainerStarted","Data":"643b771d01c7e12d4fe124539c499cb2c046d78560af2b7fdab718346b87385e"} Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.824832 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.830924 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.844173 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.844595 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-4z577" Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.844179 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 02 18:35:22 crc kubenswrapper[4878]: I1202 18:35:22.932435 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 02 18:35:23 crc kubenswrapper[4878]: I1202 18:35:23.084957 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/427bac6f-5bf8-4f40-a0f6-fea0cede315f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"427bac6f-5bf8-4f40-a0f6-fea0cede315f\") " pod="openstack/memcached-0" Dec 02 18:35:23 crc kubenswrapper[4878]: I1202 18:35:23.085259 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/427bac6f-5bf8-4f40-a0f6-fea0cede315f-config-data\") pod \"memcached-0\" (UID: \"427bac6f-5bf8-4f40-a0f6-fea0cede315f\") " pod="openstack/memcached-0" Dec 02 18:35:23 crc kubenswrapper[4878]: I1202 18:35:23.085307 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/427bac6f-5bf8-4f40-a0f6-fea0cede315f-kolla-config\") pod \"memcached-0\" (UID: \"427bac6f-5bf8-4f40-a0f6-fea0cede315f\") " pod="openstack/memcached-0" Dec 02 18:35:23 crc kubenswrapper[4878]: I1202 18:35:23.086332 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427bac6f-5bf8-4f40-a0f6-fea0cede315f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"427bac6f-5bf8-4f40-a0f6-fea0cede315f\") " pod="openstack/memcached-0" Dec 02 18:35:23 crc kubenswrapper[4878]: I1202 18:35:23.086543 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h297c\" (UniqueName: \"kubernetes.io/projected/427bac6f-5bf8-4f40-a0f6-fea0cede315f-kube-api-access-h297c\") pod \"memcached-0\" (UID: \"427bac6f-5bf8-4f40-a0f6-fea0cede315f\") " pod="openstack/memcached-0" Dec 02 18:35:23 crc kubenswrapper[4878]: I1202 18:35:23.197536 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427bac6f-5bf8-4f40-a0f6-fea0cede315f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"427bac6f-5bf8-4f40-a0f6-fea0cede315f\") " pod="openstack/memcached-0" Dec 02 18:35:23 crc kubenswrapper[4878]: I1202 18:35:23.197905 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h297c\" (UniqueName: \"kubernetes.io/projected/427bac6f-5bf8-4f40-a0f6-fea0cede315f-kube-api-access-h297c\") pod \"memcached-0\" (UID: \"427bac6f-5bf8-4f40-a0f6-fea0cede315f\") " pod="openstack/memcached-0" Dec 02 18:35:23 crc kubenswrapper[4878]: I1202 18:35:23.197938 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/427bac6f-5bf8-4f40-a0f6-fea0cede315f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"427bac6f-5bf8-4f40-a0f6-fea0cede315f\") " pod="openstack/memcached-0" Dec 02 18:35:23 crc kubenswrapper[4878]: I1202 18:35:23.197987 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/427bac6f-5bf8-4f40-a0f6-fea0cede315f-config-data\") pod \"memcached-0\" (UID: 
\"427bac6f-5bf8-4f40-a0f6-fea0cede315f\") " pod="openstack/memcached-0" Dec 02 18:35:23 crc kubenswrapper[4878]: I1202 18:35:23.198009 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/427bac6f-5bf8-4f40-a0f6-fea0cede315f-kolla-config\") pod \"memcached-0\" (UID: \"427bac6f-5bf8-4f40-a0f6-fea0cede315f\") " pod="openstack/memcached-0" Dec 02 18:35:23 crc kubenswrapper[4878]: I1202 18:35:23.198839 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/427bac6f-5bf8-4f40-a0f6-fea0cede315f-kolla-config\") pod \"memcached-0\" (UID: \"427bac6f-5bf8-4f40-a0f6-fea0cede315f\") " pod="openstack/memcached-0" Dec 02 18:35:23 crc kubenswrapper[4878]: I1202 18:35:23.203009 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/427bac6f-5bf8-4f40-a0f6-fea0cede315f-config-data\") pod \"memcached-0\" (UID: \"427bac6f-5bf8-4f40-a0f6-fea0cede315f\") " pod="openstack/memcached-0" Dec 02 18:35:23 crc kubenswrapper[4878]: I1202 18:35:23.213678 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427bac6f-5bf8-4f40-a0f6-fea0cede315f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"427bac6f-5bf8-4f40-a0f6-fea0cede315f\") " pod="openstack/memcached-0" Dec 02 18:35:23 crc kubenswrapper[4878]: I1202 18:35:23.218015 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/427bac6f-5bf8-4f40-a0f6-fea0cede315f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"427bac6f-5bf8-4f40-a0f6-fea0cede315f\") " pod="openstack/memcached-0" Dec 02 18:35:23 crc kubenswrapper[4878]: I1202 18:35:23.247942 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h297c\" (UniqueName: 
\"kubernetes.io/projected/427bac6f-5bf8-4f40-a0f6-fea0cede315f-kube-api-access-h297c\") pod \"memcached-0\" (UID: \"427bac6f-5bf8-4f40-a0f6-fea0cede315f\") " pod="openstack/memcached-0" Dec 02 18:35:23 crc kubenswrapper[4878]: I1202 18:35:23.494202 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 18:35:23 crc kubenswrapper[4878]: I1202 18:35:23.533215 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 02 18:35:23 crc kubenswrapper[4878]: I1202 18:35:23.619203 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9cae31f5-acb4-423b-8a14-4136afb73062","Type":"ContainerStarted","Data":"254e827f7d1a77da9b2b06320daea021f08c5e3ee1acc6041a1575cc0adc45d3"} Dec 02 18:35:23 crc kubenswrapper[4878]: I1202 18:35:23.743309 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:35:23 crc kubenswrapper[4878]: I1202 18:35:23.743358 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:35:24 crc kubenswrapper[4878]: I1202 18:35:24.248929 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 02 18:35:24 crc kubenswrapper[4878]: I1202 18:35:24.677842 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"427bac6f-5bf8-4f40-a0f6-fea0cede315f","Type":"ContainerStarted","Data":"da5cfec1ed48a52eae8e2bf395513f580403ac345b6d78aa94f8febdcb4769c1"} Dec 02 18:35:25 crc kubenswrapper[4878]: I1202 18:35:25.145146 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 18:35:25 crc kubenswrapper[4878]: I1202 18:35:25.146488 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 18:35:25 crc kubenswrapper[4878]: I1202 18:35:25.148067 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-dndfw" Dec 02 18:35:25 crc kubenswrapper[4878]: I1202 18:35:25.192802 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 18:35:25 crc kubenswrapper[4878]: I1202 18:35:25.254089 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hlzc\" (UniqueName: \"kubernetes.io/projected/3bc2339a-313d-485a-b67f-d18b597c36e5-kube-api-access-5hlzc\") pod \"kube-state-metrics-0\" (UID: \"3bc2339a-313d-485a-b67f-d18b597c36e5\") " pod="openstack/kube-state-metrics-0" Dec 02 18:35:25 crc kubenswrapper[4878]: I1202 18:35:25.357921 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hlzc\" (UniqueName: \"kubernetes.io/projected/3bc2339a-313d-485a-b67f-d18b597c36e5-kube-api-access-5hlzc\") pod \"kube-state-metrics-0\" (UID: \"3bc2339a-313d-485a-b67f-d18b597c36e5\") " pod="openstack/kube-state-metrics-0" Dec 02 18:35:25 crc kubenswrapper[4878]: I1202 18:35:25.416500 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hlzc\" (UniqueName: \"kubernetes.io/projected/3bc2339a-313d-485a-b67f-d18b597c36e5-kube-api-access-5hlzc\") pod \"kube-state-metrics-0\" (UID: \"3bc2339a-313d-485a-b67f-d18b597c36e5\") " pod="openstack/kube-state-metrics-0" Dec 02 
18:35:25 crc kubenswrapper[4878]: I1202 18:35:25.495087 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 18:35:25 crc kubenswrapper[4878]: I1202 18:35:25.948457 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zhq9q"] Dec 02 18:35:25 crc kubenswrapper[4878]: I1202 18:35:25.949837 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zhq9q" Dec 02 18:35:25 crc kubenswrapper[4878]: I1202 18:35:25.955600 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Dec 02 18:35:25 crc kubenswrapper[4878]: I1202 18:35:25.955883 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-bmdsg" Dec 02 18:35:25 crc kubenswrapper[4878]: I1202 18:35:25.965417 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zhq9q"] Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.090548 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m58q7\" (UniqueName: \"kubernetes.io/projected/528e5b70-2773-48c2-8382-d4e2ec45933d-kube-api-access-m58q7\") pod \"observability-ui-dashboards-7d5fb4cbfb-zhq9q\" (UID: \"528e5b70-2773-48c2-8382-d4e2ec45933d\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zhq9q" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.090610 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/528e5b70-2773-48c2-8382-d4e2ec45933d-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-zhq9q\" (UID: \"528e5b70-2773-48c2-8382-d4e2ec45933d\") " 
pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zhq9q" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.203751 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m58q7\" (UniqueName: \"kubernetes.io/projected/528e5b70-2773-48c2-8382-d4e2ec45933d-kube-api-access-m58q7\") pod \"observability-ui-dashboards-7d5fb4cbfb-zhq9q\" (UID: \"528e5b70-2773-48c2-8382-d4e2ec45933d\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zhq9q" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.204109 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/528e5b70-2773-48c2-8382-d4e2ec45933d-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-zhq9q\" (UID: \"528e5b70-2773-48c2-8382-d4e2ec45933d\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zhq9q" Dec 02 18:35:26 crc kubenswrapper[4878]: E1202 18:35:26.204334 4878 secret.go:188] Couldn't get secret openshift-operators/observability-ui-dashboards: secret "observability-ui-dashboards" not found Dec 02 18:35:26 crc kubenswrapper[4878]: E1202 18:35:26.204396 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/528e5b70-2773-48c2-8382-d4e2ec45933d-serving-cert podName:528e5b70-2773-48c2-8382-d4e2ec45933d nodeName:}" failed. No retries permitted until 2025-12-02 18:35:26.704378263 +0000 UTC m=+1236.393997144 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/528e5b70-2773-48c2-8382-d4e2ec45933d-serving-cert") pod "observability-ui-dashboards-7d5fb4cbfb-zhq9q" (UID: "528e5b70-2773-48c2-8382-d4e2ec45933d") : secret "observability-ui-dashboards" not found Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.274721 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m58q7\" (UniqueName: \"kubernetes.io/projected/528e5b70-2773-48c2-8382-d4e2ec45933d-kube-api-access-m58q7\") pod \"observability-ui-dashboards-7d5fb4cbfb-zhq9q\" (UID: \"528e5b70-2773-48c2-8382-d4e2ec45933d\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zhq9q" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.340624 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-57568cf57-qq2hl"] Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.345972 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.371871 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57568cf57-qq2hl"] Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.386644 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.390154 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.402631 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.402843 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.402969 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.403212 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.403434 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-zwpt2" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.404314 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.412176 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/409d93c8-2950-4081-b5c8-9c3435e28449-console-config\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.412267 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/409d93c8-2950-4081-b5c8-9c3435e28449-console-serving-cert\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc 
kubenswrapper[4878]: I1202 18:35:26.412292 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/409d93c8-2950-4081-b5c8-9c3435e28449-trusted-ca-bundle\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.412325 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cvjs\" (UniqueName: \"kubernetes.io/projected/409d93c8-2950-4081-b5c8-9c3435e28449-kube-api-access-7cvjs\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.412415 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/409d93c8-2950-4081-b5c8-9c3435e28449-console-oauth-config\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.412451 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/409d93c8-2950-4081-b5c8-9c3435e28449-service-ca\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.412470 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/409d93c8-2950-4081-b5c8-9c3435e28449-oauth-serving-cert\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " 
pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.418794 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.514501 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/80ade876-344b-415c-9609-6477205860c9-config\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.514570 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/409d93c8-2950-4081-b5c8-9c3435e28449-console-config\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.514595 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/409d93c8-2950-4081-b5c8-9c3435e28449-console-serving-cert\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.514608 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/409d93c8-2950-4081-b5c8-9c3435e28449-trusted-ca-bundle\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.514635 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cvjs\" (UniqueName: 
\"kubernetes.io/projected/409d93c8-2950-4081-b5c8-9c3435e28449-kube-api-access-7cvjs\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.514671 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80ade876-344b-415c-9609-6477205860c9-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.514700 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.514729 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80ade876-344b-415c-9609-6477205860c9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.514765 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80ade876-344b-415c-9609-6477205860c9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.514783 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80ade876-344b-415c-9609-6477205860c9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.514820 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/409d93c8-2950-4081-b5c8-9c3435e28449-console-oauth-config\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.514855 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/409d93c8-2950-4081-b5c8-9c3435e28449-service-ca\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.514873 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/409d93c8-2950-4081-b5c8-9c3435e28449-oauth-serving-cert\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.514918 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz42x\" (UniqueName: \"kubernetes.io/projected/80ade876-344b-415c-9609-6477205860c9-kube-api-access-lz42x\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.514935 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80ade876-344b-415c-9609-6477205860c9-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.516984 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/409d93c8-2950-4081-b5c8-9c3435e28449-console-config\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.518601 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/409d93c8-2950-4081-b5c8-9c3435e28449-oauth-serving-cert\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.524316 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/409d93c8-2950-4081-b5c8-9c3435e28449-trusted-ca-bundle\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.525341 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/409d93c8-2950-4081-b5c8-9c3435e28449-service-ca\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.534465 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/409d93c8-2950-4081-b5c8-9c3435e28449-console-oauth-config\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.552035 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cvjs\" (UniqueName: \"kubernetes.io/projected/409d93c8-2950-4081-b5c8-9c3435e28449-kube-api-access-7cvjs\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.561201 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/409d93c8-2950-4081-b5c8-9c3435e28449-console-serving-cert\") pod \"console-57568cf57-qq2hl\" (UID: \"409d93c8-2950-4081-b5c8-9c3435e28449\") " pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.616647 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz42x\" (UniqueName: \"kubernetes.io/projected/80ade876-344b-415c-9609-6477205860c9-kube-api-access-lz42x\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.616711 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80ade876-344b-415c-9609-6477205860c9-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.616798 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/80ade876-344b-415c-9609-6477205860c9-config\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.616924 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80ade876-344b-415c-9609-6477205860c9-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.616958 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.617032 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80ade876-344b-415c-9609-6477205860c9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.617094 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80ade876-344b-415c-9609-6477205860c9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.617115 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/80ade876-344b-415c-9609-6477205860c9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.619678 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80ade876-344b-415c-9609-6477205860c9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.619849 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.624985 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/80ade876-344b-415c-9609-6477205860c9-config\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.626765 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80ade876-344b-415c-9609-6477205860c9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.627023 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80ade876-344b-415c-9609-6477205860c9-config-out\") 
pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.628830 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80ade876-344b-415c-9609-6477205860c9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.639302 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80ade876-344b-415c-9609-6477205860c9-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.641823 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz42x\" (UniqueName: \"kubernetes.io/projected/80ade876-344b-415c-9609-6477205860c9-kube-api-access-lz42x\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.684665 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.713926 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.718848 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/528e5b70-2773-48c2-8382-d4e2ec45933d-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-zhq9q\" (UID: \"528e5b70-2773-48c2-8382-d4e2ec45933d\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zhq9q" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.725354 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/528e5b70-2773-48c2-8382-d4e2ec45933d-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-zhq9q\" (UID: \"528e5b70-2773-48c2-8382-d4e2ec45933d\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zhq9q" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.736950 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 18:35:26 crc kubenswrapper[4878]: I1202 18:35:26.911882 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zhq9q" Dec 02 18:35:28 crc kubenswrapper[4878]: I1202 18:35:28.879035 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qgmlw"] Dec 02 18:35:28 crc kubenswrapper[4878]: I1202 18:35:28.927191 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qgmlw"] Dec 02 18:35:28 crc kubenswrapper[4878]: I1202 18:35:28.927930 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:28 crc kubenswrapper[4878]: I1202 18:35:28.933761 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 02 18:35:28 crc kubenswrapper[4878]: I1202 18:35:28.936586 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 02 18:35:28 crc kubenswrapper[4878]: I1202 18:35:28.936663 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-rk25l" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.010313 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-dsnc6"] Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.013351 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dsnc6"] Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.013458 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.086305 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f049ebe-547b-40a2-8468-932cfc5051ea-var-run-ovn\") pod \"ovn-controller-qgmlw\" (UID: \"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.086457 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f049ebe-547b-40a2-8468-932cfc5051ea-combined-ca-bundle\") pod \"ovn-controller-qgmlw\" (UID: \"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.086493 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f1f92026-f0b1-470f-885e-914fece7f4e3-var-run\") pod \"ovn-controller-ovs-dsnc6\" (UID: \"f1f92026-f0b1-470f-885e-914fece7f4e3\") " pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.086549 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1f049ebe-547b-40a2-8468-932cfc5051ea-var-run\") pod \"ovn-controller-qgmlw\" (UID: \"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.086601 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98jhk\" (UniqueName: \"kubernetes.io/projected/f1f92026-f0b1-470f-885e-914fece7f4e3-kube-api-access-98jhk\") pod \"ovn-controller-ovs-dsnc6\" (UID: \"f1f92026-f0b1-470f-885e-914fece7f4e3\") " 
pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.086632 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1f92026-f0b1-470f-885e-914fece7f4e3-scripts\") pod \"ovn-controller-ovs-dsnc6\" (UID: \"f1f92026-f0b1-470f-885e-914fece7f4e3\") " pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.086651 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f1f92026-f0b1-470f-885e-914fece7f4e3-etc-ovs\") pod \"ovn-controller-ovs-dsnc6\" (UID: \"f1f92026-f0b1-470f-885e-914fece7f4e3\") " pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.086672 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f049ebe-547b-40a2-8468-932cfc5051ea-scripts\") pod \"ovn-controller-qgmlw\" (UID: \"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.086710 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fk5t\" (UniqueName: \"kubernetes.io/projected/1f049ebe-547b-40a2-8468-932cfc5051ea-kube-api-access-7fk5t\") pod \"ovn-controller-qgmlw\" (UID: \"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.086734 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1f049ebe-547b-40a2-8468-932cfc5051ea-var-log-ovn\") pod \"ovn-controller-qgmlw\" (UID: \"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 
crc kubenswrapper[4878]: I1202 18:35:29.086760 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f1f92026-f0b1-470f-885e-914fece7f4e3-var-lib\") pod \"ovn-controller-ovs-dsnc6\" (UID: \"f1f92026-f0b1-470f-885e-914fece7f4e3\") " pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.086781 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f049ebe-547b-40a2-8468-932cfc5051ea-ovn-controller-tls-certs\") pod \"ovn-controller-qgmlw\" (UID: \"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.086797 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f1f92026-f0b1-470f-885e-914fece7f4e3-var-log\") pod \"ovn-controller-ovs-dsnc6\" (UID: \"f1f92026-f0b1-470f-885e-914fece7f4e3\") " pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.189728 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f049ebe-547b-40a2-8468-932cfc5051ea-var-run-ovn\") pod \"ovn-controller-qgmlw\" (UID: \"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.189846 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f049ebe-547b-40a2-8468-932cfc5051ea-combined-ca-bundle\") pod \"ovn-controller-qgmlw\" (UID: \"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.189879 4878 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f1f92026-f0b1-470f-885e-914fece7f4e3-var-run\") pod \"ovn-controller-ovs-dsnc6\" (UID: \"f1f92026-f0b1-470f-885e-914fece7f4e3\") " pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.189901 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1f049ebe-547b-40a2-8468-932cfc5051ea-var-run\") pod \"ovn-controller-qgmlw\" (UID: \"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.189939 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98jhk\" (UniqueName: \"kubernetes.io/projected/f1f92026-f0b1-470f-885e-914fece7f4e3-kube-api-access-98jhk\") pod \"ovn-controller-ovs-dsnc6\" (UID: \"f1f92026-f0b1-470f-885e-914fece7f4e3\") " pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.189965 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1f92026-f0b1-470f-885e-914fece7f4e3-scripts\") pod \"ovn-controller-ovs-dsnc6\" (UID: \"f1f92026-f0b1-470f-885e-914fece7f4e3\") " pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.189987 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f1f92026-f0b1-470f-885e-914fece7f4e3-etc-ovs\") pod \"ovn-controller-ovs-dsnc6\" (UID: \"f1f92026-f0b1-470f-885e-914fece7f4e3\") " pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.190016 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/1f049ebe-547b-40a2-8468-932cfc5051ea-scripts\") pod \"ovn-controller-qgmlw\" (UID: \"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.190057 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fk5t\" (UniqueName: \"kubernetes.io/projected/1f049ebe-547b-40a2-8468-932cfc5051ea-kube-api-access-7fk5t\") pod \"ovn-controller-qgmlw\" (UID: \"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.190106 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1f049ebe-547b-40a2-8468-932cfc5051ea-var-log-ovn\") pod \"ovn-controller-qgmlw\" (UID: \"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.190148 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f1f92026-f0b1-470f-885e-914fece7f4e3-var-lib\") pod \"ovn-controller-ovs-dsnc6\" (UID: \"f1f92026-f0b1-470f-885e-914fece7f4e3\") " pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.190180 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f049ebe-547b-40a2-8468-932cfc5051ea-ovn-controller-tls-certs\") pod \"ovn-controller-qgmlw\" (UID: \"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.190203 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f1f92026-f0b1-470f-885e-914fece7f4e3-var-log\") pod \"ovn-controller-ovs-dsnc6\" (UID: 
\"f1f92026-f0b1-470f-885e-914fece7f4e3\") " pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.190708 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f049ebe-547b-40a2-8468-932cfc5051ea-var-run-ovn\") pod \"ovn-controller-qgmlw\" (UID: \"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.190778 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f1f92026-f0b1-470f-885e-914fece7f4e3-etc-ovs\") pod \"ovn-controller-ovs-dsnc6\" (UID: \"f1f92026-f0b1-470f-885e-914fece7f4e3\") " pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.190849 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f1f92026-f0b1-470f-885e-914fece7f4e3-var-log\") pod \"ovn-controller-ovs-dsnc6\" (UID: \"f1f92026-f0b1-470f-885e-914fece7f4e3\") " pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.190894 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f1f92026-f0b1-470f-885e-914fece7f4e3-var-lib\") pod \"ovn-controller-ovs-dsnc6\" (UID: \"f1f92026-f0b1-470f-885e-914fece7f4e3\") " pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.190997 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1f049ebe-547b-40a2-8468-932cfc5051ea-var-log-ovn\") pod \"ovn-controller-qgmlw\" (UID: \"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.191570 4878 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f1f92026-f0b1-470f-885e-914fece7f4e3-var-run\") pod \"ovn-controller-ovs-dsnc6\" (UID: \"f1f92026-f0b1-470f-885e-914fece7f4e3\") " pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.191601 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1f049ebe-547b-40a2-8468-932cfc5051ea-var-run\") pod \"ovn-controller-qgmlw\" (UID: \"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.193493 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f049ebe-547b-40a2-8468-932cfc5051ea-scripts\") pod \"ovn-controller-qgmlw\" (UID: \"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.193630 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1f92026-f0b1-470f-885e-914fece7f4e3-scripts\") pod \"ovn-controller-ovs-dsnc6\" (UID: \"f1f92026-f0b1-470f-885e-914fece7f4e3\") " pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.213404 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f049ebe-547b-40a2-8468-932cfc5051ea-combined-ca-bundle\") pod \"ovn-controller-qgmlw\" (UID: \"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.213451 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f049ebe-547b-40a2-8468-932cfc5051ea-ovn-controller-tls-certs\") pod \"ovn-controller-qgmlw\" (UID: 
\"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.213975 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fk5t\" (UniqueName: \"kubernetes.io/projected/1f049ebe-547b-40a2-8468-932cfc5051ea-kube-api-access-7fk5t\") pod \"ovn-controller-qgmlw\" (UID: \"1f049ebe-547b-40a2-8468-932cfc5051ea\") " pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.223312 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98jhk\" (UniqueName: \"kubernetes.io/projected/f1f92026-f0b1-470f-885e-914fece7f4e3-kube-api-access-98jhk\") pod \"ovn-controller-ovs-dsnc6\" (UID: \"f1f92026-f0b1-470f-885e-914fece7f4e3\") " pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.267107 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qgmlw" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.340378 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.752887 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.755297 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.759555 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.759733 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.761161 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.761437 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-hkgd9" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.761671 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.791922 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.908792 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk9c8\" (UniqueName: \"kubernetes.io/projected/a6aad750-71cc-4815-906a-5f2a130875e8-kube-api-access-rk9c8\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.908850 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6aad750-71cc-4815-906a-5f2a130875e8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.908916 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6aad750-71cc-4815-906a-5f2a130875e8-config\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.908954 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a6aad750-71cc-4815-906a-5f2a130875e8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.909853 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6aad750-71cc-4815-906a-5f2a130875e8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.910157 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6aad750-71cc-4815-906a-5f2a130875e8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.910337 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:29 crc kubenswrapper[4878]: I1202 18:35:29.910483 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a6aad750-71cc-4815-906a-5f2a130875e8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:30 crc kubenswrapper[4878]: I1202 18:35:30.012456 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6aad750-71cc-4815-906a-5f2a130875e8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:30 crc kubenswrapper[4878]: I1202 18:35:30.012518 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:30 crc kubenswrapper[4878]: I1202 18:35:30.012541 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6aad750-71cc-4815-906a-5f2a130875e8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:30 crc kubenswrapper[4878]: I1202 18:35:30.012578 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk9c8\" (UniqueName: \"kubernetes.io/projected/a6aad750-71cc-4815-906a-5f2a130875e8-kube-api-access-rk9c8\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:30 crc kubenswrapper[4878]: I1202 18:35:30.012598 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6aad750-71cc-4815-906a-5f2a130875e8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " 
pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:30 crc kubenswrapper[4878]: I1202 18:35:30.012642 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6aad750-71cc-4815-906a-5f2a130875e8-config\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:30 crc kubenswrapper[4878]: I1202 18:35:30.012690 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a6aad750-71cc-4815-906a-5f2a130875e8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:30 crc kubenswrapper[4878]: I1202 18:35:30.012777 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6aad750-71cc-4815-906a-5f2a130875e8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:30 crc kubenswrapper[4878]: I1202 18:35:30.014691 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:30 crc kubenswrapper[4878]: I1202 18:35:30.014872 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6aad750-71cc-4815-906a-5f2a130875e8-config\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:30 crc kubenswrapper[4878]: I1202 18:35:30.014920 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/a6aad750-71cc-4815-906a-5f2a130875e8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:30 crc kubenswrapper[4878]: I1202 18:35:30.015735 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6aad750-71cc-4815-906a-5f2a130875e8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:30 crc kubenswrapper[4878]: I1202 18:35:30.029132 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6aad750-71cc-4815-906a-5f2a130875e8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:30 crc kubenswrapper[4878]: I1202 18:35:30.029568 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6aad750-71cc-4815-906a-5f2a130875e8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:30 crc kubenswrapper[4878]: I1202 18:35:30.030172 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6aad750-71cc-4815-906a-5f2a130875e8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:30 crc kubenswrapper[4878]: I1202 18:35:30.041785 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk9c8\" (UniqueName: \"kubernetes.io/projected/a6aad750-71cc-4815-906a-5f2a130875e8-kube-api-access-rk9c8\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " 
pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:30 crc kubenswrapper[4878]: I1202 18:35:30.051069 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a6aad750-71cc-4815-906a-5f2a130875e8\") " pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:30 crc kubenswrapper[4878]: I1202 18:35:30.089933 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.456150 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.457964 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.460381 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-lnhpf" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.460782 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.460950 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.461861 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.475249 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.597312 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-scripts\") pod \"ovsdbserver-sb-0\" 
(UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.597417 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.597484 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.597515 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.597547 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55xk6\" (UniqueName: \"kubernetes.io/projected/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-kube-api-access-55xk6\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.597608 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " 
pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.597658 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-config\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.597748 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.699696 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.699750 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-config\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.699838 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.700816 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.700867 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.700996 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.701035 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.701073 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55xk6\" (UniqueName: \"kubernetes.io/projected/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-kube-api-access-55xk6\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.701686 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.701810 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.702877 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-config\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.704131 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.723911 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.729968 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.733901 4878 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.735380 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55xk6\" (UniqueName: \"kubernetes.io/projected/3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4-kube-api-access-55xk6\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.749855 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4\") " pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:32 crc kubenswrapper[4878]: I1202 18:35:32.795467 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 02 18:35:46 crc kubenswrapper[4878]: I1202 18:35:46.752431 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 18:35:47 crc kubenswrapper[4878]: E1202 18:35:47.241197 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Dec 02 18:35:47 crc kubenswrapper[4878]: E1202 18:35:47.241445 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n57bhb4hf7h6bh74h64h665h9ch5dbhf9h68ch586hfch549hb9hfbhb5h5c4hbbh66ch665h5h54fh567h5d5h5fch67bh6fhc6h599h65fh64dq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,R
ecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h297c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(427bac6f-5bf8-4f40-a0f6-fea0cede315f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 18:35:47 crc kubenswrapper[4878]: E1202 18:35:47.242662 4878 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="427bac6f-5bf8-4f40-a0f6-fea0cede315f" Dec 02 18:35:48 crc kubenswrapper[4878]: E1202 18:35:48.248432 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="427bac6f-5bf8-4f40-a0f6-fea0cede315f" Dec 02 18:35:48 crc kubenswrapper[4878]: W1202 18:35:48.433355 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6aad750_71cc_4815_906a_5f2a130875e8.slice/crio-5539cc23495bb85c9138896835b021f9ad6cdd6c1d50f48b3cafd7c2693b5a67 WatchSource:0}: Error finding container 5539cc23495bb85c9138896835b021f9ad6cdd6c1d50f48b3cafd7c2693b5a67: Status 404 returned error can't find the container with id 5539cc23495bb85c9138896835b021f9ad6cdd6c1d50f48b3cafd7c2693b5a67 Dec 02 18:35:48 crc kubenswrapper[4878]: E1202 18:35:48.575860 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 18:35:48 crc kubenswrapper[4878]: E1202 18:35:48.576437 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vbvzs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-k9l8c_openstack(3eb57016-f541-49f8-92d4-4bc95a8a9396): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 18:35:48 crc kubenswrapper[4878]: E1202 18:35:48.576002 4878 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 18:35:48 crc kubenswrapper[4878]: E1202 18:35:48.576802 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8vc8c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesyst
em:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-54gkd_openstack(63448588-8af2-4397-981d-97ba5bf4170d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 18:35:48 crc kubenswrapper[4878]: E1202 18:35:48.579615 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-54gkd" podUID="63448588-8af2-4397-981d-97ba5bf4170d" Dec 02 18:35:48 crc kubenswrapper[4878]: E1202 18:35:48.579666 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-k9l8c" podUID="3eb57016-f541-49f8-92d4-4bc95a8a9396" Dec 02 18:35:49 crc kubenswrapper[4878]: E1202 18:35:49.098152 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 18:35:49 crc kubenswrapper[4878]: E1202 18:35:49.099056 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 
--log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tqnth,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-k8mqd_openstack(f7553ad2-b8a5-4794-951e-cbb20d1f0426): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 18:35:49 crc kubenswrapper[4878]: I1202 18:35:49.100344 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57568cf57-qq2hl"] Dec 02 
18:35:49 crc kubenswrapper[4878]: E1202 18:35:49.100374 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-k8mqd" podUID="f7553ad2-b8a5-4794-951e-cbb20d1f0426" Dec 02 18:35:49 crc kubenswrapper[4878]: I1202 18:35:49.298585 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57568cf57-qq2hl" event={"ID":"409d93c8-2950-4081-b5c8-9c3435e28449","Type":"ContainerStarted","Data":"2469ef007082ff82ac8165b151e166fab284bd82b9173b529fc946cbff08886a"} Dec 02 18:35:49 crc kubenswrapper[4878]: I1202 18:35:49.301883 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a6aad750-71cc-4815-906a-5f2a130875e8","Type":"ContainerStarted","Data":"5539cc23495bb85c9138896835b021f9ad6cdd6c1d50f48b3cafd7c2693b5a67"} Dec 02 18:35:49 crc kubenswrapper[4878]: E1202 18:35:49.305862 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-54gkd" podUID="63448588-8af2-4397-981d-97ba5bf4170d" Dec 02 18:35:49 crc kubenswrapper[4878]: I1202 18:35:49.810119 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qgmlw"] Dec 02 18:35:49 crc kubenswrapper[4878]: W1202 18:35:49.815992 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bc2339a_313d_485a_b67f_d18b597c36e5.slice/crio-11176b2624068f064077e9bf2f2268b44fe86d9cc335351bf51914d6031a3905 WatchSource:0}: Error finding container 11176b2624068f064077e9bf2f2268b44fe86d9cc335351bf51914d6031a3905: Status 404 returned error can't find the container with id 
11176b2624068f064077e9bf2f2268b44fe86d9cc335351bf51914d6031a3905 Dec 02 18:35:49 crc kubenswrapper[4878]: I1202 18:35:49.820504 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 18:35:49 crc kubenswrapper[4878]: I1202 18:35:49.834117 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.010484 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-k9l8c" Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.062372 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3eb57016-f541-49f8-92d4-4bc95a8a9396-dns-svc\") pod \"3eb57016-f541-49f8-92d4-4bc95a8a9396\" (UID: \"3eb57016-f541-49f8-92d4-4bc95a8a9396\") " Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.062467 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbvzs\" (UniqueName: \"kubernetes.io/projected/3eb57016-f541-49f8-92d4-4bc95a8a9396-kube-api-access-vbvzs\") pod \"3eb57016-f541-49f8-92d4-4bc95a8a9396\" (UID: \"3eb57016-f541-49f8-92d4-4bc95a8a9396\") " Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.062693 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb57016-f541-49f8-92d4-4bc95a8a9396-config\") pod \"3eb57016-f541-49f8-92d4-4bc95a8a9396\" (UID: \"3eb57016-f541-49f8-92d4-4bc95a8a9396\") " Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.064364 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3eb57016-f541-49f8-92d4-4bc95a8a9396-config" (OuterVolumeSpecName: "config") pod "3eb57016-f541-49f8-92d4-4bc95a8a9396" (UID: "3eb57016-f541-49f8-92d4-4bc95a8a9396"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.064793 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3eb57016-f541-49f8-92d4-4bc95a8a9396-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3eb57016-f541-49f8-92d4-4bc95a8a9396" (UID: "3eb57016-f541-49f8-92d4-4bc95a8a9396"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.071864 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb57016-f541-49f8-92d4-4bc95a8a9396-kube-api-access-vbvzs" (OuterVolumeSpecName: "kube-api-access-vbvzs") pod "3eb57016-f541-49f8-92d4-4bc95a8a9396" (UID: "3eb57016-f541-49f8-92d4-4bc95a8a9396"). InnerVolumeSpecName "kube-api-access-vbvzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.132851 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-k8mqd" Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.165180 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqnth\" (UniqueName: \"kubernetes.io/projected/f7553ad2-b8a5-4794-951e-cbb20d1f0426-kube-api-access-tqnth\") pod \"f7553ad2-b8a5-4794-951e-cbb20d1f0426\" (UID: \"f7553ad2-b8a5-4794-951e-cbb20d1f0426\") " Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.165462 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7553ad2-b8a5-4794-951e-cbb20d1f0426-config\") pod \"f7553ad2-b8a5-4794-951e-cbb20d1f0426\" (UID: \"f7553ad2-b8a5-4794-951e-cbb20d1f0426\") " Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.166098 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbvzs\" (UniqueName: \"kubernetes.io/projected/3eb57016-f541-49f8-92d4-4bc95a8a9396-kube-api-access-vbvzs\") on node \"crc\" DevicePath \"\"" Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.166134 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb57016-f541-49f8-92d4-4bc95a8a9396-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.166152 4878 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3eb57016-f541-49f8-92d4-4bc95a8a9396-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.166141 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7553ad2-b8a5-4794-951e-cbb20d1f0426-config" (OuterVolumeSpecName: "config") pod "f7553ad2-b8a5-4794-951e-cbb20d1f0426" (UID: "f7553ad2-b8a5-4794-951e-cbb20d1f0426"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.207048 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zhq9q"] Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.267267 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7553ad2-b8a5-4794-951e-cbb20d1f0426-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.267662 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7553ad2-b8a5-4794-951e-cbb20d1f0426-kube-api-access-tqnth" (OuterVolumeSpecName: "kube-api-access-tqnth") pod "f7553ad2-b8a5-4794-951e-cbb20d1f0426" (UID: "f7553ad2-b8a5-4794-951e-cbb20d1f0426"). InnerVolumeSpecName "kube-api-access-tqnth". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:35:50 crc kubenswrapper[4878]: W1202 18:35:50.276339 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod528e5b70_2773_48c2_8382_d4e2ec45933d.slice/crio-8dd001f5b920e7127689a356a066a072ebb5533f04c1bcaa806b40e10beb094d WatchSource:0}: Error finding container 8dd001f5b920e7127689a356a066a072ebb5533f04c1bcaa806b40e10beb094d: Status 404 returned error can't find the container with id 8dd001f5b920e7127689a356a066a072ebb5533f04c1bcaa806b40e10beb094d Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.324044 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zhq9q" event={"ID":"528e5b70-2773-48c2-8382-d4e2ec45933d","Type":"ContainerStarted","Data":"8dd001f5b920e7127689a356a066a072ebb5533f04c1bcaa806b40e10beb094d"} Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.329275 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qgmlw" 
event={"ID":"1f049ebe-547b-40a2-8468-932cfc5051ea","Type":"ContainerStarted","Data":"c568240aba670013527aa73b02f9816b8d1ea6e7210be75595cf3d3fd3f83d23"} Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.330388 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-k9l8c" event={"ID":"3eb57016-f541-49f8-92d4-4bc95a8a9396","Type":"ContainerDied","Data":"1795f95a2d887108c9e52034fabf8bd31c8d64b0c99a3f7a25ee0dd06ca653c4"} Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.330775 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-k9l8c" Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.332453 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-k8mqd" event={"ID":"f7553ad2-b8a5-4794-951e-cbb20d1f0426","Type":"ContainerDied","Data":"13d16fc56c352239c23561700689d59aebed8f793cad2359aa718f7440fa946f"} Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.332554 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-k8mqd" Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.338802 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3bc2339a-313d-485a-b67f-d18b597c36e5","Type":"ContainerStarted","Data":"11176b2624068f064077e9bf2f2268b44fe86d9cc335351bf51914d6031a3905"} Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.339940 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80ade876-344b-415c-9609-6477205860c9","Type":"ContainerStarted","Data":"d58073eef0db504a0dd8d7cef1e0b92706f60a8d5fc6707879c6072624f1343d"} Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.369349 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqnth\" (UniqueName: \"kubernetes.io/projected/f7553ad2-b8a5-4794-951e-cbb20d1f0426-kube-api-access-tqnth\") on node \"crc\" DevicePath \"\"" Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.428471 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-k9l8c"] Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.441325 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-k9l8c"] Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.457060 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-k8mqd"] Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.464379 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-k8mqd"] Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.900934 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.964383 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eb57016-f541-49f8-92d4-4bc95a8a9396" 
path="/var/lib/kubelet/pods/3eb57016-f541-49f8-92d4-4bc95a8a9396/volumes" Dec 02 18:35:50 crc kubenswrapper[4878]: I1202 18:35:50.964923 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7553ad2-b8a5-4794-951e-cbb20d1f0426" path="/var/lib/kubelet/pods/f7553ad2-b8a5-4794-951e-cbb20d1f0426/volumes" Dec 02 18:35:51 crc kubenswrapper[4878]: E1202 18:35:51.024506 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 18:35:51 crc kubenswrapper[4878]: E1202 18:35:51.024650 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bfjcl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-j459b_openstack(3aab598f-4e89-4790-a425-ab7983be07c4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 18:35:51 crc kubenswrapper[4878]: E1202 18:35:51.025813 4878 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-j459b" podUID="3aab598f-4e89-4790-a425-ab7983be07c4" Dec 02 18:35:51 crc kubenswrapper[4878]: I1202 18:35:51.226478 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dsnc6"] Dec 02 18:35:51 crc kubenswrapper[4878]: I1202 18:35:51.349983 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c436c198-1049-416f-9ab7-33261ff55ab4","Type":"ContainerStarted","Data":"cefa40ebdf79dc053f7a2c0a46632692d9055cd3c27efa2e826e40bb4e3f6e14"} Dec 02 18:35:51 crc kubenswrapper[4878]: I1202 18:35:51.351734 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dsnc6" event={"ID":"f1f92026-f0b1-470f-885e-914fece7f4e3","Type":"ContainerStarted","Data":"ab46bc0409118ca378780711a7dcc36fda6274d7f65510377e4d68fca76d3781"} Dec 02 18:35:51 crc kubenswrapper[4878]: I1202 18:35:51.353212 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4","Type":"ContainerStarted","Data":"7fff0d82cc292b6da35f727ba0d686752e8842c39174a3ab835f428689ab5626"} Dec 02 18:35:51 crc kubenswrapper[4878]: E1202 18:35:51.357596 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-j459b" podUID="3aab598f-4e89-4790-a425-ab7983be07c4" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.376169 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"b8ce834c-073d-4062-b3ee-488fa79aae4f","Type":"ContainerStarted","Data":"33bf4bb167f90d4156d2bdcea9dab49b9f0ec8a779f82ec97a18a80fccfefc6a"} Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.386949 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57568cf57-qq2hl" event={"ID":"409d93c8-2950-4081-b5c8-9c3435e28449","Type":"ContainerStarted","Data":"5773588b304397e2b316e04764b9e4b1f999c9d3f534318a2a95daba78972cea"} Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.405167 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9cae31f5-acb4-423b-8a14-4136afb73062","Type":"ContainerStarted","Data":"3343d039a4f89cf2d5e147232e79fb0554d0b6800bd1406acc3d4ff5ca10c755"} Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.440417 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"168172d2-5cc8-492f-aa26-bd2a1351cdf2","Type":"ContainerStarted","Data":"f47b77d298db6e299054ba2f8bdad64cb6c8cb2bbfbcbc308213e7b1e3765941"} Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.528874 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-2pqpm"] Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.530571 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-2pqpm" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.541386 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-2pqpm"] Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.548119 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.650967 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f833875c-c0f5-4654-b592-14d4a6161df6-config\") pod \"ovn-controller-metrics-2pqpm\" (UID: \"f833875c-c0f5-4654-b592-14d4a6161df6\") " pod="openstack/ovn-controller-metrics-2pqpm" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.651146 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f833875c-c0f5-4654-b592-14d4a6161df6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-2pqpm\" (UID: \"f833875c-c0f5-4654-b592-14d4a6161df6\") " pod="openstack/ovn-controller-metrics-2pqpm" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.651266 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f833875c-c0f5-4654-b592-14d4a6161df6-combined-ca-bundle\") pod \"ovn-controller-metrics-2pqpm\" (UID: \"f833875c-c0f5-4654-b592-14d4a6161df6\") " pod="openstack/ovn-controller-metrics-2pqpm" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.651398 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9xvg\" (UniqueName: \"kubernetes.io/projected/f833875c-c0f5-4654-b592-14d4a6161df6-kube-api-access-v9xvg\") pod \"ovn-controller-metrics-2pqpm\" (UID: 
\"f833875c-c0f5-4654-b592-14d4a6161df6\") " pod="openstack/ovn-controller-metrics-2pqpm" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.651442 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f833875c-c0f5-4654-b592-14d4a6161df6-ovs-rundir\") pod \"ovn-controller-metrics-2pqpm\" (UID: \"f833875c-c0f5-4654-b592-14d4a6161df6\") " pod="openstack/ovn-controller-metrics-2pqpm" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.651469 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f833875c-c0f5-4654-b592-14d4a6161df6-ovn-rundir\") pod \"ovn-controller-metrics-2pqpm\" (UID: \"f833875c-c0f5-4654-b592-14d4a6161df6\") " pod="openstack/ovn-controller-metrics-2pqpm" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.688044 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-57568cf57-qq2hl" podStartSLOduration=26.688016169 podStartE2EDuration="26.688016169s" podCreationTimestamp="2025-12-02 18:35:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:35:52.6554655 +0000 UTC m=+1262.345084381" watchObservedRunningTime="2025-12-02 18:35:52.688016169 +0000 UTC m=+1262.377635050" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.735711 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-54gkd"] Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.760597 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f833875c-c0f5-4654-b592-14d4a6161df6-config\") pod \"ovn-controller-metrics-2pqpm\" (UID: \"f833875c-c0f5-4654-b592-14d4a6161df6\") " pod="openstack/ovn-controller-metrics-2pqpm" 
Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.760721 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f833875c-c0f5-4654-b592-14d4a6161df6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-2pqpm\" (UID: \"f833875c-c0f5-4654-b592-14d4a6161df6\") " pod="openstack/ovn-controller-metrics-2pqpm" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.760800 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f833875c-c0f5-4654-b592-14d4a6161df6-combined-ca-bundle\") pod \"ovn-controller-metrics-2pqpm\" (UID: \"f833875c-c0f5-4654-b592-14d4a6161df6\") " pod="openstack/ovn-controller-metrics-2pqpm" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.760851 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9xvg\" (UniqueName: \"kubernetes.io/projected/f833875c-c0f5-4654-b592-14d4a6161df6-kube-api-access-v9xvg\") pod \"ovn-controller-metrics-2pqpm\" (UID: \"f833875c-c0f5-4654-b592-14d4a6161df6\") " pod="openstack/ovn-controller-metrics-2pqpm" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.760874 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f833875c-c0f5-4654-b592-14d4a6161df6-ovs-rundir\") pod \"ovn-controller-metrics-2pqpm\" (UID: \"f833875c-c0f5-4654-b592-14d4a6161df6\") " pod="openstack/ovn-controller-metrics-2pqpm" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.760901 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f833875c-c0f5-4654-b592-14d4a6161df6-ovn-rundir\") pod \"ovn-controller-metrics-2pqpm\" (UID: \"f833875c-c0f5-4654-b592-14d4a6161df6\") " pod="openstack/ovn-controller-metrics-2pqpm" Dec 02 18:35:52 crc 
kubenswrapper[4878]: I1202 18:35:52.761291 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f833875c-c0f5-4654-b592-14d4a6161df6-ovn-rundir\") pod \"ovn-controller-metrics-2pqpm\" (UID: \"f833875c-c0f5-4654-b592-14d4a6161df6\") " pod="openstack/ovn-controller-metrics-2pqpm" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.761680 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f833875c-c0f5-4654-b592-14d4a6161df6-config\") pod \"ovn-controller-metrics-2pqpm\" (UID: \"f833875c-c0f5-4654-b592-14d4a6161df6\") " pod="openstack/ovn-controller-metrics-2pqpm" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.767653 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f833875c-c0f5-4654-b592-14d4a6161df6-ovs-rundir\") pod \"ovn-controller-metrics-2pqpm\" (UID: \"f833875c-c0f5-4654-b592-14d4a6161df6\") " pod="openstack/ovn-controller-metrics-2pqpm" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.775447 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f833875c-c0f5-4654-b592-14d4a6161df6-combined-ca-bundle\") pod \"ovn-controller-metrics-2pqpm\" (UID: \"f833875c-c0f5-4654-b592-14d4a6161df6\") " pod="openstack/ovn-controller-metrics-2pqpm" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.787951 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-96mh2"] Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.789865 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.790939 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f833875c-c0f5-4654-b592-14d4a6161df6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-2pqpm\" (UID: \"f833875c-c0f5-4654-b592-14d4a6161df6\") " pod="openstack/ovn-controller-metrics-2pqpm" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.792492 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.806013 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-96mh2"] Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.822052 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9xvg\" (UniqueName: \"kubernetes.io/projected/f833875c-c0f5-4654-b592-14d4a6161df6-kube-api-access-v9xvg\") pod \"ovn-controller-metrics-2pqpm\" (UID: \"f833875c-c0f5-4654-b592-14d4a6161df6\") " pod="openstack/ovn-controller-metrics-2pqpm" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.863729 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/216d315b-5b55-494e-9ead-19d081f50952-config\") pod \"dnsmasq-dns-7fd796d7df-96mh2\" (UID: \"216d315b-5b55-494e-9ead-19d081f50952\") " pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.863785 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/216d315b-5b55-494e-9ead-19d081f50952-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-96mh2\" (UID: \"216d315b-5b55-494e-9ead-19d081f50952\") " pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" Dec 02 
18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.863842 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbw7r\" (UniqueName: \"kubernetes.io/projected/216d315b-5b55-494e-9ead-19d081f50952-kube-api-access-kbw7r\") pod \"dnsmasq-dns-7fd796d7df-96mh2\" (UID: \"216d315b-5b55-494e-9ead-19d081f50952\") " pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.863908 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/216d315b-5b55-494e-9ead-19d081f50952-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-96mh2\" (UID: \"216d315b-5b55-494e-9ead-19d081f50952\") " pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.878791 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-2pqpm" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.953468 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-j459b"] Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.971483 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/216d315b-5b55-494e-9ead-19d081f50952-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-96mh2\" (UID: \"216d315b-5b55-494e-9ead-19d081f50952\") " pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.973654 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/216d315b-5b55-494e-9ead-19d081f50952-config\") pod \"dnsmasq-dns-7fd796d7df-96mh2\" (UID: \"216d315b-5b55-494e-9ead-19d081f50952\") " pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.973727 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/216d315b-5b55-494e-9ead-19d081f50952-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-96mh2\" (UID: \"216d315b-5b55-494e-9ead-19d081f50952\") " pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.973849 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbw7r\" (UniqueName: \"kubernetes.io/projected/216d315b-5b55-494e-9ead-19d081f50952-kube-api-access-kbw7r\") pod \"dnsmasq-dns-7fd796d7df-96mh2\" (UID: \"216d315b-5b55-494e-9ead-19d081f50952\") " pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.975922 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/216d315b-5b55-494e-9ead-19d081f50952-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-96mh2\" (UID: \"216d315b-5b55-494e-9ead-19d081f50952\") " pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.976129 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/216d315b-5b55-494e-9ead-19d081f50952-config\") pod \"dnsmasq-dns-7fd796d7df-96mh2\" (UID: \"216d315b-5b55-494e-9ead-19d081f50952\") " pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" Dec 02 18:35:52 crc kubenswrapper[4878]: I1202 18:35:52.979917 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/216d315b-5b55-494e-9ead-19d081f50952-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-96mh2\" (UID: \"216d315b-5b55-494e-9ead-19d081f50952\") " pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.014866 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2mkl9"] Dec 02 18:35:53 crc 
kubenswrapper[4878]: I1202 18:35:53.016965 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.017974 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbw7r\" (UniqueName: \"kubernetes.io/projected/216d315b-5b55-494e-9ead-19d081f50952-kube-api-access-kbw7r\") pod \"dnsmasq-dns-7fd796d7df-96mh2\" (UID: \"216d315b-5b55-494e-9ead-19d081f50952\") " pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.022687 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.082318 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2mkl9"] Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.180982 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-2mkl9\" (UID: \"13c6db70-cc96-462a-a27c-496e1041fcbb\") " pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.181056 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-config\") pod \"dnsmasq-dns-86db49b7ff-2mkl9\" (UID: \"13c6db70-cc96-462a-a27c-496e1041fcbb\") " pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.181602 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2zcj\" (UniqueName: \"kubernetes.io/projected/13c6db70-cc96-462a-a27c-496e1041fcbb-kube-api-access-n2zcj\") pod 
\"dnsmasq-dns-86db49b7ff-2mkl9\" (UID: \"13c6db70-cc96-462a-a27c-496e1041fcbb\") " pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.181768 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-2mkl9\" (UID: \"13c6db70-cc96-462a-a27c-496e1041fcbb\") " pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.181880 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-2mkl9\" (UID: \"13c6db70-cc96-462a-a27c-496e1041fcbb\") " pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.211857 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.284790 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2zcj\" (UniqueName: \"kubernetes.io/projected/13c6db70-cc96-462a-a27c-496e1041fcbb-kube-api-access-n2zcj\") pod \"dnsmasq-dns-86db49b7ff-2mkl9\" (UID: \"13c6db70-cc96-462a-a27c-496e1041fcbb\") " pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.284858 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-2mkl9\" (UID: \"13c6db70-cc96-462a-a27c-496e1041fcbb\") " pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.284912 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-2mkl9\" (UID: \"13c6db70-cc96-462a-a27c-496e1041fcbb\") " pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.284959 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-2mkl9\" (UID: \"13c6db70-cc96-462a-a27c-496e1041fcbb\") " pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.284994 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-config\") pod \"dnsmasq-dns-86db49b7ff-2mkl9\" (UID: \"13c6db70-cc96-462a-a27c-496e1041fcbb\") " pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" 
Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.286687 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-config\") pod \"dnsmasq-dns-86db49b7ff-2mkl9\" (UID: \"13c6db70-cc96-462a-a27c-496e1041fcbb\") " pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.286780 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-2mkl9\" (UID: \"13c6db70-cc96-462a-a27c-496e1041fcbb\") " pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.287555 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-2mkl9\" (UID: \"13c6db70-cc96-462a-a27c-496e1041fcbb\") " pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.288070 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-2mkl9\" (UID: \"13c6db70-cc96-462a-a27c-496e1041fcbb\") " pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.304596 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2zcj\" (UniqueName: \"kubernetes.io/projected/13c6db70-cc96-462a-a27c-496e1041fcbb-kube-api-access-n2zcj\") pod \"dnsmasq-dns-86db49b7ff-2mkl9\" (UID: \"13c6db70-cc96-462a-a27c-496e1041fcbb\") " pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.319370 4878 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-54gkd" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.387074 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63448588-8af2-4397-981d-97ba5bf4170d-config\") pod \"63448588-8af2-4397-981d-97ba5bf4170d\" (UID: \"63448588-8af2-4397-981d-97ba5bf4170d\") " Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.389413 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63448588-8af2-4397-981d-97ba5bf4170d-dns-svc\") pod \"63448588-8af2-4397-981d-97ba5bf4170d\" (UID: \"63448588-8af2-4397-981d-97ba5bf4170d\") " Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.389879 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vc8c\" (UniqueName: \"kubernetes.io/projected/63448588-8af2-4397-981d-97ba5bf4170d-kube-api-access-8vc8c\") pod \"63448588-8af2-4397-981d-97ba5bf4170d\" (UID: \"63448588-8af2-4397-981d-97ba5bf4170d\") " Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.387725 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63448588-8af2-4397-981d-97ba5bf4170d-config" (OuterVolumeSpecName: "config") pod "63448588-8af2-4397-981d-97ba5bf4170d" (UID: "63448588-8af2-4397-981d-97ba5bf4170d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.390092 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63448588-8af2-4397-981d-97ba5bf4170d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "63448588-8af2-4397-981d-97ba5bf4170d" (UID: "63448588-8af2-4397-981d-97ba5bf4170d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.390932 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63448588-8af2-4397-981d-97ba5bf4170d-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.391051 4878 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63448588-8af2-4397-981d-97ba5bf4170d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.400922 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.463738 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-54gkd" event={"ID":"63448588-8af2-4397-981d-97ba5bf4170d","Type":"ContainerDied","Data":"3aff425cd3d4d897d7d028893fb97040958e8426df2ab532e5d6bf1f1b6a23a8"} Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.463904 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-54gkd" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.482514 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63448588-8af2-4397-981d-97ba5bf4170d-kube-api-access-8vc8c" (OuterVolumeSpecName: "kube-api-access-8vc8c") pod "63448588-8af2-4397-981d-97ba5bf4170d" (UID: "63448588-8af2-4397-981d-97ba5bf4170d"). InnerVolumeSpecName "kube-api-access-8vc8c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.493599 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vc8c\" (UniqueName: \"kubernetes.io/projected/63448588-8af2-4397-981d-97ba5bf4170d-kube-api-access-8vc8c\") on node \"crc\" DevicePath \"\"" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.743457 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.743522 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.743573 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.744763 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"26923c15a81965f0afaf8fe206a0c93db8beb9097433b36b1189c363d7056d26"} pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.744819 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" 
containerName="machine-config-daemon" containerID="cri-o://26923c15a81965f0afaf8fe206a0c93db8beb9097433b36b1189c363d7056d26" gracePeriod=600 Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.795769 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-j459b" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.802108 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3aab598f-4e89-4790-a425-ab7983be07c4-dns-svc\") pod \"3aab598f-4e89-4790-a425-ab7983be07c4\" (UID: \"3aab598f-4e89-4790-a425-ab7983be07c4\") " Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.802302 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfjcl\" (UniqueName: \"kubernetes.io/projected/3aab598f-4e89-4790-a425-ab7983be07c4-kube-api-access-bfjcl\") pod \"3aab598f-4e89-4790-a425-ab7983be07c4\" (UID: \"3aab598f-4e89-4790-a425-ab7983be07c4\") " Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.802371 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aab598f-4e89-4790-a425-ab7983be07c4-config\") pod \"3aab598f-4e89-4790-a425-ab7983be07c4\" (UID: \"3aab598f-4e89-4790-a425-ab7983be07c4\") " Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.803402 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aab598f-4e89-4790-a425-ab7983be07c4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3aab598f-4e89-4790-a425-ab7983be07c4" (UID: "3aab598f-4e89-4790-a425-ab7983be07c4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.809967 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aab598f-4e89-4790-a425-ab7983be07c4-kube-api-access-bfjcl" (OuterVolumeSpecName: "kube-api-access-bfjcl") pod "3aab598f-4e89-4790-a425-ab7983be07c4" (UID: "3aab598f-4e89-4790-a425-ab7983be07c4"). InnerVolumeSpecName "kube-api-access-bfjcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.810195 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aab598f-4e89-4790-a425-ab7983be07c4-config" (OuterVolumeSpecName: "config") pod "3aab598f-4e89-4790-a425-ab7983be07c4" (UID: "3aab598f-4e89-4790-a425-ab7983be07c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.888016 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-2pqpm"] Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.914628 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2mkl9"] Dec 02 18:35:53 crc kubenswrapper[4878]: W1202 18:35:53.917942 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf833875c_c0f5_4654_b592_14d4a6161df6.slice/crio-b899cf4c7e1f996dec6c8ab59a456815e5ec236c77be0767232a50a093523ff0 WatchSource:0}: Error finding container b899cf4c7e1f996dec6c8ab59a456815e5ec236c77be0767232a50a093523ff0: Status 404 returned error can't find the container with id b899cf4c7e1f996dec6c8ab59a456815e5ec236c77be0767232a50a093523ff0 Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.918924 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfjcl\" (UniqueName: 
\"kubernetes.io/projected/3aab598f-4e89-4790-a425-ab7983be07c4-kube-api-access-bfjcl\") on node \"crc\" DevicePath \"\"" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.919025 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aab598f-4e89-4790-a425-ab7983be07c4-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:35:53 crc kubenswrapper[4878]: I1202 18:35:53.919084 4878 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3aab598f-4e89-4790-a425-ab7983be07c4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 18:35:53 crc kubenswrapper[4878]: W1202 18:35:53.928426 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13c6db70_cc96_462a_a27c_496e1041fcbb.slice/crio-c620c3ebefe5d7e8da71bc4ae65301e1fc0da4d04116a446481e9b15da75302d WatchSource:0}: Error finding container c620c3ebefe5d7e8da71bc4ae65301e1fc0da4d04116a446481e9b15da75302d: Status 404 returned error can't find the container with id c620c3ebefe5d7e8da71bc4ae65301e1fc0da4d04116a446481e9b15da75302d Dec 02 18:35:54 crc kubenswrapper[4878]: W1202 18:35:54.041027 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod216d315b_5b55_494e_9ead_19d081f50952.slice/crio-486eef724fae0d574448ea643f63ae740ffbe218fcfbbf527e784eeded8cb595 WatchSource:0}: Error finding container 486eef724fae0d574448ea643f63ae740ffbe218fcfbbf527e784eeded8cb595: Status 404 returned error can't find the container with id 486eef724fae0d574448ea643f63ae740ffbe218fcfbbf527e784eeded8cb595 Dec 02 18:35:54 crc kubenswrapper[4878]: I1202 18:35:54.055362 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-54gkd"] Dec 02 18:35:54 crc kubenswrapper[4878]: I1202 18:35:54.100919 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-5ccc8479f9-54gkd"] Dec 02 18:35:54 crc kubenswrapper[4878]: I1202 18:35:54.125025 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-96mh2"] Dec 02 18:35:54 crc kubenswrapper[4878]: I1202 18:35:54.481723 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-j459b" event={"ID":"3aab598f-4e89-4790-a425-ab7983be07c4","Type":"ContainerDied","Data":"055c534b6dfd85cdd507b06e1da65688365dae2dbac6c54820c1babfc239f8cd"} Dec 02 18:35:54 crc kubenswrapper[4878]: I1202 18:35:54.481768 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-j459b" Dec 02 18:35:54 crc kubenswrapper[4878]: I1202 18:35:54.484183 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" event={"ID":"216d315b-5b55-494e-9ead-19d081f50952","Type":"ContainerStarted","Data":"486eef724fae0d574448ea643f63ae740ffbe218fcfbbf527e784eeded8cb595"} Dec 02 18:35:54 crc kubenswrapper[4878]: I1202 18:35:54.487811 4878 generic.go:334] "Generic (PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" containerID="26923c15a81965f0afaf8fe206a0c93db8beb9097433b36b1189c363d7056d26" exitCode=0 Dec 02 18:35:54 crc kubenswrapper[4878]: I1202 18:35:54.487859 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"26923c15a81965f0afaf8fe206a0c93db8beb9097433b36b1189c363d7056d26"} Dec 02 18:35:54 crc kubenswrapper[4878]: I1202 18:35:54.487914 4878 scope.go:117] "RemoveContainer" containerID="e692d5eeca0be5391ffb074305f4ee4fcb35693cb015b4c1a01e012767df5a57" Dec 02 18:35:54 crc kubenswrapper[4878]: I1202 18:35:54.490091 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-2pqpm" 
event={"ID":"f833875c-c0f5-4654-b592-14d4a6161df6","Type":"ContainerStarted","Data":"b899cf4c7e1f996dec6c8ab59a456815e5ec236c77be0767232a50a093523ff0"} Dec 02 18:35:54 crc kubenswrapper[4878]: I1202 18:35:54.492109 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" event={"ID":"13c6db70-cc96-462a-a27c-496e1041fcbb","Type":"ContainerStarted","Data":"c620c3ebefe5d7e8da71bc4ae65301e1fc0da4d04116a446481e9b15da75302d"} Dec 02 18:35:54 crc kubenswrapper[4878]: I1202 18:35:54.557513 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-j459b"] Dec 02 18:35:54 crc kubenswrapper[4878]: I1202 18:35:54.563331 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-j459b"] Dec 02 18:35:54 crc kubenswrapper[4878]: I1202 18:35:54.955496 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aab598f-4e89-4790-a425-ab7983be07c4" path="/var/lib/kubelet/pods/3aab598f-4e89-4790-a425-ab7983be07c4/volumes" Dec 02 18:35:54 crc kubenswrapper[4878]: I1202 18:35:54.956351 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63448588-8af2-4397-981d-97ba5bf4170d" path="/var/lib/kubelet/pods/63448588-8af2-4397-981d-97ba5bf4170d/volumes" Dec 02 18:35:55 crc kubenswrapper[4878]: I1202 18:35:55.506072 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"f262fb1e8290073f98cb5506d7d41d0ed3eb91d64ad36acc8496dd8b9fa35544"} Dec 02 18:35:56 crc kubenswrapper[4878]: I1202 18:35:56.714757 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:56 crc kubenswrapper[4878]: I1202 18:35:56.715554 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-57568cf57-qq2hl" 
Dec 02 18:35:56 crc kubenswrapper[4878]: I1202 18:35:56.724309 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:57 crc kubenswrapper[4878]: I1202 18:35:57.538602 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-57568cf57-qq2hl" Dec 02 18:35:57 crc kubenswrapper[4878]: I1202 18:35:57.623872 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59b7b6c866-l2c75"] Dec 02 18:36:00 crc kubenswrapper[4878]: I1202 18:36:00.570328 4878 generic.go:334] "Generic (PLEG): container finished" podID="c436c198-1049-416f-9ab7-33261ff55ab4" containerID="cefa40ebdf79dc053f7a2c0a46632692d9055cd3c27efa2e826e40bb4e3f6e14" exitCode=0 Dec 02 18:36:00 crc kubenswrapper[4878]: I1202 18:36:00.570407 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c436c198-1049-416f-9ab7-33261ff55ab4","Type":"ContainerDied","Data":"cefa40ebdf79dc053f7a2c0a46632692d9055cd3c27efa2e826e40bb4e3f6e14"} Dec 02 18:36:00 crc kubenswrapper[4878]: I1202 18:36:00.575817 4878 generic.go:334] "Generic (PLEG): container finished" podID="9cae31f5-acb4-423b-8a14-4136afb73062" containerID="3343d039a4f89cf2d5e147232e79fb0554d0b6800bd1406acc3d4ff5ca10c755" exitCode=0 Dec 02 18:36:00 crc kubenswrapper[4878]: I1202 18:36:00.575902 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9cae31f5-acb4-423b-8a14-4136afb73062","Type":"ContainerDied","Data":"3343d039a4f89cf2d5e147232e79fb0554d0b6800bd1406acc3d4ff5ca10c755"} Dec 02 18:36:04 crc kubenswrapper[4878]: I1202 18:36:04.623021 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dsnc6" event={"ID":"f1f92026-f0b1-470f-885e-914fece7f4e3","Type":"ContainerStarted","Data":"42c1fb367ea229a203c6d3c1f1641a2818eb61dcc1e39ccaac6456930edd15d3"} Dec 02 18:36:04 crc 
kubenswrapper[4878]: I1202 18:36:04.628460 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c436c198-1049-416f-9ab7-33261ff55ab4","Type":"ContainerStarted","Data":"aff295ffa945e70ed5f0c267a265d36d5a49d1370f66cd970f9f7ad97be3b1fc"} Dec 02 18:36:04 crc kubenswrapper[4878]: I1202 18:36:04.628497 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zhq9q" event={"ID":"528e5b70-2773-48c2-8382-d4e2ec45933d","Type":"ContainerStarted","Data":"6c59c138cf89aa3fbffef6d6a6f69787daeb9427caad5326ff28d68dcab2abda"} Dec 02 18:36:04 crc kubenswrapper[4878]: I1202 18:36:04.630846 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9cae31f5-acb4-423b-8a14-4136afb73062","Type":"ContainerStarted","Data":"4dcd8c8cb17d03f7a0638976ceb009a2e63d60db3668508a349b9d9e7f4607aa"} Dec 02 18:36:04 crc kubenswrapper[4878]: I1202 18:36:04.632937 4878 generic.go:334] "Generic (PLEG): container finished" podID="216d315b-5b55-494e-9ead-19d081f50952" containerID="c253c7375bfcebb397eda931c85a65da5ce304fff4bce6c1ba87fd7307896887" exitCode=0 Dec 02 18:36:04 crc kubenswrapper[4878]: I1202 18:36:04.632977 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" event={"ID":"216d315b-5b55-494e-9ead-19d081f50952","Type":"ContainerDied","Data":"c253c7375bfcebb397eda931c85a65da5ce304fff4bce6c1ba87fd7307896887"} Dec 02 18:36:04 crc kubenswrapper[4878]: I1202 18:36:04.691362 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-zhq9q" podStartSLOduration=27.184462528 podStartE2EDuration="39.691340082s" podCreationTimestamp="2025-12-02 18:35:25 +0000 UTC" firstStartedPulling="2025-12-02 18:35:50.278089033 +0000 UTC m=+1259.967707914" lastFinishedPulling="2025-12-02 18:36:02.784966587 +0000 UTC m=+1272.474585468" 
observedRunningTime="2025-12-02 18:36:04.659162853 +0000 UTC m=+1274.348781734" watchObservedRunningTime="2025-12-02 18:36:04.691340082 +0000 UTC m=+1274.380958963" Dec 02 18:36:04 crc kubenswrapper[4878]: I1202 18:36:04.738490 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=18.511998004 podStartE2EDuration="44.738466783s" podCreationTimestamp="2025-12-02 18:35:20 +0000 UTC" firstStartedPulling="2025-12-02 18:35:22.212072442 +0000 UTC m=+1231.901691323" lastFinishedPulling="2025-12-02 18:35:48.438541221 +0000 UTC m=+1258.128160102" observedRunningTime="2025-12-02 18:36:04.737681689 +0000 UTC m=+1274.427300580" watchObservedRunningTime="2025-12-02 18:36:04.738466783 +0000 UTC m=+1274.428085664" Dec 02 18:36:04 crc kubenswrapper[4878]: I1202 18:36:04.750982 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=18.59953706 podStartE2EDuration="43.75095655s" podCreationTimestamp="2025-12-02 18:35:21 +0000 UTC" firstStartedPulling="2025-12-02 18:35:23.515970929 +0000 UTC m=+1233.205589810" lastFinishedPulling="2025-12-02 18:35:48.667390419 +0000 UTC m=+1258.357009300" observedRunningTime="2025-12-02 18:36:04.716636936 +0000 UTC m=+1274.406255827" watchObservedRunningTime="2025-12-02 18:36:04.75095655 +0000 UTC m=+1274.440575431" Dec 02 18:36:10 crc kubenswrapper[4878]: I1202 18:36:10.696201 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qgmlw" event={"ID":"1f049ebe-547b-40a2-8468-932cfc5051ea","Type":"ContainerStarted","Data":"afc463b62c68f4ac582b023018ac88683e06b8d85a819dbab0961ac2e965c386"} Dec 02 18:36:11 crc kubenswrapper[4878]: I1202 18:36:11.498430 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 02 18:36:11 crc kubenswrapper[4878]: I1202 18:36:11.498490 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/openstack-galera-0" Dec 02 18:36:11 crc kubenswrapper[4878]: E1202 18:36:11.623824 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 02 18:36:11 crc kubenswrapper[4878]: E1202 18:36:11.623917 4878 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 02 18:36:11 crc kubenswrapper[4878]: E1202 18:36:11.624156 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5hlzc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(3bc2339a-313d-485a-b67f-d18b597c36e5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 18:36:11 crc kubenswrapper[4878]: E1202 18:36:11.625379 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="3bc2339a-313d-485a-b67f-d18b597c36e5" Dec 02 18:36:11 crc kubenswrapper[4878]: I1202 18:36:11.713459 4878 generic.go:334] "Generic (PLEG): container finished" podID="f1f92026-f0b1-470f-885e-914fece7f4e3" containerID="42c1fb367ea229a203c6d3c1f1641a2818eb61dcc1e39ccaac6456930edd15d3" exitCode=0 Dec 02 18:36:11 crc kubenswrapper[4878]: I1202 18:36:11.713540 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dsnc6" 
event={"ID":"f1f92026-f0b1-470f-885e-914fece7f4e3","Type":"ContainerDied","Data":"42c1fb367ea229a203c6d3c1f1641a2818eb61dcc1e39ccaac6456930edd15d3"} Dec 02 18:36:11 crc kubenswrapper[4878]: I1202 18:36:11.717564 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80ade876-344b-415c-9609-6477205860c9","Type":"ContainerStarted","Data":"74ead525d39898e37e26b20a9217ccfdad6d91900f12790e458abb17c1b52507"} Dec 02 18:36:11 crc kubenswrapper[4878]: I1202 18:36:11.717744 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-qgmlw" Dec 02 18:36:11 crc kubenswrapper[4878]: E1202 18:36:11.728934 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="3bc2339a-313d-485a-b67f-d18b597c36e5" Dec 02 18:36:11 crc kubenswrapper[4878]: I1202 18:36:11.778324 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-qgmlw" podStartSLOduration=30.12622873 podStartE2EDuration="43.778302657s" podCreationTimestamp="2025-12-02 18:35:28 +0000 UTC" firstStartedPulling="2025-12-02 18:35:49.809655802 +0000 UTC m=+1259.499274683" lastFinishedPulling="2025-12-02 18:36:03.461729729 +0000 UTC m=+1273.151348610" observedRunningTime="2025-12-02 18:36:11.767745899 +0000 UTC m=+1281.457364780" watchObservedRunningTime="2025-12-02 18:36:11.778302657 +0000 UTC m=+1281.467921538" Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.589591 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.590267 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 02 
18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.729009 4878 generic.go:334] "Generic (PLEG): container finished" podID="13c6db70-cc96-462a-a27c-496e1041fcbb" containerID="e402933f38a0cc3761a66c5fa54c718c2d30cc83ab77a8637a5ac0337fa6bb44" exitCode=0 Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.729081 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" event={"ID":"13c6db70-cc96-462a-a27c-496e1041fcbb","Type":"ContainerDied","Data":"e402933f38a0cc3761a66c5fa54c718c2d30cc83ab77a8637a5ac0337fa6bb44"} Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.734009 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dsnc6" event={"ID":"f1f92026-f0b1-470f-885e-914fece7f4e3","Type":"ContainerStarted","Data":"19a6b14405616e5b15ca7ef5074a8f0218c2521c8eec04b49ad2314e50e2528c"} Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.734069 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dsnc6" event={"ID":"f1f92026-f0b1-470f-885e-914fece7f4e3","Type":"ContainerStarted","Data":"5d7db24677743cda4f8e5ef3278d0a736eae976d63c8bde83e909e98fcec52ab"} Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.734615 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.734714 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.739741 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4","Type":"ContainerStarted","Data":"42b368e44a60554e454ec816be5901cc4576b7f54aee4865b7e98d63ec6dbc14"} Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.739812 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4","Type":"ContainerStarted","Data":"1827ef30638a4d2734d85c94759ba32e2913b7cfbece641dfbbea71673e4ba46"} Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.741746 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"427bac6f-5bf8-4f40-a0f6-fea0cede315f","Type":"ContainerStarted","Data":"c759920302775af90faa0b160da72eb5410c3730aaf00a1c26ffcb058d5a9d8d"} Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.742278 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.744001 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a6aad750-71cc-4815-906a-5f2a130875e8","Type":"ContainerStarted","Data":"c6257e403dfcdaf9afe0a58b43e729de4a5a54c8c8ea5f8e688b3ea9ab3d5c76"} Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.744041 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a6aad750-71cc-4815-906a-5f2a130875e8","Type":"ContainerStarted","Data":"cd63b5e1a04e0abc7db8ee03b2588090416cdbcb4abc1954d9abcc313f050e12"} Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.747767 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" event={"ID":"216d315b-5b55-494e-9ead-19d081f50952","Type":"ContainerStarted","Data":"2a3e709454188dee38933e52cb38990b9c37a09fa5029a3632459bf418bedb86"} Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.748445 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.751053 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-2pqpm" 
event={"ID":"f833875c-c0f5-4654-b592-14d4a6161df6","Type":"ContainerStarted","Data":"46bb68db0f32e947848e50101f866d0f91b2965a2b4845e08282de5cf8c77236"} Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.793763 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-dsnc6" podStartSLOduration=33.816846723 podStartE2EDuration="44.793728606s" podCreationTimestamp="2025-12-02 18:35:28 +0000 UTC" firstStartedPulling="2025-12-02 18:35:51.223929093 +0000 UTC m=+1260.913547974" lastFinishedPulling="2025-12-02 18:36:02.200810956 +0000 UTC m=+1271.890429857" observedRunningTime="2025-12-02 18:36:12.786765359 +0000 UTC m=+1282.476384240" watchObservedRunningTime="2025-12-02 18:36:12.793728606 +0000 UTC m=+1282.483347487" Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.798418 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.831921 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=30.866249034 podStartE2EDuration="44.831901659s" podCreationTimestamp="2025-12-02 18:35:28 +0000 UTC" firstStartedPulling="2025-12-02 18:35:48.44111158 +0000 UTC m=+1258.130730471" lastFinishedPulling="2025-12-02 18:36:02.406764215 +0000 UTC m=+1272.096383096" observedRunningTime="2025-12-02 18:36:12.807587345 +0000 UTC m=+1282.497206226" watchObservedRunningTime="2025-12-02 18:36:12.831901659 +0000 UTC m=+1282.521520540" Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.840580 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" podStartSLOduration=13.817522653 podStartE2EDuration="20.840558478s" podCreationTimestamp="2025-12-02 18:35:52 +0000 UTC" firstStartedPulling="2025-12-02 18:35:54.066708186 +0000 UTC m=+1263.756327067" lastFinishedPulling="2025-12-02 18:36:01.089744011 +0000 UTC 
m=+1270.779362892" observedRunningTime="2025-12-02 18:36:12.836858103 +0000 UTC m=+1282.526476984" watchObservedRunningTime="2025-12-02 18:36:12.840558478 +0000 UTC m=+1282.530177359" Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.855591 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=11.11565592 podStartE2EDuration="50.855565484s" podCreationTimestamp="2025-12-02 18:35:22 +0000 UTC" firstStartedPulling="2025-12-02 18:35:24.267034117 +0000 UTC m=+1233.956652998" lastFinishedPulling="2025-12-02 18:36:04.006943681 +0000 UTC m=+1273.696562562" observedRunningTime="2025-12-02 18:36:12.852957872 +0000 UTC m=+1282.542576753" watchObservedRunningTime="2025-12-02 18:36:12.855565484 +0000 UTC m=+1282.545184365" Dec 02 18:36:12 crc kubenswrapper[4878]: I1202 18:36:12.887560 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-2pqpm" podStartSLOduration=10.808984859 podStartE2EDuration="20.887535645s" podCreationTimestamp="2025-12-02 18:35:52 +0000 UTC" firstStartedPulling="2025-12-02 18:35:53.928411766 +0000 UTC m=+1263.618030647" lastFinishedPulling="2025-12-02 18:36:04.006962552 +0000 UTC m=+1273.696581433" observedRunningTime="2025-12-02 18:36:12.875825242 +0000 UTC m=+1282.565444123" watchObservedRunningTime="2025-12-02 18:36:12.887535645 +0000 UTC m=+1282.577154526" Dec 02 18:36:13 crc kubenswrapper[4878]: I1202 18:36:13.768957 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" event={"ID":"13c6db70-cc96-462a-a27c-496e1041fcbb","Type":"ContainerStarted","Data":"46cd5b370e44d655f74e2cd3dcfa59274b3b98e5704238221da3bf0bcb342787"} Dec 02 18:36:13 crc kubenswrapper[4878]: I1202 18:36:13.801209 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" podStartSLOduration=12.658338246 podStartE2EDuration="21.801181746s" 
podCreationTimestamp="2025-12-02 18:35:52 +0000 UTC" firstStartedPulling="2025-12-02 18:35:53.934780104 +0000 UTC m=+1263.624398985" lastFinishedPulling="2025-12-02 18:36:03.077623604 +0000 UTC m=+1272.767242485" observedRunningTime="2025-12-02 18:36:13.794973864 +0000 UTC m=+1283.484592745" watchObservedRunningTime="2025-12-02 18:36:13.801181746 +0000 UTC m=+1283.490800627" Dec 02 18:36:13 crc kubenswrapper[4878]: I1202 18:36:13.801812 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=30.633739974 podStartE2EDuration="42.801804176s" podCreationTimestamp="2025-12-02 18:35:31 +0000 UTC" firstStartedPulling="2025-12-02 18:35:50.909757378 +0000 UTC m=+1260.599376269" lastFinishedPulling="2025-12-02 18:36:03.07782159 +0000 UTC m=+1272.767440471" observedRunningTime="2025-12-02 18:36:12.917114873 +0000 UTC m=+1282.606733754" watchObservedRunningTime="2025-12-02 18:36:13.801804176 +0000 UTC m=+1283.491423057" Dec 02 18:36:14 crc kubenswrapper[4878]: I1202 18:36:14.781980 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" Dec 02 18:36:14 crc kubenswrapper[4878]: I1202 18:36:14.796512 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 02 18:36:15 crc kubenswrapper[4878]: I1202 18:36:15.090509 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 02 18:36:15 crc kubenswrapper[4878]: I1202 18:36:15.090610 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 02 18:36:15 crc kubenswrapper[4878]: I1202 18:36:15.156297 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 02 18:36:15 crc kubenswrapper[4878]: I1202 18:36:15.649404 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/openstack-galera-0" Dec 02 18:36:15 crc kubenswrapper[4878]: I1202 18:36:15.737530 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 02 18:36:16 crc kubenswrapper[4878]: I1202 18:36:16.694682 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 02 18:36:16 crc kubenswrapper[4878]: I1202 18:36:16.880281 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 02 18:36:17 crc kubenswrapper[4878]: I1202 18:36:17.841805 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 02 18:36:17 crc kubenswrapper[4878]: I1202 18:36:17.922385 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 02 18:36:18 crc kubenswrapper[4878]: I1202 18:36:18.214869 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" Dec 02 18:36:18 crc kubenswrapper[4878]: I1202 18:36:18.402943 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" Dec 02 18:36:18 crc kubenswrapper[4878]: I1202 18:36:18.469857 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-96mh2"] Dec 02 18:36:18 crc kubenswrapper[4878]: I1202 18:36:18.536423 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 02 18:36:18 crc kubenswrapper[4878]: I1202 18:36:18.840364 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-xzscw"] Dec 02 18:36:18 crc kubenswrapper[4878]: I1202 18:36:18.842309 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-xzscw" Dec 02 18:36:18 crc kubenswrapper[4878]: I1202 18:36:18.860445 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-95c8-account-create-update-szl82"] Dec 02 18:36:18 crc kubenswrapper[4878]: I1202 18:36:18.860754 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" podUID="216d315b-5b55-494e-9ead-19d081f50952" containerName="dnsmasq-dns" containerID="cri-o://2a3e709454188dee38933e52cb38990b9c37a09fa5029a3632459bf418bedb86" gracePeriod=10 Dec 02 18:36:18 crc kubenswrapper[4878]: I1202 18:36:18.862996 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-95c8-account-create-update-szl82" Dec 02 18:36:18 crc kubenswrapper[4878]: I1202 18:36:18.865062 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 02 18:36:18 crc kubenswrapper[4878]: I1202 18:36:18.876722 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-xzscw"] Dec 02 18:36:18 crc kubenswrapper[4878]: I1202 18:36:18.898460 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-95c8-account-create-update-szl82"] Dec 02 18:36:18 crc kubenswrapper[4878]: I1202 18:36:18.981406 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4f4r\" (UniqueName: \"kubernetes.io/projected/41728d08-e232-4713-ba93-162894335f4c-kube-api-access-c4f4r\") pod \"glance-95c8-account-create-update-szl82\" (UID: \"41728d08-e232-4713-ba93-162894335f4c\") " pod="openstack/glance-95c8-account-create-update-szl82" Dec 02 18:36:18 crc kubenswrapper[4878]: I1202 18:36:18.981512 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqsws\" (UniqueName: 
\"kubernetes.io/projected/73844c1f-2d85-4d85-889f-90392030c02d-kube-api-access-bqsws\") pod \"glance-db-create-xzscw\" (UID: \"73844c1f-2d85-4d85-889f-90392030c02d\") " pod="openstack/glance-db-create-xzscw" Dec 02 18:36:18 crc kubenswrapper[4878]: I1202 18:36:18.981573 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73844c1f-2d85-4d85-889f-90392030c02d-operator-scripts\") pod \"glance-db-create-xzscw\" (UID: \"73844c1f-2d85-4d85-889f-90392030c02d\") " pod="openstack/glance-db-create-xzscw" Dec 02 18:36:18 crc kubenswrapper[4878]: I1202 18:36:18.981594 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41728d08-e232-4713-ba93-162894335f4c-operator-scripts\") pod \"glance-95c8-account-create-update-szl82\" (UID: \"41728d08-e232-4713-ba93-162894335f4c\") " pod="openstack/glance-95c8-account-create-update-szl82" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.085081 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4f4r\" (UniqueName: \"kubernetes.io/projected/41728d08-e232-4713-ba93-162894335f4c-kube-api-access-c4f4r\") pod \"glance-95c8-account-create-update-szl82\" (UID: \"41728d08-e232-4713-ba93-162894335f4c\") " pod="openstack/glance-95c8-account-create-update-szl82" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.085587 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqsws\" (UniqueName: \"kubernetes.io/projected/73844c1f-2d85-4d85-889f-90392030c02d-kube-api-access-bqsws\") pod \"glance-db-create-xzscw\" (UID: \"73844c1f-2d85-4d85-889f-90392030c02d\") " pod="openstack/glance-db-create-xzscw" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.085670 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73844c1f-2d85-4d85-889f-90392030c02d-operator-scripts\") pod \"glance-db-create-xzscw\" (UID: \"73844c1f-2d85-4d85-889f-90392030c02d\") " pod="openstack/glance-db-create-xzscw" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.085695 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41728d08-e232-4713-ba93-162894335f4c-operator-scripts\") pod \"glance-95c8-account-create-update-szl82\" (UID: \"41728d08-e232-4713-ba93-162894335f4c\") " pod="openstack/glance-95c8-account-create-update-szl82" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.086901 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41728d08-e232-4713-ba93-162894335f4c-operator-scripts\") pod \"glance-95c8-account-create-update-szl82\" (UID: \"41728d08-e232-4713-ba93-162894335f4c\") " pod="openstack/glance-95c8-account-create-update-szl82" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.090938 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73844c1f-2d85-4d85-889f-90392030c02d-operator-scripts\") pod \"glance-db-create-xzscw\" (UID: \"73844c1f-2d85-4d85-889f-90392030c02d\") " pod="openstack/glance-db-create-xzscw" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.112583 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqsws\" (UniqueName: \"kubernetes.io/projected/73844c1f-2d85-4d85-889f-90392030c02d-kube-api-access-bqsws\") pod \"glance-db-create-xzscw\" (UID: \"73844c1f-2d85-4d85-889f-90392030c02d\") " pod="openstack/glance-db-create-xzscw" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.116809 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4f4r\" (UniqueName: 
\"kubernetes.io/projected/41728d08-e232-4713-ba93-162894335f4c-kube-api-access-c4f4r\") pod \"glance-95c8-account-create-update-szl82\" (UID: \"41728d08-e232-4713-ba93-162894335f4c\") " pod="openstack/glance-95c8-account-create-update-szl82" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.214400 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-xzscw" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.327090 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-95c8-account-create-update-szl82" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.528660 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.601041 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbw7r\" (UniqueName: \"kubernetes.io/projected/216d315b-5b55-494e-9ead-19d081f50952-kube-api-access-kbw7r\") pod \"216d315b-5b55-494e-9ead-19d081f50952\" (UID: \"216d315b-5b55-494e-9ead-19d081f50952\") " Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.601100 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/216d315b-5b55-494e-9ead-19d081f50952-ovsdbserver-nb\") pod \"216d315b-5b55-494e-9ead-19d081f50952\" (UID: \"216d315b-5b55-494e-9ead-19d081f50952\") " Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.601194 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/216d315b-5b55-494e-9ead-19d081f50952-dns-svc\") pod \"216d315b-5b55-494e-9ead-19d081f50952\" (UID: \"216d315b-5b55-494e-9ead-19d081f50952\") " Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.601249 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/216d315b-5b55-494e-9ead-19d081f50952-config\") pod \"216d315b-5b55-494e-9ead-19d081f50952\" (UID: \"216d315b-5b55-494e-9ead-19d081f50952\") " Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.610696 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/216d315b-5b55-494e-9ead-19d081f50952-kube-api-access-kbw7r" (OuterVolumeSpecName: "kube-api-access-kbw7r") pod "216d315b-5b55-494e-9ead-19d081f50952" (UID: "216d315b-5b55-494e-9ead-19d081f50952"). InnerVolumeSpecName "kube-api-access-kbw7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.704505 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/216d315b-5b55-494e-9ead-19d081f50952-config" (OuterVolumeSpecName: "config") pod "216d315b-5b55-494e-9ead-19d081f50952" (UID: "216d315b-5b55-494e-9ead-19d081f50952"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.709084 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbw7r\" (UniqueName: \"kubernetes.io/projected/216d315b-5b55-494e-9ead-19d081f50952-kube-api-access-kbw7r\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.709136 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/216d315b-5b55-494e-9ead-19d081f50952-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.712133 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/216d315b-5b55-494e-9ead-19d081f50952-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "216d315b-5b55-494e-9ead-19d081f50952" (UID: "216d315b-5b55-494e-9ead-19d081f50952"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.737426 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/216d315b-5b55-494e-9ead-19d081f50952-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "216d315b-5b55-494e-9ead-19d081f50952" (UID: "216d315b-5b55-494e-9ead-19d081f50952"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.811054 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/216d315b-5b55-494e-9ead-19d081f50952-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.811485 4878 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/216d315b-5b55-494e-9ead-19d081f50952-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.878186 4878 generic.go:334] "Generic (PLEG): container finished" podID="216d315b-5b55-494e-9ead-19d081f50952" containerID="2a3e709454188dee38933e52cb38990b9c37a09fa5029a3632459bf418bedb86" exitCode=0 Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.878317 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" event={"ID":"216d315b-5b55-494e-9ead-19d081f50952","Type":"ContainerDied","Data":"2a3e709454188dee38933e52cb38990b9c37a09fa5029a3632459bf418bedb86"} Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.878349 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" event={"ID":"216d315b-5b55-494e-9ead-19d081f50952","Type":"ContainerDied","Data":"486eef724fae0d574448ea643f63ae740ffbe218fcfbbf527e784eeded8cb595"} Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.878369 4878 scope.go:117] "RemoveContainer" 
containerID="2a3e709454188dee38933e52cb38990b9c37a09fa5029a3632459bf418bedb86" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.878524 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-96mh2" Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.880901 4878 generic.go:334] "Generic (PLEG): container finished" podID="80ade876-344b-415c-9609-6477205860c9" containerID="74ead525d39898e37e26b20a9217ccfdad6d91900f12790e458abb17c1b52507" exitCode=0 Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.880935 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80ade876-344b-415c-9609-6477205860c9","Type":"ContainerDied","Data":"74ead525d39898e37e26b20a9217ccfdad6d91900f12790e458abb17c1b52507"} Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.949366 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-96mh2"] Dec 02 18:36:19 crc kubenswrapper[4878]: I1202 18:36:19.960894 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-96mh2"] Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.117918 4878 scope.go:117] "RemoveContainer" containerID="c253c7375bfcebb397eda931c85a65da5ce304fff4bce6c1ba87fd7307896887" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.146902 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-xzscw"] Dec 02 18:36:20 crc kubenswrapper[4878]: W1202 18:36:20.152593 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73844c1f_2d85_4d85_889f_90392030c02d.slice/crio-284aaf711f3977e0b35a1187cdabffbcd420466c8585c468264f32d7570078b9 WatchSource:0}: Error finding container 284aaf711f3977e0b35a1187cdabffbcd420466c8585c468264f32d7570078b9: Status 404 returned error can't find the container with id 
284aaf711f3977e0b35a1187cdabffbcd420466c8585c468264f32d7570078b9 Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.162568 4878 scope.go:117] "RemoveContainer" containerID="2a3e709454188dee38933e52cb38990b9c37a09fa5029a3632459bf418bedb86" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.162896 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 02 18:36:20 crc kubenswrapper[4878]: E1202 18:36:20.167289 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a3e709454188dee38933e52cb38990b9c37a09fa5029a3632459bf418bedb86\": container with ID starting with 2a3e709454188dee38933e52cb38990b9c37a09fa5029a3632459bf418bedb86 not found: ID does not exist" containerID="2a3e709454188dee38933e52cb38990b9c37a09fa5029a3632459bf418bedb86" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.167327 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a3e709454188dee38933e52cb38990b9c37a09fa5029a3632459bf418bedb86"} err="failed to get container status \"2a3e709454188dee38933e52cb38990b9c37a09fa5029a3632459bf418bedb86\": rpc error: code = NotFound desc = could not find container \"2a3e709454188dee38933e52cb38990b9c37a09fa5029a3632459bf418bedb86\": container with ID starting with 2a3e709454188dee38933e52cb38990b9c37a09fa5029a3632459bf418bedb86 not found: ID does not exist" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.167354 4878 scope.go:117] "RemoveContainer" containerID="c253c7375bfcebb397eda931c85a65da5ce304fff4bce6c1ba87fd7307896887" Dec 02 18:36:20 crc kubenswrapper[4878]: E1202 18:36:20.170523 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c253c7375bfcebb397eda931c85a65da5ce304fff4bce6c1ba87fd7307896887\": container with ID starting with 
c253c7375bfcebb397eda931c85a65da5ce304fff4bce6c1ba87fd7307896887 not found: ID does not exist" containerID="c253c7375bfcebb397eda931c85a65da5ce304fff4bce6c1ba87fd7307896887" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.170545 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c253c7375bfcebb397eda931c85a65da5ce304fff4bce6c1ba87fd7307896887"} err="failed to get container status \"c253c7375bfcebb397eda931c85a65da5ce304fff4bce6c1ba87fd7307896887\": rpc error: code = NotFound desc = could not find container \"c253c7375bfcebb397eda931c85a65da5ce304fff4bce6c1ba87fd7307896887\": container with ID starting with c253c7375bfcebb397eda931c85a65da5ce304fff4bce6c1ba87fd7307896887 not found: ID does not exist" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.375700 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 02 18:36:20 crc kubenswrapper[4878]: E1202 18:36:20.376687 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="216d315b-5b55-494e-9ead-19d081f50952" containerName="dnsmasq-dns" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.376718 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="216d315b-5b55-494e-9ead-19d081f50952" containerName="dnsmasq-dns" Dec 02 18:36:20 crc kubenswrapper[4878]: E1202 18:36:20.376747 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="216d315b-5b55-494e-9ead-19d081f50952" containerName="init" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.376756 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="216d315b-5b55-494e-9ead-19d081f50952" containerName="init" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.377028 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="216d315b-5b55-494e-9ead-19d081f50952" containerName="dnsmasq-dns" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.378585 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.385524 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.385650 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.385826 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-rsjm5" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.386132 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.390632 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.460534 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-95c8-account-create-update-szl82"] Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.534932 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/af53983f-772c-431c-95a1-af6b3d3c0edf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.535147 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af53983f-772c-431c-95a1-af6b3d3c0edf-scripts\") pod \"ovn-northd-0\" (UID: \"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.535319 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/af53983f-772c-431c-95a1-af6b3d3c0edf-config\") pod \"ovn-northd-0\" (UID: \"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.535456 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/af53983f-772c-431c-95a1-af6b3d3c0edf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.535547 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af53983f-772c-431c-95a1-af6b3d3c0edf-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.535627 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/af53983f-772c-431c-95a1-af6b3d3c0edf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.535741 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82qww\" (UniqueName: \"kubernetes.io/projected/af53983f-772c-431c-95a1-af6b3d3c0edf-kube-api-access-82qww\") pod \"ovn-northd-0\" (UID: \"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.638023 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82qww\" (UniqueName: \"kubernetes.io/projected/af53983f-772c-431c-95a1-af6b3d3c0edf-kube-api-access-82qww\") pod \"ovn-northd-0\" (UID: 
\"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.638167 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/af53983f-772c-431c-95a1-af6b3d3c0edf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.638217 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af53983f-772c-431c-95a1-af6b3d3c0edf-scripts\") pod \"ovn-northd-0\" (UID: \"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.638266 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af53983f-772c-431c-95a1-af6b3d3c0edf-config\") pod \"ovn-northd-0\" (UID: \"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.638306 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/af53983f-772c-431c-95a1-af6b3d3c0edf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.638332 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af53983f-772c-431c-95a1-af6b3d3c0edf-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.638359 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/af53983f-772c-431c-95a1-af6b3d3c0edf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.639025 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/af53983f-772c-431c-95a1-af6b3d3c0edf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.639516 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af53983f-772c-431c-95a1-af6b3d3c0edf-config\") pod \"ovn-northd-0\" (UID: \"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.640459 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af53983f-772c-431c-95a1-af6b3d3c0edf-scripts\") pod \"ovn-northd-0\" (UID: \"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.645674 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/af53983f-772c-431c-95a1-af6b3d3c0edf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.645975 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af53983f-772c-431c-95a1-af6b3d3c0edf-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.652765 
4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82qww\" (UniqueName: \"kubernetes.io/projected/af53983f-772c-431c-95a1-af6b3d3c0edf-kube-api-access-82qww\") pod \"ovn-northd-0\" (UID: \"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.653844 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/af53983f-772c-431c-95a1-af6b3d3c0edf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"af53983f-772c-431c-95a1-af6b3d3c0edf\") " pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.708239 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.897768 4878 generic.go:334] "Generic (PLEG): container finished" podID="73844c1f-2d85-4d85-889f-90392030c02d" containerID="121a8d3c775386afc0556a9010b5d70a8e787bf9cf368395e979c3b2e5265e02" exitCode=0 Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.897860 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-xzscw" event={"ID":"73844c1f-2d85-4d85-889f-90392030c02d","Type":"ContainerDied","Data":"121a8d3c775386afc0556a9010b5d70a8e787bf9cf368395e979c3b2e5265e02"} Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.897886 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-xzscw" event={"ID":"73844c1f-2d85-4d85-889f-90392030c02d","Type":"ContainerStarted","Data":"284aaf711f3977e0b35a1187cdabffbcd420466c8585c468264f32d7570078b9"} Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.899711 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-95c8-account-create-update-szl82" 
event={"ID":"41728d08-e232-4713-ba93-162894335f4c","Type":"ContainerStarted","Data":"3a652324c75508125ebaf6e7c282eba862f0782adf1b28f4adfaa3d2c2bf15bc"} Dec 02 18:36:20 crc kubenswrapper[4878]: I1202 18:36:20.899731 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-95c8-account-create-update-szl82" event={"ID":"41728d08-e232-4713-ba93-162894335f4c","Type":"ContainerStarted","Data":"3fb3e3a023b0a1f7bb207614cee7a4c9ed5193deb7d9c4c209419ad1474dcf73"} Dec 02 18:36:21 crc kubenswrapper[4878]: I1202 18:36:20.953787 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-95c8-account-create-update-szl82" podStartSLOduration=2.9537639970000003 podStartE2EDuration="2.953763997s" podCreationTimestamp="2025-12-02 18:36:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:36:20.938197674 +0000 UTC m=+1290.627816585" watchObservedRunningTime="2025-12-02 18:36:20.953763997 +0000 UTC m=+1290.643382888" Dec 02 18:36:21 crc kubenswrapper[4878]: I1202 18:36:20.983472 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="216d315b-5b55-494e-9ead-19d081f50952" path="/var/lib/kubelet/pods/216d315b-5b55-494e-9ead-19d081f50952/volumes" Dec 02 18:36:21 crc kubenswrapper[4878]: I1202 18:36:21.295522 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 02 18:36:21 crc kubenswrapper[4878]: I1202 18:36:21.920045 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"af53983f-772c-431c-95a1-af6b3d3c0edf","Type":"ContainerStarted","Data":"95de7bc0fdd32e6616e4c48c557184aecd90b59aa7efefe89482f20835c023b3"} Dec 02 18:36:21 crc kubenswrapper[4878]: I1202 18:36:21.923136 4878 generic.go:334] "Generic (PLEG): container finished" podID="41728d08-e232-4713-ba93-162894335f4c" 
containerID="3a652324c75508125ebaf6e7c282eba862f0782adf1b28f4adfaa3d2c2bf15bc" exitCode=0 Dec 02 18:36:21 crc kubenswrapper[4878]: I1202 18:36:21.923219 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-95c8-account-create-update-szl82" event={"ID":"41728d08-e232-4713-ba93-162894335f4c","Type":"ContainerDied","Data":"3a652324c75508125ebaf6e7c282eba862f0782adf1b28f4adfaa3d2c2bf15bc"} Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.662963 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-qswsn"] Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.665904 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qswsn" Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.692332 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-qswsn"] Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.695447 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-59b7b6c866-l2c75" podUID="090140ec-1e0c-43b4-b71e-bbe2f9d45281" containerName="console" containerID="cri-o://4d3a8810192f466bf2c48eb8e750538f1e1c5a7a09a64efdea03ed930276d53d" gracePeriod=15 Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.799782 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3ed7121-1f6d-4f72-81a7-a25c9f98304b-operator-scripts\") pod \"keystone-db-create-qswsn\" (UID: \"f3ed7121-1f6d-4f72-81a7-a25c9f98304b\") " pod="openstack/keystone-db-create-qswsn" Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.799941 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv8vh\" (UniqueName: \"kubernetes.io/projected/f3ed7121-1f6d-4f72-81a7-a25c9f98304b-kube-api-access-zv8vh\") pod \"keystone-db-create-qswsn\" 
(UID: \"f3ed7121-1f6d-4f72-81a7-a25c9f98304b\") " pod="openstack/keystone-db-create-qswsn" Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.822324 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-3140-account-create-update-z8jqc"] Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.823993 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3140-account-create-update-z8jqc"] Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.824103 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3140-account-create-update-z8jqc" Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.826408 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.903134 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv8vh\" (UniqueName: \"kubernetes.io/projected/f3ed7121-1f6d-4f72-81a7-a25c9f98304b-kube-api-access-zv8vh\") pod \"keystone-db-create-qswsn\" (UID: \"f3ed7121-1f6d-4f72-81a7-a25c9f98304b\") " pod="openstack/keystone-db-create-qswsn" Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.903297 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3ed7121-1f6d-4f72-81a7-a25c9f98304b-operator-scripts\") pod \"keystone-db-create-qswsn\" (UID: \"f3ed7121-1f6d-4f72-81a7-a25c9f98304b\") " pod="openstack/keystone-db-create-qswsn" Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.903745 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f7h4\" (UniqueName: \"kubernetes.io/projected/18318a2e-0ad8-49cc-bb0b-30a5fbc7604c-kube-api-access-7f7h4\") pod \"keystone-3140-account-create-update-z8jqc\" (UID: \"18318a2e-0ad8-49cc-bb0b-30a5fbc7604c\") " 
pod="openstack/keystone-3140-account-create-update-z8jqc" Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.903818 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18318a2e-0ad8-49cc-bb0b-30a5fbc7604c-operator-scripts\") pod \"keystone-3140-account-create-update-z8jqc\" (UID: \"18318a2e-0ad8-49cc-bb0b-30a5fbc7604c\") " pod="openstack/keystone-3140-account-create-update-z8jqc" Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.904578 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3ed7121-1f6d-4f72-81a7-a25c9f98304b-operator-scripts\") pod \"keystone-db-create-qswsn\" (UID: \"f3ed7121-1f6d-4f72-81a7-a25c9f98304b\") " pod="openstack/keystone-db-create-qswsn" Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.924312 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv8vh\" (UniqueName: \"kubernetes.io/projected/f3ed7121-1f6d-4f72-81a7-a25c9f98304b-kube-api-access-zv8vh\") pod \"keystone-db-create-qswsn\" (UID: \"f3ed7121-1f6d-4f72-81a7-a25c9f98304b\") " pod="openstack/keystone-db-create-qswsn" Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.935411 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59b7b6c866-l2c75_090140ec-1e0c-43b4-b71e-bbe2f9d45281/console/0.log" Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.935771 4878 generic.go:334] "Generic (PLEG): container finished" podID="090140ec-1e0c-43b4-b71e-bbe2f9d45281" containerID="4d3a8810192f466bf2c48eb8e750538f1e1c5a7a09a64efdea03ed930276d53d" exitCode=2 Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.935851 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59b7b6c866-l2c75" 
event={"ID":"090140ec-1e0c-43b4-b71e-bbe2f9d45281","Type":"ContainerDied","Data":"4d3a8810192f466bf2c48eb8e750538f1e1c5a7a09a64efdea03ed930276d53d"} Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.982299 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-xzscw" event={"ID":"73844c1f-2d85-4d85-889f-90392030c02d","Type":"ContainerDied","Data":"284aaf711f3977e0b35a1187cdabffbcd420466c8585c468264f32d7570078b9"} Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.982355 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="284aaf711f3977e0b35a1187cdabffbcd420466c8585c468264f32d7570078b9" Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.983192 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-xzscw" Dec 02 18:36:22 crc kubenswrapper[4878]: I1202 18:36:22.992363 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qswsn" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.031789 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f7h4\" (UniqueName: \"kubernetes.io/projected/18318a2e-0ad8-49cc-bb0b-30a5fbc7604c-kube-api-access-7f7h4\") pod \"keystone-3140-account-create-update-z8jqc\" (UID: \"18318a2e-0ad8-49cc-bb0b-30a5fbc7604c\") " pod="openstack/keystone-3140-account-create-update-z8jqc" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.032699 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18318a2e-0ad8-49cc-bb0b-30a5fbc7604c-operator-scripts\") pod \"keystone-3140-account-create-update-z8jqc\" (UID: \"18318a2e-0ad8-49cc-bb0b-30a5fbc7604c\") " pod="openstack/keystone-3140-account-create-update-z8jqc" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.033764 4878 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18318a2e-0ad8-49cc-bb0b-30a5fbc7604c-operator-scripts\") pod \"keystone-3140-account-create-update-z8jqc\" (UID: \"18318a2e-0ad8-49cc-bb0b-30a5fbc7604c\") " pod="openstack/keystone-3140-account-create-update-z8jqc" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.059306 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f7h4\" (UniqueName: \"kubernetes.io/projected/18318a2e-0ad8-49cc-bb0b-30a5fbc7604c-kube-api-access-7f7h4\") pod \"keystone-3140-account-create-update-z8jqc\" (UID: \"18318a2e-0ad8-49cc-bb0b-30a5fbc7604c\") " pod="openstack/keystone-3140-account-create-update-z8jqc" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.135424 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73844c1f-2d85-4d85-889f-90392030c02d-operator-scripts\") pod \"73844c1f-2d85-4d85-889f-90392030c02d\" (UID: \"73844c1f-2d85-4d85-889f-90392030c02d\") " Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.135688 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqsws\" (UniqueName: \"kubernetes.io/projected/73844c1f-2d85-4d85-889f-90392030c02d-kube-api-access-bqsws\") pod \"73844c1f-2d85-4d85-889f-90392030c02d\" (UID: \"73844c1f-2d85-4d85-889f-90392030c02d\") " Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.138739 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73844c1f-2d85-4d85-889f-90392030c02d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "73844c1f-2d85-4d85-889f-90392030c02d" (UID: "73844c1f-2d85-4d85-889f-90392030c02d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.151669 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73844c1f-2d85-4d85-889f-90392030c02d-kube-api-access-bqsws" (OuterVolumeSpecName: "kube-api-access-bqsws") pod "73844c1f-2d85-4d85-889f-90392030c02d" (UID: "73844c1f-2d85-4d85-889f-90392030c02d"). InnerVolumeSpecName "kube-api-access-bqsws". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.240857 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73844c1f-2d85-4d85-889f-90392030c02d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.240900 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqsws\" (UniqueName: \"kubernetes.io/projected/73844c1f-2d85-4d85-889f-90392030c02d-kube-api-access-bqsws\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.275029 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-dkzxj"] Dec 02 18:36:23 crc kubenswrapper[4878]: E1202 18:36:23.275697 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73844c1f-2d85-4d85-889f-90392030c02d" containerName="mariadb-database-create" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.275713 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="73844c1f-2d85-4d85-889f-90392030c02d" containerName="mariadb-database-create" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.275934 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="73844c1f-2d85-4d85-889f-90392030c02d" containerName="mariadb-database-create" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.276891 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-dkzxj" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.288972 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3140-account-create-update-z8jqc" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.317091 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-dkzxj"] Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.351713 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59b7b6c866-l2c75_090140ec-1e0c-43b4-b71e-bbe2f9d45281/console/0.log" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.351816 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.434843 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-9766-account-create-update-cs8kz"] Dec 02 18:36:23 crc kubenswrapper[4878]: E1202 18:36:23.435547 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="090140ec-1e0c-43b4-b71e-bbe2f9d45281" containerName="console" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.435570 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="090140ec-1e0c-43b4-b71e-bbe2f9d45281" containerName="console" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.435792 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="090140ec-1e0c-43b4-b71e-bbe2f9d45281" containerName="console" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.436789 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9766-account-create-update-cs8kz" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.443819 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.446392 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9766-account-create-update-cs8kz"] Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.452177 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/090140ec-1e0c-43b4-b71e-bbe2f9d45281-console-oauth-config\") pod \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.452267 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/090140ec-1e0c-43b4-b71e-bbe2f9d45281-console-serving-cert\") pod \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.452335 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-service-ca\") pod \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.452367 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq9v8\" (UniqueName: \"kubernetes.io/projected/090140ec-1e0c-43b4-b71e-bbe2f9d45281-kube-api-access-dq9v8\") pod \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.452497 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-oauth-serving-cert\") pod \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.452522 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-trusted-ca-bundle\") pod \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.452600 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-console-config\") pod \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\" (UID: \"090140ec-1e0c-43b4-b71e-bbe2f9d45281\") " Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.453032 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l9vd\" (UniqueName: \"kubernetes.io/projected/1c4a31c7-8336-4fc3-b5f9-7614e75b3c65-kube-api-access-4l9vd\") pod \"placement-db-create-dkzxj\" (UID: \"1c4a31c7-8336-4fc3-b5f9-7614e75b3c65\") " pod="openstack/placement-db-create-dkzxj" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.453153 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c4a31c7-8336-4fc3-b5f9-7614e75b3c65-operator-scripts\") pod \"placement-db-create-dkzxj\" (UID: \"1c4a31c7-8336-4fc3-b5f9-7614e75b3c65\") " pod="openstack/placement-db-create-dkzxj" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.455345 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-service-ca" 
(OuterVolumeSpecName: "service-ca") pod "090140ec-1e0c-43b4-b71e-bbe2f9d45281" (UID: "090140ec-1e0c-43b4-b71e-bbe2f9d45281"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.457415 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "090140ec-1e0c-43b4-b71e-bbe2f9d45281" (UID: "090140ec-1e0c-43b4-b71e-bbe2f9d45281"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.457974 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "090140ec-1e0c-43b4-b71e-bbe2f9d45281" (UID: "090140ec-1e0c-43b4-b71e-bbe2f9d45281"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.460318 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/090140ec-1e0c-43b4-b71e-bbe2f9d45281-kube-api-access-dq9v8" (OuterVolumeSpecName: "kube-api-access-dq9v8") pod "090140ec-1e0c-43b4-b71e-bbe2f9d45281" (UID: "090140ec-1e0c-43b4-b71e-bbe2f9d45281"). InnerVolumeSpecName "kube-api-access-dq9v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.466528 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-console-config" (OuterVolumeSpecName: "console-config") pod "090140ec-1e0c-43b4-b71e-bbe2f9d45281" (UID: "090140ec-1e0c-43b4-b71e-bbe2f9d45281"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.472441 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090140ec-1e0c-43b4-b71e-bbe2f9d45281-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "090140ec-1e0c-43b4-b71e-bbe2f9d45281" (UID: "090140ec-1e0c-43b4-b71e-bbe2f9d45281"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.484952 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090140ec-1e0c-43b4-b71e-bbe2f9d45281-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "090140ec-1e0c-43b4-b71e-bbe2f9d45281" (UID: "090140ec-1e0c-43b4-b71e-bbe2f9d45281"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.563212 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz8zc\" (UniqueName: \"kubernetes.io/projected/84a2e351-13b3-47a9-b084-b0aa69b245ca-kube-api-access-tz8zc\") pod \"placement-9766-account-create-update-cs8kz\" (UID: \"84a2e351-13b3-47a9-b084-b0aa69b245ca\") " pod="openstack/placement-9766-account-create-update-cs8kz" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.563417 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l9vd\" (UniqueName: \"kubernetes.io/projected/1c4a31c7-8336-4fc3-b5f9-7614e75b3c65-kube-api-access-4l9vd\") pod \"placement-db-create-dkzxj\" (UID: \"1c4a31c7-8336-4fc3-b5f9-7614e75b3c65\") " pod="openstack/placement-db-create-dkzxj" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.563939 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1c4a31c7-8336-4fc3-b5f9-7614e75b3c65-operator-scripts\") pod \"placement-db-create-dkzxj\" (UID: \"1c4a31c7-8336-4fc3-b5f9-7614e75b3c65\") " pod="openstack/placement-db-create-dkzxj" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.564002 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84a2e351-13b3-47a9-b084-b0aa69b245ca-operator-scripts\") pod \"placement-9766-account-create-update-cs8kz\" (UID: \"84a2e351-13b3-47a9-b084-b0aa69b245ca\") " pod="openstack/placement-9766-account-create-update-cs8kz" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.564320 4878 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-console-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.564343 4878 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/090140ec-1e0c-43b4-b71e-bbe2f9d45281-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.564390 4878 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/090140ec-1e0c-43b4-b71e-bbe2f9d45281-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.564400 4878 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.564410 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq9v8\" (UniqueName: \"kubernetes.io/projected/090140ec-1e0c-43b4-b71e-bbe2f9d45281-kube-api-access-dq9v8\") on node \"crc\" 
DevicePath \"\"" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.564422 4878 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.564431 4878 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/090140ec-1e0c-43b4-b71e-bbe2f9d45281-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.599318 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c4a31c7-8336-4fc3-b5f9-7614e75b3c65-operator-scripts\") pod \"placement-db-create-dkzxj\" (UID: \"1c4a31c7-8336-4fc3-b5f9-7614e75b3c65\") " pod="openstack/placement-db-create-dkzxj" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.616596 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l9vd\" (UniqueName: \"kubernetes.io/projected/1c4a31c7-8336-4fc3-b5f9-7614e75b3c65-kube-api-access-4l9vd\") pod \"placement-db-create-dkzxj\" (UID: \"1c4a31c7-8336-4fc3-b5f9-7614e75b3c65\") " pod="openstack/placement-db-create-dkzxj" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.661398 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-95c8-account-create-update-szl82" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.673767 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz8zc\" (UniqueName: \"kubernetes.io/projected/84a2e351-13b3-47a9-b084-b0aa69b245ca-kube-api-access-tz8zc\") pod \"placement-9766-account-create-update-cs8kz\" (UID: \"84a2e351-13b3-47a9-b084-b0aa69b245ca\") " pod="openstack/placement-9766-account-create-update-cs8kz" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.673942 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84a2e351-13b3-47a9-b084-b0aa69b245ca-operator-scripts\") pod \"placement-9766-account-create-update-cs8kz\" (UID: \"84a2e351-13b3-47a9-b084-b0aa69b245ca\") " pod="openstack/placement-9766-account-create-update-cs8kz" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.674697 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84a2e351-13b3-47a9-b084-b0aa69b245ca-operator-scripts\") pod \"placement-9766-account-create-update-cs8kz\" (UID: \"84a2e351-13b3-47a9-b084-b0aa69b245ca\") " pod="openstack/placement-9766-account-create-update-cs8kz" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.709871 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz8zc\" (UniqueName: \"kubernetes.io/projected/84a2e351-13b3-47a9-b084-b0aa69b245ca-kube-api-access-tz8zc\") pod \"placement-9766-account-create-update-cs8kz\" (UID: \"84a2e351-13b3-47a9-b084-b0aa69b245ca\") " pod="openstack/placement-9766-account-create-update-cs8kz" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.775203 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4f4r\" (UniqueName: 
\"kubernetes.io/projected/41728d08-e232-4713-ba93-162894335f4c-kube-api-access-c4f4r\") pod \"41728d08-e232-4713-ba93-162894335f4c\" (UID: \"41728d08-e232-4713-ba93-162894335f4c\") " Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.775558 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41728d08-e232-4713-ba93-162894335f4c-operator-scripts\") pod \"41728d08-e232-4713-ba93-162894335f4c\" (UID: \"41728d08-e232-4713-ba93-162894335f4c\") " Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.776614 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41728d08-e232-4713-ba93-162894335f4c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41728d08-e232-4713-ba93-162894335f4c" (UID: "41728d08-e232-4713-ba93-162894335f4c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.780057 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41728d08-e232-4713-ba93-162894335f4c-kube-api-access-c4f4r" (OuterVolumeSpecName: "kube-api-access-c4f4r") pod "41728d08-e232-4713-ba93-162894335f4c" (UID: "41728d08-e232-4713-ba93-162894335f4c"). InnerVolumeSpecName "kube-api-access-c4f4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.801526 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9766-account-create-update-cs8kz" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.819958 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-qswsn"] Dec 02 18:36:23 crc kubenswrapper[4878]: W1202 18:36:23.860512 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3ed7121_1f6d_4f72_81a7_a25c9f98304b.slice/crio-d3acb6e405401e3b6160752e888b5de6634b4905c042bdbf22718ef3ee921bfd WatchSource:0}: Error finding container d3acb6e405401e3b6160752e888b5de6634b4905c042bdbf22718ef3ee921bfd: Status 404 returned error can't find the container with id d3acb6e405401e3b6160752e888b5de6634b4905c042bdbf22718ef3ee921bfd Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.879087 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41728d08-e232-4713-ba93-162894335f4c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.879132 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4f4r\" (UniqueName: \"kubernetes.io/projected/41728d08-e232-4713-ba93-162894335f4c-kube-api-access-c4f4r\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.916459 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-dkzxj" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.965105 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59b7b6c866-l2c75_090140ec-1e0c-43b4-b71e-bbe2f9d45281/console/0.log" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.965293 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59b7b6c866-l2c75" event={"ID":"090140ec-1e0c-43b4-b71e-bbe2f9d45281","Type":"ContainerDied","Data":"53e5f4dd68fda8971d31ca72683f6ea25b216f0df5f1d15216f5179f6d51965a"} Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.965309 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59b7b6c866-l2c75" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.965367 4878 scope.go:117] "RemoveContainer" containerID="4d3a8810192f466bf2c48eb8e750538f1e1c5a7a09a64efdea03ed930276d53d" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.974663 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qswsn" event={"ID":"f3ed7121-1f6d-4f72-81a7-a25c9f98304b","Type":"ContainerStarted","Data":"d3acb6e405401e3b6160752e888b5de6634b4905c042bdbf22718ef3ee921bfd"} Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.977488 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"af53983f-772c-431c-95a1-af6b3d3c0edf","Type":"ContainerStarted","Data":"e07bf5717ad7385b9dda6f7c38345b48b3929cc451e8613fcfa449c677ccf153"} Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.979623 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-xzscw" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.980385 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-95c8-account-create-update-szl82" Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.981224 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-95c8-account-create-update-szl82" event={"ID":"41728d08-e232-4713-ba93-162894335f4c","Type":"ContainerDied","Data":"3fb3e3a023b0a1f7bb207614cee7a4c9ed5193deb7d9c4c209419ad1474dcf73"} Dec 02 18:36:23 crc kubenswrapper[4878]: I1202 18:36:23.981519 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fb3e3a023b0a1f7bb207614cee7a4c9ed5193deb7d9c4c209419ad1474dcf73" Dec 02 18:36:24 crc kubenswrapper[4878]: I1202 18:36:24.033319 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59b7b6c866-l2c75"] Dec 02 18:36:24 crc kubenswrapper[4878]: I1202 18:36:24.048060 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-59b7b6c866-l2c75"] Dec 02 18:36:24 crc kubenswrapper[4878]: I1202 18:36:24.060812 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3140-account-create-update-z8jqc"] Dec 02 18:36:24 crc kubenswrapper[4878]: I1202 18:36:24.383053 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9766-account-create-update-cs8kz"] Dec 02 18:36:24 crc kubenswrapper[4878]: I1202 18:36:24.635776 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-dkzxj"] Dec 02 18:36:24 crc kubenswrapper[4878]: W1202 18:36:24.650504 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c4a31c7_8336_4fc3_b5f9_7614e75b3c65.slice/crio-6beb24861e1afd17b567908a0663281c13f144f56629cc7f5f7cf0ea6d290927 WatchSource:0}: Error finding container 6beb24861e1afd17b567908a0663281c13f144f56629cc7f5f7cf0ea6d290927: Status 404 returned error can't find the container with id 
6beb24861e1afd17b567908a0663281c13f144f56629cc7f5f7cf0ea6d290927 Dec 02 18:36:24 crc kubenswrapper[4878]: I1202 18:36:24.964233 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="090140ec-1e0c-43b4-b71e-bbe2f9d45281" path="/var/lib/kubelet/pods/090140ec-1e0c-43b4-b71e-bbe2f9d45281/volumes" Dec 02 18:36:24 crc kubenswrapper[4878]: I1202 18:36:24.994652 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3140-account-create-update-z8jqc" event={"ID":"18318a2e-0ad8-49cc-bb0b-30a5fbc7604c","Type":"ContainerStarted","Data":"57d8e0fbcc18800318007afa266f816efa2832719ca31b15548a9e0773166b82"} Dec 02 18:36:24 crc kubenswrapper[4878]: I1202 18:36:24.994723 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3140-account-create-update-z8jqc" event={"ID":"18318a2e-0ad8-49cc-bb0b-30a5fbc7604c","Type":"ContainerStarted","Data":"87538ce0ce068c5ab54ff017fd4487e737d7bcf9b634738bc18ef464f9bf2658"} Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.002840 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"af53983f-772c-431c-95a1-af6b3d3c0edf","Type":"ContainerStarted","Data":"e0571e248829796941c9e987d2958c33dfb371f0b01b3066470f5e70ec2a271a"} Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.003657 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.007886 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dkzxj" event={"ID":"1c4a31c7-8336-4fc3-b5f9-7614e75b3c65","Type":"ContainerStarted","Data":"acddf47d8dc17fbf73f5066ed0ee03e768da7e323d63ee095239fea44de8860b"} Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.007932 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dkzxj" 
event={"ID":"1c4a31c7-8336-4fc3-b5f9-7614e75b3c65","Type":"ContainerStarted","Data":"6beb24861e1afd17b567908a0663281c13f144f56629cc7f5f7cf0ea6d290927"} Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.012699 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qswsn" event={"ID":"f3ed7121-1f6d-4f72-81a7-a25c9f98304b","Type":"ContainerStarted","Data":"b078473c94ef910d7a256927181d9df5d9ce59f39541b2a61d3dd35020ce14e8"} Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.023015 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-3140-account-create-update-z8jqc" podStartSLOduration=3.022995254 podStartE2EDuration="3.022995254s" podCreationTimestamp="2025-12-02 18:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:36:25.018167505 +0000 UTC m=+1294.707786406" watchObservedRunningTime="2025-12-02 18:36:25.022995254 +0000 UTC m=+1294.712614155" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.025580 4878 generic.go:334] "Generic (PLEG): container finished" podID="168172d2-5cc8-492f-aa26-bd2a1351cdf2" containerID="f47b77d298db6e299054ba2f8bdad64cb6c8cb2bbfbcbc308213e7b1e3765941" exitCode=0 Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.025681 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"168172d2-5cc8-492f-aa26-bd2a1351cdf2","Type":"ContainerDied","Data":"f47b77d298db6e299054ba2f8bdad64cb6c8cb2bbfbcbc308213e7b1e3765941"} Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.027998 4878 generic.go:334] "Generic (PLEG): container finished" podID="b8ce834c-073d-4062-b3ee-488fa79aae4f" containerID="33bf4bb167f90d4156d2bdcea9dab49b9f0ec8a779f82ec97a18a80fccfefc6a" exitCode=0 Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.028061 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"b8ce834c-073d-4062-b3ee-488fa79aae4f","Type":"ContainerDied","Data":"33bf4bb167f90d4156d2bdcea9dab49b9f0ec8a779f82ec97a18a80fccfefc6a"} Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.046764 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9766-account-create-update-cs8kz" event={"ID":"84a2e351-13b3-47a9-b084-b0aa69b245ca","Type":"ContainerStarted","Data":"67c773fa1b2388f25773245fceb5fa2df463d68bd0d343a40c4c7a4214c8fed8"} Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.046856 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9766-account-create-update-cs8kz" event={"ID":"84a2e351-13b3-47a9-b084-b0aa69b245ca","Type":"ContainerStarted","Data":"47b6f733a3bdad862e234fe8061ce60b4639bec577b7a2872cd7ebdfe8c727fa"} Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.062871 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3bc2339a-313d-485a-b67f-d18b597c36e5","Type":"ContainerStarted","Data":"bd37ed7af3c342730a3178bc613dab9f0723342a4e89cdeb62d8567806f3f9e2"} Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.065216 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-dkzxj" podStartSLOduration=2.065154032 podStartE2EDuration="2.065154032s" podCreationTimestamp="2025-12-02 18:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:36:25.036683879 +0000 UTC m=+1294.726302760" watchObservedRunningTime="2025-12-02 18:36:25.065154032 +0000 UTC m=+1294.754772913" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.067075 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.106214 4878 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.509561619 podStartE2EDuration="5.106179905s" podCreationTimestamp="2025-12-02 18:36:20 +0000 UTC" firstStartedPulling="2025-12-02 18:36:21.265207169 +0000 UTC m=+1290.954826050" lastFinishedPulling="2025-12-02 18:36:22.861825455 +0000 UTC m=+1292.551444336" observedRunningTime="2025-12-02 18:36:25.070169888 +0000 UTC m=+1294.759788789" watchObservedRunningTime="2025-12-02 18:36:25.106179905 +0000 UTC m=+1294.795798786" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.117026 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-qswsn" podStartSLOduration=3.117005311 podStartE2EDuration="3.117005311s" podCreationTimestamp="2025-12-02 18:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:36:25.091152079 +0000 UTC m=+1294.780770950" watchObservedRunningTime="2025-12-02 18:36:25.117005311 +0000 UTC m=+1294.806624192" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.148020 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-9766-account-create-update-cs8kz" podStartSLOduration=2.147997513 podStartE2EDuration="2.147997513s" podCreationTimestamp="2025-12-02 18:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:36:25.109469827 +0000 UTC m=+1294.799088728" watchObservedRunningTime="2025-12-02 18:36:25.147997513 +0000 UTC m=+1294.837616394" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.198783 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=26.541920368 podStartE2EDuration="1m0.198759867s" podCreationTimestamp="2025-12-02 18:35:25 +0000 UTC" firstStartedPulling="2025-12-02 18:35:49.821099967 +0000 
UTC m=+1259.510718848" lastFinishedPulling="2025-12-02 18:36:23.477939466 +0000 UTC m=+1293.167558347" observedRunningTime="2025-12-02 18:36:25.191585934 +0000 UTC m=+1294.881204825" watchObservedRunningTime="2025-12-02 18:36:25.198759867 +0000 UTC m=+1294.888378748" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.294041 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-p6mn2"] Dec 02 18:36:25 crc kubenswrapper[4878]: E1202 18:36:25.294585 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41728d08-e232-4713-ba93-162894335f4c" containerName="mariadb-account-create-update" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.294607 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="41728d08-e232-4713-ba93-162894335f4c" containerName="mariadb-account-create-update" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.294834 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="41728d08-e232-4713-ba93-162894335f4c" containerName="mariadb-account-create-update" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.295827 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-p6mn2" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.317204 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-p6mn2"] Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.333222 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsw4z\" (UniqueName: \"kubernetes.io/projected/aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae-kube-api-access-bsw4z\") pod \"mysqld-exporter-openstack-db-create-p6mn2\" (UID: \"aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae\") " pod="openstack/mysqld-exporter-openstack-db-create-p6mn2" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.334369 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-p6mn2\" (UID: \"aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae\") " pod="openstack/mysqld-exporter-openstack-db-create-p6mn2" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.437702 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-p6mn2\" (UID: \"aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae\") " pod="openstack/mysqld-exporter-openstack-db-create-p6mn2" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.437891 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsw4z\" (UniqueName: \"kubernetes.io/projected/aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae-kube-api-access-bsw4z\") pod \"mysqld-exporter-openstack-db-create-p6mn2\" (UID: \"aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae\") " pod="openstack/mysqld-exporter-openstack-db-create-p6mn2" Dec 
02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.439145 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-p6mn2\" (UID: \"aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae\") " pod="openstack/mysqld-exporter-openstack-db-create-p6mn2" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.460525 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsw4z\" (UniqueName: \"kubernetes.io/projected/aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae-kube-api-access-bsw4z\") pod \"mysqld-exporter-openstack-db-create-p6mn2\" (UID: \"aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae\") " pod="openstack/mysqld-exporter-openstack-db-create-p6mn2" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.540565 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-jdgcc"] Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.542630 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.571255 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jdgcc"] Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.614353 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-397e-account-create-update-88k7b"] Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.616873 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-397e-account-create-update-88k7b" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.621385 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.642728 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2nfw\" (UniqueName: \"kubernetes.io/projected/f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8-kube-api-access-t2nfw\") pod \"mysqld-exporter-397e-account-create-update-88k7b\" (UID: \"f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8\") " pod="openstack/mysqld-exporter-397e-account-create-update-88k7b" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.642790 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-config\") pod \"dnsmasq-dns-698758b865-jdgcc\" (UID: \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\") " pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.642863 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-dns-svc\") pod \"dnsmasq-dns-698758b865-jdgcc\" (UID: \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\") " pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.642885 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jdgcc\" (UID: \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\") " pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.642938 4878 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8-operator-scripts\") pod \"mysqld-exporter-397e-account-create-update-88k7b\" (UID: \"f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8\") " pod="openstack/mysqld-exporter-397e-account-create-update-88k7b" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.642961 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp82h\" (UniqueName: \"kubernetes.io/projected/d37e6b48-32b3-4471-aa2e-4893d1a7c329-kube-api-access-sp82h\") pod \"dnsmasq-dns-698758b865-jdgcc\" (UID: \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\") " pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.643029 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jdgcc\" (UID: \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\") " pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.647642 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-397e-account-create-update-88k7b"] Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.652426 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-p6mn2" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.744806 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-dns-svc\") pod \"dnsmasq-dns-698758b865-jdgcc\" (UID: \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\") " pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.744853 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jdgcc\" (UID: \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\") " pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.744913 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8-operator-scripts\") pod \"mysqld-exporter-397e-account-create-update-88k7b\" (UID: \"f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8\") " pod="openstack/mysqld-exporter-397e-account-create-update-88k7b" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.744935 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp82h\" (UniqueName: \"kubernetes.io/projected/d37e6b48-32b3-4471-aa2e-4893d1a7c329-kube-api-access-sp82h\") pod \"dnsmasq-dns-698758b865-jdgcc\" (UID: \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\") " pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.745013 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jdgcc\" (UID: 
\"d37e6b48-32b3-4471-aa2e-4893d1a7c329\") " pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.745061 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2nfw\" (UniqueName: \"kubernetes.io/projected/f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8-kube-api-access-t2nfw\") pod \"mysqld-exporter-397e-account-create-update-88k7b\" (UID: \"f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8\") " pod="openstack/mysqld-exporter-397e-account-create-update-88k7b" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.745088 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-config\") pod \"dnsmasq-dns-698758b865-jdgcc\" (UID: \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\") " pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.745801 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8-operator-scripts\") pod \"mysqld-exporter-397e-account-create-update-88k7b\" (UID: \"f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8\") " pod="openstack/mysqld-exporter-397e-account-create-update-88k7b" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.745879 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jdgcc\" (UID: \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\") " pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.746063 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jdgcc\" 
(UID: \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\") " pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.746327 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-config\") pod \"dnsmasq-dns-698758b865-jdgcc\" (UID: \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\") " pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.746903 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-dns-svc\") pod \"dnsmasq-dns-698758b865-jdgcc\" (UID: \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\") " pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.770673 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp82h\" (UniqueName: \"kubernetes.io/projected/d37e6b48-32b3-4471-aa2e-4893d1a7c329-kube-api-access-sp82h\") pod \"dnsmasq-dns-698758b865-jdgcc\" (UID: \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\") " pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.790550 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2nfw\" (UniqueName: \"kubernetes.io/projected/f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8-kube-api-access-t2nfw\") pod \"mysqld-exporter-397e-account-create-update-88k7b\" (UID: \"f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8\") " pod="openstack/mysqld-exporter-397e-account-create-update-88k7b" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.883126 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:36:25 crc kubenswrapper[4878]: I1202 18:36:25.968516 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-397e-account-create-update-88k7b" Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.082124 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b8ce834c-073d-4062-b3ee-488fa79aae4f","Type":"ContainerStarted","Data":"ef6402595a754f87415db1eafef9e466691a9112edf2bc427e42ae1c310ef752"} Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.083528 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.095587 4878 generic.go:334] "Generic (PLEG): container finished" podID="1c4a31c7-8336-4fc3-b5f9-7614e75b3c65" containerID="acddf47d8dc17fbf73f5066ed0ee03e768da7e323d63ee095239fea44de8860b" exitCode=0 Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.095737 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dkzxj" event={"ID":"1c4a31c7-8336-4fc3-b5f9-7614e75b3c65","Type":"ContainerDied","Data":"acddf47d8dc17fbf73f5066ed0ee03e768da7e323d63ee095239fea44de8860b"} Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.108424 4878 generic.go:334] "Generic (PLEG): container finished" podID="84a2e351-13b3-47a9-b084-b0aa69b245ca" containerID="67c773fa1b2388f25773245fceb5fa2df463d68bd0d343a40c4c7a4214c8fed8" exitCode=0 Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.108992 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9766-account-create-update-cs8kz" event={"ID":"84a2e351-13b3-47a9-b084-b0aa69b245ca","Type":"ContainerDied","Data":"67c773fa1b2388f25773245fceb5fa2df463d68bd0d343a40c4c7a4214c8fed8"} Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.119040 4878 generic.go:334] "Generic (PLEG): container finished" podID="f3ed7121-1f6d-4f72-81a7-a25c9f98304b" containerID="b078473c94ef910d7a256927181d9df5d9ce59f39541b2a61d3dd35020ce14e8" exitCode=0 Dec 02 18:36:26 crc 
kubenswrapper[4878]: I1202 18:36:26.119134 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qswsn" event={"ID":"f3ed7121-1f6d-4f72-81a7-a25c9f98304b","Type":"ContainerDied","Data":"b078473c94ef910d7a256927181d9df5d9ce59f39541b2a61d3dd35020ce14e8"} Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.150917 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.866746843 podStartE2EDuration="1m8.150893881s" podCreationTimestamp="2025-12-02 18:35:18 +0000 UTC" firstStartedPulling="2025-12-02 18:35:21.154483866 +0000 UTC m=+1230.844102737" lastFinishedPulling="2025-12-02 18:35:48.438630874 +0000 UTC m=+1258.128249775" observedRunningTime="2025-12-02 18:36:26.127531227 +0000 UTC m=+1295.817150108" watchObservedRunningTime="2025-12-02 18:36:26.150893881 +0000 UTC m=+1295.840512762" Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.154614 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"168172d2-5cc8-492f-aa26-bd2a1351cdf2","Type":"ContainerStarted","Data":"b33c5557685167291d2cada300c01f364fcc4cc0cdd87196cad4fdf4c3468440"} Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.155048 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.173757 4878 generic.go:334] "Generic (PLEG): container finished" podID="18318a2e-0ad8-49cc-bb0b-30a5fbc7604c" containerID="57d8e0fbcc18800318007afa266f816efa2832719ca31b15548a9e0773166b82" exitCode=0 Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.173983 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3140-account-create-update-z8jqc" event={"ID":"18318a2e-0ad8-49cc-bb0b-30a5fbc7604c","Type":"ContainerDied","Data":"57d8e0fbcc18800318007afa266f816efa2832719ca31b15548a9e0773166b82"} Dec 02 18:36:26 crc 
kubenswrapper[4878]: I1202 18:36:26.311408 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.087008588 podStartE2EDuration="1m8.3113845s" podCreationTimestamp="2025-12-02 18:35:18 +0000 UTC" firstStartedPulling="2025-12-02 18:35:20.793173529 +0000 UTC m=+1230.482792400" lastFinishedPulling="2025-12-02 18:35:49.017549431 +0000 UTC m=+1258.707168312" observedRunningTime="2025-12-02 18:36:26.300146301 +0000 UTC m=+1295.989765192" watchObservedRunningTime="2025-12-02 18:36:26.3113845 +0000 UTC m=+1296.001003381" Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.371039 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-p6mn2"] Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.696918 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jdgcc"] Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.754969 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.946121 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-397e-account-create-update-88k7b"] Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.948263 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.961160 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.961510 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.961643 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 02 18:36:26 crc kubenswrapper[4878]: I1202 18:36:26.961879 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-fj6nk" Dec 02 18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.044599 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 02 18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.165929 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-cache\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.167104 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58r9b\" (UniqueName: \"kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-kube-api-access-58r9b\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.167484 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 
18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.167590 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.168318 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-lock\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.192271 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-397e-account-create-update-88k7b" event={"ID":"f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8","Type":"ContainerStarted","Data":"6d886665ccee24f88e4d7f8ae19ace5a40b439f8b69646ffc6f0b690af92e9e7"} Dec 02 18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.201421 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jdgcc" event={"ID":"d37e6b48-32b3-4471-aa2e-4893d1a7c329","Type":"ContainerStarted","Data":"6336d50995211c2bf83cd1a43232bc8d16b7bd8dd32db6a31a318edd60d42961"} Dec 02 18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.221417 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-p6mn2" event={"ID":"aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae","Type":"ContainerStarted","Data":"8b63053f335504bdf190c08d78bc636147acbca4ef391d17ac424c355cf31238"} Dec 02 18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.221465 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-p6mn2" 
event={"ID":"aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae","Type":"ContainerStarted","Data":"dc43c2aecafecfae30fe863d333ca18a5536ebad4e8e611b7567736e1e20a441"} Dec 02 18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.273373 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-lock\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.273520 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-cache\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.273558 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58r9b\" (UniqueName: \"kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-kube-api-access-58r9b\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.273586 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.273617 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 18:36:27 crc kubenswrapper[4878]: E1202 18:36:27.273945 4878 projected.go:288] 
Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 18:36:27 crc kubenswrapper[4878]: E1202 18:36:27.273965 4878 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 18:36:27 crc kubenswrapper[4878]: E1202 18:36:27.274061 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift podName:5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f nodeName:}" failed. No retries permitted until 2025-12-02 18:36:27.77400525 +0000 UTC m=+1297.463624131 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift") pod "swift-storage-0" (UID: "5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f") : configmap "swift-ring-files" not found Dec 02 18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.274718 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-cache\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.274873 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Dec 02 18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.276263 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-lock\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 18:36:27 crc 
kubenswrapper[4878]: I1202 18:36:27.325556 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58r9b\" (UniqueName: \"kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-kube-api-access-58r9b\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.370256 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-openstack-db-create-p6mn2" podStartSLOduration=2.370211575 podStartE2EDuration="2.370211575s" podCreationTimestamp="2025-12-02 18:36:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:36:27.290783361 +0000 UTC m=+1296.980402242" watchObservedRunningTime="2025-12-02 18:36:27.370211575 +0000 UTC m=+1297.059830466" Dec 02 18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.501072 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 18:36:27 crc kubenswrapper[4878]: I1202 18:36:27.786101 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 18:36:27 crc kubenswrapper[4878]: E1202 18:36:27.786386 4878 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 18:36:27 crc kubenswrapper[4878]: E1202 18:36:27.786689 4878 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not 
found Dec 02 18:36:27 crc kubenswrapper[4878]: E1202 18:36:27.786848 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift podName:5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f nodeName:}" failed. No retries permitted until 2025-12-02 18:36:28.786823058 +0000 UTC m=+1298.476441939 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift") pod "swift-storage-0" (UID: "5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f") : configmap "swift-ring-files" not found Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.074865 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9766-account-create-update-cs8kz" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.088003 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3140-account-create-update-z8jqc" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.196791 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f7h4\" (UniqueName: \"kubernetes.io/projected/18318a2e-0ad8-49cc-bb0b-30a5fbc7604c-kube-api-access-7f7h4\") pod \"18318a2e-0ad8-49cc-bb0b-30a5fbc7604c\" (UID: \"18318a2e-0ad8-49cc-bb0b-30a5fbc7604c\") " Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.197158 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18318a2e-0ad8-49cc-bb0b-30a5fbc7604c-operator-scripts\") pod \"18318a2e-0ad8-49cc-bb0b-30a5fbc7604c\" (UID: \"18318a2e-0ad8-49cc-bb0b-30a5fbc7604c\") " Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.197186 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/84a2e351-13b3-47a9-b084-b0aa69b245ca-operator-scripts\") pod \"84a2e351-13b3-47a9-b084-b0aa69b245ca\" (UID: \"84a2e351-13b3-47a9-b084-b0aa69b245ca\") " Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.197324 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz8zc\" (UniqueName: \"kubernetes.io/projected/84a2e351-13b3-47a9-b084-b0aa69b245ca-kube-api-access-tz8zc\") pod \"84a2e351-13b3-47a9-b084-b0aa69b245ca\" (UID: \"84a2e351-13b3-47a9-b084-b0aa69b245ca\") " Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.198468 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18318a2e-0ad8-49cc-bb0b-30a5fbc7604c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "18318a2e-0ad8-49cc-bb0b-30a5fbc7604c" (UID: "18318a2e-0ad8-49cc-bb0b-30a5fbc7604c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.199047 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84a2e351-13b3-47a9-b084-b0aa69b245ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84a2e351-13b3-47a9-b084-b0aa69b245ca" (UID: "84a2e351-13b3-47a9-b084-b0aa69b245ca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.211693 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84a2e351-13b3-47a9-b084-b0aa69b245ca-kube-api-access-tz8zc" (OuterVolumeSpecName: "kube-api-access-tz8zc") pod "84a2e351-13b3-47a9-b084-b0aa69b245ca" (UID: "84a2e351-13b3-47a9-b084-b0aa69b245ca"). InnerVolumeSpecName "kube-api-access-tz8zc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.212687 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18318a2e-0ad8-49cc-bb0b-30a5fbc7604c-kube-api-access-7f7h4" (OuterVolumeSpecName: "kube-api-access-7f7h4") pod "18318a2e-0ad8-49cc-bb0b-30a5fbc7604c" (UID: "18318a2e-0ad8-49cc-bb0b-30a5fbc7604c"). InnerVolumeSpecName "kube-api-access-7f7h4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.257236 4878 generic.go:334] "Generic (PLEG): container finished" podID="f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8" containerID="5b75f46b42bf9e76396fc3715b5ecce3c4dc5aff981504b479d7ee504e022a2a" exitCode=0 Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.258833 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-397e-account-create-update-88k7b" event={"ID":"f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8","Type":"ContainerDied","Data":"5b75f46b42bf9e76396fc3715b5ecce3c4dc5aff981504b479d7ee504e022a2a"} Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.261432 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qswsn" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.286647 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9766-account-create-update-cs8kz" event={"ID":"84a2e351-13b3-47a9-b084-b0aa69b245ca","Type":"ContainerDied","Data":"47b6f733a3bdad862e234fe8061ce60b4639bec577b7a2872cd7ebdfe8c727fa"} Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.286956 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47b6f733a3bdad862e234fe8061ce60b4639bec577b7a2872cd7ebdfe8c727fa" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.287098 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9766-account-create-update-cs8kz" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.302621 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84a2e351-13b3-47a9-b084-b0aa69b245ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.302790 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18318a2e-0ad8-49cc-bb0b-30a5fbc7604c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.302811 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz8zc\" (UniqueName: \"kubernetes.io/projected/84a2e351-13b3-47a9-b084-b0aa69b245ca-kube-api-access-tz8zc\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.302821 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f7h4\" (UniqueName: \"kubernetes.io/projected/18318a2e-0ad8-49cc-bb0b-30a5fbc7604c-kube-api-access-7f7h4\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.315921 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-qswsn" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.316618 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qswsn" event={"ID":"f3ed7121-1f6d-4f72-81a7-a25c9f98304b","Type":"ContainerDied","Data":"d3acb6e405401e3b6160752e888b5de6634b4905c042bdbf22718ef3ee921bfd"} Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.316670 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3acb6e405401e3b6160752e888b5de6634b4905c042bdbf22718ef3ee921bfd" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.324225 4878 generic.go:334] "Generic (PLEG): container finished" podID="d37e6b48-32b3-4471-aa2e-4893d1a7c329" containerID="3b563374a340d5afa7894020a9e8459ec6324fd14c0e17d0b2d8285115781d89" exitCode=0 Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.324380 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jdgcc" event={"ID":"d37e6b48-32b3-4471-aa2e-4893d1a7c329","Type":"ContainerDied","Data":"3b563374a340d5afa7894020a9e8459ec6324fd14c0e17d0b2d8285115781d89"} Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.338828 4878 generic.go:334] "Generic (PLEG): container finished" podID="aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae" containerID="8b63053f335504bdf190c08d78bc636147acbca4ef391d17ac424c355cf31238" exitCode=0 Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.338983 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-p6mn2" event={"ID":"aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae","Type":"ContainerDied","Data":"8b63053f335504bdf190c08d78bc636147acbca4ef391d17ac424c355cf31238"} Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.344653 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3140-account-create-update-z8jqc" 
event={"ID":"18318a2e-0ad8-49cc-bb0b-30a5fbc7604c","Type":"ContainerDied","Data":"87538ce0ce068c5ab54ff017fd4487e737d7bcf9b634738bc18ef464f9bf2658"} Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.344701 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87538ce0ce068c5ab54ff017fd4487e737d7bcf9b634738bc18ef464f9bf2658" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.344778 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3140-account-create-update-z8jqc" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.404535 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv8vh\" (UniqueName: \"kubernetes.io/projected/f3ed7121-1f6d-4f72-81a7-a25c9f98304b-kube-api-access-zv8vh\") pod \"f3ed7121-1f6d-4f72-81a7-a25c9f98304b\" (UID: \"f3ed7121-1f6d-4f72-81a7-a25c9f98304b\") " Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.404905 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3ed7121-1f6d-4f72-81a7-a25c9f98304b-operator-scripts\") pod \"f3ed7121-1f6d-4f72-81a7-a25c9f98304b\" (UID: \"f3ed7121-1f6d-4f72-81a7-a25c9f98304b\") " Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.406909 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3ed7121-1f6d-4f72-81a7-a25c9f98304b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f3ed7121-1f6d-4f72-81a7-a25c9f98304b" (UID: "f3ed7121-1f6d-4f72-81a7-a25c9f98304b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.410862 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3ed7121-1f6d-4f72-81a7-a25c9f98304b-kube-api-access-zv8vh" (OuterVolumeSpecName: "kube-api-access-zv8vh") pod "f3ed7121-1f6d-4f72-81a7-a25c9f98304b" (UID: "f3ed7121-1f6d-4f72-81a7-a25c9f98304b"). InnerVolumeSpecName "kube-api-access-zv8vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.445879 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dkzxj" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.510707 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3ed7121-1f6d-4f72-81a7-a25c9f98304b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.510752 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv8vh\" (UniqueName: \"kubernetes.io/projected/f3ed7121-1f6d-4f72-81a7-a25c9f98304b-kube-api-access-zv8vh\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.612742 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c4a31c7-8336-4fc3-b5f9-7614e75b3c65-operator-scripts\") pod \"1c4a31c7-8336-4fc3-b5f9-7614e75b3c65\" (UID: \"1c4a31c7-8336-4fc3-b5f9-7614e75b3c65\") " Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.612956 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l9vd\" (UniqueName: \"kubernetes.io/projected/1c4a31c7-8336-4fc3-b5f9-7614e75b3c65-kube-api-access-4l9vd\") pod \"1c4a31c7-8336-4fc3-b5f9-7614e75b3c65\" (UID: \"1c4a31c7-8336-4fc3-b5f9-7614e75b3c65\") " Dec 02 
18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.613359 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c4a31c7-8336-4fc3-b5f9-7614e75b3c65-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c4a31c7-8336-4fc3-b5f9-7614e75b3c65" (UID: "1c4a31c7-8336-4fc3-b5f9-7614e75b3c65"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.614739 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c4a31c7-8336-4fc3-b5f9-7614e75b3c65-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.617601 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c4a31c7-8336-4fc3-b5f9-7614e75b3c65-kube-api-access-4l9vd" (OuterVolumeSpecName: "kube-api-access-4l9vd") pod "1c4a31c7-8336-4fc3-b5f9-7614e75b3c65" (UID: "1c4a31c7-8336-4fc3-b5f9-7614e75b3c65"). InnerVolumeSpecName "kube-api-access-4l9vd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.717357 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l9vd\" (UniqueName: \"kubernetes.io/projected/1c4a31c7-8336-4fc3-b5f9-7614e75b3c65-kube-api-access-4l9vd\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:28 crc kubenswrapper[4878]: I1202 18:36:28.819394 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 18:36:28 crc kubenswrapper[4878]: E1202 18:36:28.819614 4878 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 18:36:28 crc kubenswrapper[4878]: E1202 18:36:28.819649 4878 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 18:36:28 crc kubenswrapper[4878]: E1202 18:36:28.819718 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift podName:5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f nodeName:}" failed. No retries permitted until 2025-12-02 18:36:30.819699177 +0000 UTC m=+1300.509318058 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift") pod "swift-storage-0" (UID: "5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f") : configmap "swift-ring-files" not found Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.122347 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-wkmpm"] Dec 02 18:36:29 crc kubenswrapper[4878]: E1202 18:36:29.123231 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18318a2e-0ad8-49cc-bb0b-30a5fbc7604c" containerName="mariadb-account-create-update" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.123297 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="18318a2e-0ad8-49cc-bb0b-30a5fbc7604c" containerName="mariadb-account-create-update" Dec 02 18:36:29 crc kubenswrapper[4878]: E1202 18:36:29.123400 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c4a31c7-8336-4fc3-b5f9-7614e75b3c65" containerName="mariadb-database-create" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.123421 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c4a31c7-8336-4fc3-b5f9-7614e75b3c65" containerName="mariadb-database-create" Dec 02 18:36:29 crc kubenswrapper[4878]: E1202 18:36:29.123490 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84a2e351-13b3-47a9-b084-b0aa69b245ca" containerName="mariadb-account-create-update" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.123507 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="84a2e351-13b3-47a9-b084-b0aa69b245ca" containerName="mariadb-account-create-update" Dec 02 18:36:29 crc kubenswrapper[4878]: E1202 18:36:29.123588 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ed7121-1f6d-4f72-81a7-a25c9f98304b" containerName="mariadb-database-create" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.123621 4878 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f3ed7121-1f6d-4f72-81a7-a25c9f98304b" containerName="mariadb-database-create" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.124290 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="18318a2e-0ad8-49cc-bb0b-30a5fbc7604c" containerName="mariadb-account-create-update" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.124351 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="84a2e351-13b3-47a9-b084-b0aa69b245ca" containerName="mariadb-account-create-update" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.124368 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ed7121-1f6d-4f72-81a7-a25c9f98304b" containerName="mariadb-database-create" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.124385 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c4a31c7-8336-4fc3-b5f9-7614e75b3c65" containerName="mariadb-database-create" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.125984 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wkmpm" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.128817 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.128828 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-h4ktf" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.168660 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wkmpm"] Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.231537 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-combined-ca-bundle\") pod \"glance-db-sync-wkmpm\" (UID: \"9c0ade7f-399e-4b2f-a250-5f6a47e90baf\") " pod="openstack/glance-db-sync-wkmpm" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.236760 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-db-sync-config-data\") pod \"glance-db-sync-wkmpm\" (UID: \"9c0ade7f-399e-4b2f-a250-5f6a47e90baf\") " pod="openstack/glance-db-sync-wkmpm" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.237045 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8twdm\" (UniqueName: \"kubernetes.io/projected/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-kube-api-access-8twdm\") pod \"glance-db-sync-wkmpm\" (UID: \"9c0ade7f-399e-4b2f-a250-5f6a47e90baf\") " pod="openstack/glance-db-sync-wkmpm" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.237297 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-config-data\") pod \"glance-db-sync-wkmpm\" (UID: \"9c0ade7f-399e-4b2f-a250-5f6a47e90baf\") " pod="openstack/glance-db-sync-wkmpm" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.340134 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-config-data\") pod \"glance-db-sync-wkmpm\" (UID: \"9c0ade7f-399e-4b2f-a250-5f6a47e90baf\") " pod="openstack/glance-db-sync-wkmpm" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.340320 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-combined-ca-bundle\") pod \"glance-db-sync-wkmpm\" (UID: \"9c0ade7f-399e-4b2f-a250-5f6a47e90baf\") " pod="openstack/glance-db-sync-wkmpm" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.340402 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-db-sync-config-data\") pod \"glance-db-sync-wkmpm\" (UID: \"9c0ade7f-399e-4b2f-a250-5f6a47e90baf\") " pod="openstack/glance-db-sync-wkmpm" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.340465 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8twdm\" (UniqueName: \"kubernetes.io/projected/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-kube-api-access-8twdm\") pod \"glance-db-sync-wkmpm\" (UID: \"9c0ade7f-399e-4b2f-a250-5f6a47e90baf\") " pod="openstack/glance-db-sync-wkmpm" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.352862 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-combined-ca-bundle\") pod \"glance-db-sync-wkmpm\" (UID: 
\"9c0ade7f-399e-4b2f-a250-5f6a47e90baf\") " pod="openstack/glance-db-sync-wkmpm" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.352977 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-db-sync-config-data\") pod \"glance-db-sync-wkmpm\" (UID: \"9c0ade7f-399e-4b2f-a250-5f6a47e90baf\") " pod="openstack/glance-db-sync-wkmpm" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.360093 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dkzxj" event={"ID":"1c4a31c7-8336-4fc3-b5f9-7614e75b3c65","Type":"ContainerDied","Data":"6beb24861e1afd17b567908a0663281c13f144f56629cc7f5f7cf0ea6d290927"} Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.360178 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6beb24861e1afd17b567908a0663281c13f144f56629cc7f5f7cf0ea6d290927" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.360281 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-dkzxj" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.364881 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-config-data\") pod \"glance-db-sync-wkmpm\" (UID: \"9c0ade7f-399e-4b2f-a250-5f6a47e90baf\") " pod="openstack/glance-db-sync-wkmpm" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.366512 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8twdm\" (UniqueName: \"kubernetes.io/projected/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-kube-api-access-8twdm\") pod \"glance-db-sync-wkmpm\" (UID: \"9c0ade7f-399e-4b2f-a250-5f6a47e90baf\") " pod="openstack/glance-db-sync-wkmpm" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.374630 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jdgcc" event={"ID":"d37e6b48-32b3-4471-aa2e-4893d1a7c329","Type":"ContainerStarted","Data":"2055b89dc9cbea43acc6ba991b937ee221b33c2ada58b5a3e0dc3d18f0d16827"} Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.374739 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.437724 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-jdgcc" podStartSLOduration=4.437695847 podStartE2EDuration="4.437695847s" podCreationTimestamp="2025-12-02 18:36:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:36:29.42487924 +0000 UTC m=+1299.114498121" watchObservedRunningTime="2025-12-02 18:36:29.437695847 +0000 UTC m=+1299.127314728" Dec 02 18:36:29 crc kubenswrapper[4878]: I1202 18:36:29.474900 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wkmpm" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.033709 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-p6mn2" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.183115 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsw4z\" (UniqueName: \"kubernetes.io/projected/aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae-kube-api-access-bsw4z\") pod \"aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae\" (UID: \"aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae\") " Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.183214 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae-operator-scripts\") pod \"aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae\" (UID: \"aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae\") " Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.184184 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae" (UID: "aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.193179 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae-kube-api-access-bsw4z" (OuterVolumeSpecName: "kube-api-access-bsw4z") pod "aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae" (UID: "aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae"). InnerVolumeSpecName "kube-api-access-bsw4z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.286081 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsw4z\" (UniqueName: \"kubernetes.io/projected/aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae-kube-api-access-bsw4z\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.286124 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.394561 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-p6mn2" event={"ID":"aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae","Type":"ContainerDied","Data":"dc43c2aecafecfae30fe863d333ca18a5536ebad4e8e611b7567736e1e20a441"} Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.394640 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc43c2aecafecfae30fe863d333ca18a5536ebad4e8e611b7567736e1e20a441" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.394599 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-p6mn2" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.525408 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wkmpm"] Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.563683 4878 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.581539 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-397e-account-create-update-88k7b" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.697207 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2nfw\" (UniqueName: \"kubernetes.io/projected/f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8-kube-api-access-t2nfw\") pod \"f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8\" (UID: \"f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8\") " Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.697511 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8-operator-scripts\") pod \"f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8\" (UID: \"f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8\") " Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.699552 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8" (UID: "f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.711665 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8-kube-api-access-t2nfw" (OuterVolumeSpecName: "kube-api-access-t2nfw") pod "f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8" (UID: "f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8"). InnerVolumeSpecName "kube-api-access-t2nfw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.803054 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.803091 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2nfw\" (UniqueName: \"kubernetes.io/projected/f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8-kube-api-access-t2nfw\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.803936 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-rfcql"] Dec 02 18:36:30 crc kubenswrapper[4878]: E1202 18:36:30.804587 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8" containerName="mariadb-account-create-update" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.804613 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8" containerName="mariadb-account-create-update" Dec 02 18:36:30 crc kubenswrapper[4878]: E1202 18:36:30.804636 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae" containerName="mariadb-database-create" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.804644 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae" containerName="mariadb-database-create" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.804924 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8" containerName="mariadb-account-create-update" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.804965 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae" 
containerName="mariadb-database-create" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.812549 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.817799 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.817952 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.818325 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.828710 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-rfcql"] Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.904643 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-combined-ca-bundle\") pod \"swift-ring-rebalance-rfcql\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.904784 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-etc-swift\") pod \"swift-ring-rebalance-rfcql\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.904913 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-swiftconf\") pod \"swift-ring-rebalance-rfcql\" (UID: 
\"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.904968 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-ring-data-devices\") pod \"swift-ring-rebalance-rfcql\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.905836 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-scripts\") pod \"swift-ring-rebalance-rfcql\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.905934 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbvbf\" (UniqueName: \"kubernetes.io/projected/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-kube-api-access-nbvbf\") pod \"swift-ring-rebalance-rfcql\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.906061 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 18:36:30 crc kubenswrapper[4878]: E1202 18:36:30.906390 4878 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 18:36:30 crc kubenswrapper[4878]: E1202 18:36:30.906484 4878 projected.go:194] Error preparing data for projected volume etc-swift for pod 
openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 18:36:30 crc kubenswrapper[4878]: I1202 18:36:30.906530 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-dispersionconf\") pod \"swift-ring-rebalance-rfcql\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:30 crc kubenswrapper[4878]: E1202 18:36:30.906658 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift podName:5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f nodeName:}" failed. No retries permitted until 2025-12-02 18:36:34.906631914 +0000 UTC m=+1304.596250875 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift") pod "swift-storage-0" (UID: "5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f") : configmap "swift-ring-files" not found Dec 02 18:36:31 crc kubenswrapper[4878]: I1202 18:36:31.008589 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-dispersionconf\") pod \"swift-ring-rebalance-rfcql\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:31 crc kubenswrapper[4878]: I1202 18:36:31.009003 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-combined-ca-bundle\") pod \"swift-ring-rebalance-rfcql\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:31 crc kubenswrapper[4878]: I1202 18:36:31.009054 4878 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-etc-swift\") pod \"swift-ring-rebalance-rfcql\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:31 crc kubenswrapper[4878]: I1202 18:36:31.009161 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-swiftconf\") pod \"swift-ring-rebalance-rfcql\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:31 crc kubenswrapper[4878]: I1202 18:36:31.009266 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-ring-data-devices\") pod \"swift-ring-rebalance-rfcql\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:31 crc kubenswrapper[4878]: I1202 18:36:31.009368 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-scripts\") pod \"swift-ring-rebalance-rfcql\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:31 crc kubenswrapper[4878]: I1202 18:36:31.009415 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbvbf\" (UniqueName: \"kubernetes.io/projected/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-kube-api-access-nbvbf\") pod \"swift-ring-rebalance-rfcql\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:31 crc kubenswrapper[4878]: I1202 18:36:31.010146 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-etc-swift\") pod \"swift-ring-rebalance-rfcql\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:31 crc kubenswrapper[4878]: I1202 18:36:31.010345 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-ring-data-devices\") pod \"swift-ring-rebalance-rfcql\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:31 crc kubenswrapper[4878]: I1202 18:36:31.010586 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-scripts\") pod \"swift-ring-rebalance-rfcql\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:31 crc kubenswrapper[4878]: I1202 18:36:31.016207 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-swiftconf\") pod \"swift-ring-rebalance-rfcql\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:31 crc kubenswrapper[4878]: I1202 18:36:31.017627 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-combined-ca-bundle\") pod \"swift-ring-rebalance-rfcql\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:31 crc kubenswrapper[4878]: I1202 18:36:31.021331 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-dispersionconf\") pod \"swift-ring-rebalance-rfcql\" (UID: 
\"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:31 crc kubenswrapper[4878]: I1202 18:36:31.030991 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbvbf\" (UniqueName: \"kubernetes.io/projected/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-kube-api-access-nbvbf\") pod \"swift-ring-rebalance-rfcql\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:31 crc kubenswrapper[4878]: I1202 18:36:31.149407 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:36:31 crc kubenswrapper[4878]: I1202 18:36:31.448504 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-397e-account-create-update-88k7b" event={"ID":"f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8","Type":"ContainerDied","Data":"6d886665ccee24f88e4d7f8ae19ace5a40b439f8b69646ffc6f0b690af92e9e7"} Dec 02 18:36:31 crc kubenswrapper[4878]: I1202 18:36:31.448568 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d886665ccee24f88e4d7f8ae19ace5a40b439f8b69646ffc6f0b690af92e9e7" Dec 02 18:36:31 crc kubenswrapper[4878]: I1202 18:36:31.448573 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-397e-account-create-update-88k7b" Dec 02 18:36:31 crc kubenswrapper[4878]: I1202 18:36:31.450607 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wkmpm" event={"ID":"9c0ade7f-399e-4b2f-a250-5f6a47e90baf","Type":"ContainerStarted","Data":"51d1f619ed5d20c72019b5bc1564fa59e29ca01859f42592248b0c02ac6ec962"} Dec 02 18:36:31 crc kubenswrapper[4878]: I1202 18:36:31.784425 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-rfcql"] Dec 02 18:36:34 crc kubenswrapper[4878]: I1202 18:36:34.979392 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 18:36:34 crc kubenswrapper[4878]: E1202 18:36:34.980529 4878 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 18:36:34 crc kubenswrapper[4878]: E1202 18:36:34.980564 4878 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 18:36:34 crc kubenswrapper[4878]: E1202 18:36:34.980639 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift podName:5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f nodeName:}" failed. No retries permitted until 2025-12-02 18:36:42.980618078 +0000 UTC m=+1312.670236959 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift") pod "swift-storage-0" (UID: "5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f") : configmap "swift-ring-files" not found Dec 02 18:36:35 crc kubenswrapper[4878]: I1202 18:36:35.504084 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 02 18:36:35 crc kubenswrapper[4878]: I1202 18:36:35.851060 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-ll544"] Dec 02 18:36:35 crc kubenswrapper[4878]: I1202 18:36:35.852699 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ll544" Dec 02 18:36:35 crc kubenswrapper[4878]: I1202 18:36:35.887137 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:36:35 crc kubenswrapper[4878]: I1202 18:36:35.892141 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-ll544"] Dec 02 18:36:35 crc kubenswrapper[4878]: I1202 18:36:35.908747 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.008772 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9feb9b1-2723-4414-bf7f-747c0295cb66-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-ll544\" (UID: \"b9feb9b1-2723-4414-bf7f-747c0295cb66\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-ll544" Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.008980 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwvjx\" (UniqueName: 
\"kubernetes.io/projected/b9feb9b1-2723-4414-bf7f-747c0295cb66-kube-api-access-cwvjx\") pod \"mysqld-exporter-openstack-cell1-db-create-ll544\" (UID: \"b9feb9b1-2723-4414-bf7f-747c0295cb66\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-ll544" Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.015881 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2mkl9"] Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.016175 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" podUID="13c6db70-cc96-462a-a27c-496e1041fcbb" containerName="dnsmasq-dns" containerID="cri-o://46cd5b370e44d655f74e2cd3dcfa59274b3b98e5704238221da3bf0bcb342787" gracePeriod=10 Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.027530 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-e07d-account-create-update-7dnbk"] Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.029533 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-e07d-account-create-update-7dnbk" Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.040591 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.084809 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-e07d-account-create-update-7dnbk"] Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.111793 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwvjx\" (UniqueName: \"kubernetes.io/projected/b9feb9b1-2723-4414-bf7f-747c0295cb66-kube-api-access-cwvjx\") pod \"mysqld-exporter-openstack-cell1-db-create-ll544\" (UID: \"b9feb9b1-2723-4414-bf7f-747c0295cb66\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-ll544" Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.111942 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9feb9b1-2723-4414-bf7f-747c0295cb66-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-ll544\" (UID: \"b9feb9b1-2723-4414-bf7f-747c0295cb66\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-ll544" Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.114127 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9feb9b1-2723-4414-bf7f-747c0295cb66-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-ll544\" (UID: \"b9feb9b1-2723-4414-bf7f-747c0295cb66\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-ll544" Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.151084 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwvjx\" (UniqueName: 
\"kubernetes.io/projected/b9feb9b1-2723-4414-bf7f-747c0295cb66-kube-api-access-cwvjx\") pod \"mysqld-exporter-openstack-cell1-db-create-ll544\" (UID: \"b9feb9b1-2723-4414-bf7f-747c0295cb66\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-ll544" Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.178912 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ll544" Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.214335 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7d3c516-e91a-49fc-b75a-f425cd4ccd5d-operator-scripts\") pod \"mysqld-exporter-e07d-account-create-update-7dnbk\" (UID: \"b7d3c516-e91a-49fc-b75a-f425cd4ccd5d\") " pod="openstack/mysqld-exporter-e07d-account-create-update-7dnbk" Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.214491 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdlr8\" (UniqueName: \"kubernetes.io/projected/b7d3c516-e91a-49fc-b75a-f425cd4ccd5d-kube-api-access-tdlr8\") pod \"mysqld-exporter-e07d-account-create-update-7dnbk\" (UID: \"b7d3c516-e91a-49fc-b75a-f425cd4ccd5d\") " pod="openstack/mysqld-exporter-e07d-account-create-update-7dnbk" Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.316334 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdlr8\" (UniqueName: \"kubernetes.io/projected/b7d3c516-e91a-49fc-b75a-f425cd4ccd5d-kube-api-access-tdlr8\") pod \"mysqld-exporter-e07d-account-create-update-7dnbk\" (UID: \"b7d3c516-e91a-49fc-b75a-f425cd4ccd5d\") " pod="openstack/mysqld-exporter-e07d-account-create-update-7dnbk" Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.316520 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b7d3c516-e91a-49fc-b75a-f425cd4ccd5d-operator-scripts\") pod \"mysqld-exporter-e07d-account-create-update-7dnbk\" (UID: \"b7d3c516-e91a-49fc-b75a-f425cd4ccd5d\") " pod="openstack/mysqld-exporter-e07d-account-create-update-7dnbk" Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.317388 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7d3c516-e91a-49fc-b75a-f425cd4ccd5d-operator-scripts\") pod \"mysqld-exporter-e07d-account-create-update-7dnbk\" (UID: \"b7d3c516-e91a-49fc-b75a-f425cd4ccd5d\") " pod="openstack/mysqld-exporter-e07d-account-create-update-7dnbk" Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.352257 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdlr8\" (UniqueName: \"kubernetes.io/projected/b7d3c516-e91a-49fc-b75a-f425cd4ccd5d-kube-api-access-tdlr8\") pod \"mysqld-exporter-e07d-account-create-update-7dnbk\" (UID: \"b7d3c516-e91a-49fc-b75a-f425cd4ccd5d\") " pod="openstack/mysqld-exporter-e07d-account-create-update-7dnbk" Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.379933 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-e07d-account-create-update-7dnbk" Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.529548 4878 generic.go:334] "Generic (PLEG): container finished" podID="13c6db70-cc96-462a-a27c-496e1041fcbb" containerID="46cd5b370e44d655f74e2cd3dcfa59274b3b98e5704238221da3bf0bcb342787" exitCode=0 Dec 02 18:36:36 crc kubenswrapper[4878]: I1202 18:36:36.529616 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" event={"ID":"13c6db70-cc96-462a-a27c-496e1041fcbb","Type":"ContainerDied","Data":"46cd5b370e44d655f74e2cd3dcfa59274b3b98e5704238221da3bf0bcb342787"} Dec 02 18:36:37 crc kubenswrapper[4878]: W1202 18:36:37.439531 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e84c0f0_1e32_4c9b_b21d_f49bb06863fc.slice/crio-3a8a8fe6ebdd838ec2dc279a23bc3a5c2cb95eac276ded8b6de1b92d3cce2c17 WatchSource:0}: Error finding container 3a8a8fe6ebdd838ec2dc279a23bc3a5c2cb95eac276ded8b6de1b92d3cce2c17: Status 404 returned error can't find the container with id 3a8a8fe6ebdd838ec2dc279a23bc3a5c2cb95eac276ded8b6de1b92d3cce2c17 Dec 02 18:36:37 crc kubenswrapper[4878]: I1202 18:36:37.567383 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rfcql" event={"ID":"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc","Type":"ContainerStarted","Data":"3a8a8fe6ebdd838ec2dc279a23bc3a5c2cb95eac276ded8b6de1b92d3cce2c17"} Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.029982 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.163585 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-ovsdbserver-nb\") pod \"13c6db70-cc96-462a-a27c-496e1041fcbb\" (UID: \"13c6db70-cc96-462a-a27c-496e1041fcbb\") " Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.163985 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-config\") pod \"13c6db70-cc96-462a-a27c-496e1041fcbb\" (UID: \"13c6db70-cc96-462a-a27c-496e1041fcbb\") " Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.164077 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2zcj\" (UniqueName: \"kubernetes.io/projected/13c6db70-cc96-462a-a27c-496e1041fcbb-kube-api-access-n2zcj\") pod \"13c6db70-cc96-462a-a27c-496e1041fcbb\" (UID: \"13c6db70-cc96-462a-a27c-496e1041fcbb\") " Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.164139 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-dns-svc\") pod \"13c6db70-cc96-462a-a27c-496e1041fcbb\" (UID: \"13c6db70-cc96-462a-a27c-496e1041fcbb\") " Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.164195 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-ovsdbserver-sb\") pod \"13c6db70-cc96-462a-a27c-496e1041fcbb\" (UID: \"13c6db70-cc96-462a-a27c-496e1041fcbb\") " Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.172062 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/13c6db70-cc96-462a-a27c-496e1041fcbb-kube-api-access-n2zcj" (OuterVolumeSpecName: "kube-api-access-n2zcj") pod "13c6db70-cc96-462a-a27c-496e1041fcbb" (UID: "13c6db70-cc96-462a-a27c-496e1041fcbb"). InnerVolumeSpecName "kube-api-access-n2zcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.237251 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "13c6db70-cc96-462a-a27c-496e1041fcbb" (UID: "13c6db70-cc96-462a-a27c-496e1041fcbb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.267889 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2zcj\" (UniqueName: \"kubernetes.io/projected/13c6db70-cc96-462a-a27c-496e1041fcbb-kube-api-access-n2zcj\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.267951 4878 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.269687 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-ll544"] Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.291130 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-config" (OuterVolumeSpecName: "config") pod "13c6db70-cc96-462a-a27c-496e1041fcbb" (UID: "13c6db70-cc96-462a-a27c-496e1041fcbb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.299886 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "13c6db70-cc96-462a-a27c-496e1041fcbb" (UID: "13c6db70-cc96-462a-a27c-496e1041fcbb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.303956 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "13c6db70-cc96-462a-a27c-496e1041fcbb" (UID: "13c6db70-cc96-462a-a27c-496e1041fcbb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.374897 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.375023 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.375036 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13c6db70-cc96-462a-a27c-496e1041fcbb-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.386842 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-e07d-account-create-update-7dnbk"] Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.606645 4878 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" event={"ID":"13c6db70-cc96-462a-a27c-496e1041fcbb","Type":"ContainerDied","Data":"c620c3ebefe5d7e8da71bc4ae65301e1fc0da4d04116a446481e9b15da75302d"} Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.606726 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-2mkl9" Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.607205 4878 scope.go:117] "RemoveContainer" containerID="46cd5b370e44d655f74e2cd3dcfa59274b3b98e5704238221da3bf0bcb342787" Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.622356 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ll544" event={"ID":"b9feb9b1-2723-4414-bf7f-747c0295cb66","Type":"ContainerStarted","Data":"2468e5c8bef4009da1bea5b9cbbcf727e811fa274b6ecb002eb85a41ac4bfe88"} Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.624858 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-e07d-account-create-update-7dnbk" event={"ID":"b7d3c516-e91a-49fc-b75a-f425cd4ccd5d","Type":"ContainerStarted","Data":"66f24204353dbe3e3c2fffd251715642bb382360138065fa7b81a301580888c7"} Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.628811 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80ade876-344b-415c-9609-6477205860c9","Type":"ContainerStarted","Data":"ddf160d59512ed95e7f7a3a1ff431f3e8a032dd8ea01919e40dc655a958d69be"} Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.651425 4878 scope.go:117] "RemoveContainer" containerID="e402933f38a0cc3761a66c5fa54c718c2d30cc83ab77a8637a5ac0337fa6bb44" Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.656166 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2mkl9"] Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.668415 4878 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2mkl9"] Dec 02 18:36:38 crc kubenswrapper[4878]: I1202 18:36:38.955583 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13c6db70-cc96-462a-a27c-496e1041fcbb" path="/var/lib/kubelet/pods/13c6db70-cc96-462a-a27c-496e1041fcbb/volumes" Dec 02 18:36:39 crc kubenswrapper[4878]: I1202 18:36:39.669834 4878 generic.go:334] "Generic (PLEG): container finished" podID="b9feb9b1-2723-4414-bf7f-747c0295cb66" containerID="9fd93089944b1a43ccbdceaa1e6df6f656fccf71c48ad56acd5cd46040692b7a" exitCode=0 Dec 02 18:36:39 crc kubenswrapper[4878]: I1202 18:36:39.670027 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ll544" event={"ID":"b9feb9b1-2723-4414-bf7f-747c0295cb66","Type":"ContainerDied","Data":"9fd93089944b1a43ccbdceaa1e6df6f656fccf71c48ad56acd5cd46040692b7a"} Dec 02 18:36:39 crc kubenswrapper[4878]: I1202 18:36:39.676004 4878 generic.go:334] "Generic (PLEG): container finished" podID="b7d3c516-e91a-49fc-b75a-f425cd4ccd5d" containerID="c791e1705ad64a67a903f613f98c4720627ce60d7238181c34c86578c4fca8e8" exitCode=0 Dec 02 18:36:39 crc kubenswrapper[4878]: I1202 18:36:39.676063 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-e07d-account-create-update-7dnbk" event={"ID":"b7d3c516-e91a-49fc-b75a-f425cd4ccd5d","Type":"ContainerDied","Data":"c791e1705ad64a67a903f613f98c4720627ce60d7238181c34c86578c4fca8e8"} Dec 02 18:36:39 crc kubenswrapper[4878]: I1202 18:36:39.901687 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:36:40 crc kubenswrapper[4878]: I1202 18:36:40.372450 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 02 18:36:41 crc kubenswrapper[4878]: I1202 18:36:41.705229 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"80ade876-344b-415c-9609-6477205860c9","Type":"ContainerStarted","Data":"cccfecfab7b741247d179bf0e549c38f82cfb21ac8a57ef168146ff87c1823b5"} Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.425010 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-srxml"] Dec 02 18:36:42 crc kubenswrapper[4878]: E1202 18:36:42.425570 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c6db70-cc96-462a-a27c-496e1041fcbb" containerName="dnsmasq-dns" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.425596 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c6db70-cc96-462a-a27c-496e1041fcbb" containerName="dnsmasq-dns" Dec 02 18:36:42 crc kubenswrapper[4878]: E1202 18:36:42.425622 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c6db70-cc96-462a-a27c-496e1041fcbb" containerName="init" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.425630 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c6db70-cc96-462a-a27c-496e1041fcbb" containerName="init" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.425897 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c6db70-cc96-462a-a27c-496e1041fcbb" containerName="dnsmasq-dns" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.426936 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-srxml" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.485700 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-srxml"] Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.512295 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/391eccae-595f-4859-b88b-933307305613-operator-scripts\") pod \"barbican-db-create-srxml\" (UID: \"391eccae-595f-4859-b88b-933307305613\") " pod="openstack/barbican-db-create-srxml" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.512611 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4zl9\" (UniqueName: \"kubernetes.io/projected/391eccae-595f-4859-b88b-933307305613-kube-api-access-b4zl9\") pod \"barbican-db-create-srxml\" (UID: \"391eccae-595f-4859-b88b-933307305613\") " pod="openstack/barbican-db-create-srxml" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.617549 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4zl9\" (UniqueName: \"kubernetes.io/projected/391eccae-595f-4859-b88b-933307305613-kube-api-access-b4zl9\") pod \"barbican-db-create-srxml\" (UID: \"391eccae-595f-4859-b88b-933307305613\") " pod="openstack/barbican-db-create-srxml" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.617745 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/391eccae-595f-4859-b88b-933307305613-operator-scripts\") pod \"barbican-db-create-srxml\" (UID: \"391eccae-595f-4859-b88b-933307305613\") " pod="openstack/barbican-db-create-srxml" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.618647 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/391eccae-595f-4859-b88b-933307305613-operator-scripts\") pod \"barbican-db-create-srxml\" (UID: \"391eccae-595f-4859-b88b-933307305613\") " pod="openstack/barbican-db-create-srxml" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.652487 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-7e48-account-create-update-d5z4q"] Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.654400 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-7e48-account-create-update-d5z4q" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.659189 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4zl9\" (UniqueName: \"kubernetes.io/projected/391eccae-595f-4859-b88b-933307305613-kube-api-access-b4zl9\") pod \"barbican-db-create-srxml\" (UID: \"391eccae-595f-4859-b88b-933307305613\") " pod="openstack/barbican-db-create-srxml" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.661742 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.737891 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-7e48-account-create-update-d5z4q"] Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.753896 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-srxml" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.761572 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-hldwm"] Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.763267 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-hldwm" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.811127 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hldwm"] Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.833025 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9669f740-c9d3-4c55-866d-a44471c3aa1c-operator-scripts\") pod \"barbican-7e48-account-create-update-d5z4q\" (UID: \"9669f740-c9d3-4c55-866d-a44471c3aa1c\") " pod="openstack/barbican-7e48-account-create-update-d5z4q" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.833286 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcmwr\" (UniqueName: \"kubernetes.io/projected/9669f740-c9d3-4c55-866d-a44471c3aa1c-kube-api-access-rcmwr\") pod \"barbican-7e48-account-create-update-d5z4q\" (UID: \"9669f740-c9d3-4c55-866d-a44471c3aa1c\") " pod="openstack/barbican-7e48-account-create-update-d5z4q" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.836924 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a601-account-create-update-2kkrr"] Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.838729 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a601-account-create-update-2kkrr" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.848909 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.860060 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a601-account-create-update-2kkrr"] Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.896850 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-6dj4k"] Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.900035 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-6dj4k" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.907274 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-6dj4k"] Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.918443 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-92nsw"] Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.920059 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-92nsw" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.923415 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.923736 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.923870 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.924024 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jjqdr" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.938212 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90576306-b86d-4017-84fb-19c00f9630a0-operator-scripts\") pod \"cinder-a601-account-create-update-2kkrr\" (UID: \"90576306-b86d-4017-84fb-19c00f9630a0\") " pod="openstack/cinder-a601-account-create-update-2kkrr" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.938707 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcmwr\" (UniqueName: \"kubernetes.io/projected/9669f740-c9d3-4c55-866d-a44471c3aa1c-kube-api-access-rcmwr\") pod \"barbican-7e48-account-create-update-d5z4q\" (UID: \"9669f740-c9d3-4c55-866d-a44471c3aa1c\") " pod="openstack/barbican-7e48-account-create-update-d5z4q" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.938921 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swkk5\" (UniqueName: \"kubernetes.io/projected/628e53e1-08c1-49e9-8fec-b6e0547c9e6d-kube-api-access-swkk5\") pod \"cinder-db-create-hldwm\" (UID: \"628e53e1-08c1-49e9-8fec-b6e0547c9e6d\") " pod="openstack/cinder-db-create-hldwm" Dec 
02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.942952 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55n5h\" (UniqueName: \"kubernetes.io/projected/90576306-b86d-4017-84fb-19c00f9630a0-kube-api-access-55n5h\") pod \"cinder-a601-account-create-update-2kkrr\" (UID: \"90576306-b86d-4017-84fb-19c00f9630a0\") " pod="openstack/cinder-a601-account-create-update-2kkrr" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.944091 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9669f740-c9d3-4c55-866d-a44471c3aa1c-operator-scripts\") pod \"barbican-7e48-account-create-update-d5z4q\" (UID: \"9669f740-c9d3-4c55-866d-a44471c3aa1c\") " pod="openstack/barbican-7e48-account-create-update-d5z4q" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.944992 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9669f740-c9d3-4c55-866d-a44471c3aa1c-operator-scripts\") pod \"barbican-7e48-account-create-update-d5z4q\" (UID: \"9669f740-c9d3-4c55-866d-a44471c3aa1c\") " pod="openstack/barbican-7e48-account-create-update-d5z4q" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.945346 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/628e53e1-08c1-49e9-8fec-b6e0547c9e6d-operator-scripts\") pod \"cinder-db-create-hldwm\" (UID: \"628e53e1-08c1-49e9-8fec-b6e0547c9e6d\") " pod="openstack/cinder-db-create-hldwm" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.961874 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcmwr\" (UniqueName: \"kubernetes.io/projected/9669f740-c9d3-4c55-866d-a44471c3aa1c-kube-api-access-rcmwr\") pod \"barbican-7e48-account-create-update-d5z4q\" (UID: 
\"9669f740-c9d3-4c55-866d-a44471c3aa1c\") " pod="openstack/barbican-7e48-account-create-update-d5z4q" Dec 02 18:36:42 crc kubenswrapper[4878]: I1202 18:36:42.963962 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-92nsw"] Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.036757 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-ppjq4"] Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.038222 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ppjq4" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.051122 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9672009-e2c7-4540-91ea-737ef2418ac1-config-data\") pod \"keystone-db-sync-92nsw\" (UID: \"e9672009-e2c7-4540-91ea-737ef2418ac1\") " pod="openstack/keystone-db-sync-92nsw" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.051207 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.051255 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr4gb\" (UniqueName: \"kubernetes.io/projected/56f602d3-e640-4248-b53e-201b3556aa6f-kube-api-access-vr4gb\") pod \"heat-db-create-6dj4k\" (UID: \"56f602d3-e640-4248-b53e-201b3556aa6f\") " pod="openstack/heat-db-create-6dj4k" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.051289 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e9672009-e2c7-4540-91ea-737ef2418ac1-combined-ca-bundle\") pod \"keystone-db-sync-92nsw\" (UID: \"e9672009-e2c7-4540-91ea-737ef2418ac1\") " pod="openstack/keystone-db-sync-92nsw" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.051366 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swkk5\" (UniqueName: \"kubernetes.io/projected/628e53e1-08c1-49e9-8fec-b6e0547c9e6d-kube-api-access-swkk5\") pod \"cinder-db-create-hldwm\" (UID: \"628e53e1-08c1-49e9-8fec-b6e0547c9e6d\") " pod="openstack/cinder-db-create-hldwm" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.051402 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55n5h\" (UniqueName: \"kubernetes.io/projected/90576306-b86d-4017-84fb-19c00f9630a0-kube-api-access-55n5h\") pod \"cinder-a601-account-create-update-2kkrr\" (UID: \"90576306-b86d-4017-84fb-19c00f9630a0\") " pod="openstack/cinder-a601-account-create-update-2kkrr" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.051440 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhdk4\" (UniqueName: \"kubernetes.io/projected/36c2cda1-016a-4eca-9a7f-d4c2e43ccff1-kube-api-access-rhdk4\") pod \"neutron-db-create-ppjq4\" (UID: \"36c2cda1-016a-4eca-9a7f-d4c2e43ccff1\") " pod="openstack/neutron-db-create-ppjq4" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.052414 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltqn5\" (UniqueName: \"kubernetes.io/projected/e9672009-e2c7-4540-91ea-737ef2418ac1-kube-api-access-ltqn5\") pod \"keystone-db-sync-92nsw\" (UID: \"e9672009-e2c7-4540-91ea-737ef2418ac1\") " pod="openstack/keystone-db-sync-92nsw" Dec 02 18:36:43 crc kubenswrapper[4878]: E1202 18:36:43.052506 4878 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap 
"swift-ring-files" not found Dec 02 18:36:43 crc kubenswrapper[4878]: E1202 18:36:43.052523 4878 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 18:36:43 crc kubenswrapper[4878]: E1202 18:36:43.052571 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift podName:5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f nodeName:}" failed. No retries permitted until 2025-12-02 18:36:59.052555957 +0000 UTC m=+1328.742174838 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift") pod "swift-storage-0" (UID: "5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f") : configmap "swift-ring-files" not found Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.053416 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36c2cda1-016a-4eca-9a7f-d4c2e43ccff1-operator-scripts\") pod \"neutron-db-create-ppjq4\" (UID: \"36c2cda1-016a-4eca-9a7f-d4c2e43ccff1\") " pod="openstack/neutron-db-create-ppjq4" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.053476 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/628e53e1-08c1-49e9-8fec-b6e0547c9e6d-operator-scripts\") pod \"cinder-db-create-hldwm\" (UID: \"628e53e1-08c1-49e9-8fec-b6e0547c9e6d\") " pod="openstack/cinder-db-create-hldwm" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.053573 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90576306-b86d-4017-84fb-19c00f9630a0-operator-scripts\") pod \"cinder-a601-account-create-update-2kkrr\" (UID: 
\"90576306-b86d-4017-84fb-19c00f9630a0\") " pod="openstack/cinder-a601-account-create-update-2kkrr" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.053606 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56f602d3-e640-4248-b53e-201b3556aa6f-operator-scripts\") pod \"heat-db-create-6dj4k\" (UID: \"56f602d3-e640-4248-b53e-201b3556aa6f\") " pod="openstack/heat-db-create-6dj4k" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.055046 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-7e48-account-create-update-d5z4q" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.056169 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/628e53e1-08c1-49e9-8fec-b6e0547c9e6d-operator-scripts\") pod \"cinder-db-create-hldwm\" (UID: \"628e53e1-08c1-49e9-8fec-b6e0547c9e6d\") " pod="openstack/cinder-db-create-hldwm" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.056910 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90576306-b86d-4017-84fb-19c00f9630a0-operator-scripts\") pod \"cinder-a601-account-create-update-2kkrr\" (UID: \"90576306-b86d-4017-84fb-19c00f9630a0\") " pod="openstack/cinder-a601-account-create-update-2kkrr" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.076417 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-ppjq4"] Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.085918 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-b7e2-account-create-update-5hfz5"] Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.087922 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-b7e2-account-create-update-5hfz5" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.090021 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55n5h\" (UniqueName: \"kubernetes.io/projected/90576306-b86d-4017-84fb-19c00f9630a0-kube-api-access-55n5h\") pod \"cinder-a601-account-create-update-2kkrr\" (UID: \"90576306-b86d-4017-84fb-19c00f9630a0\") " pod="openstack/cinder-a601-account-create-update-2kkrr" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.093805 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.106822 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swkk5\" (UniqueName: \"kubernetes.io/projected/628e53e1-08c1-49e9-8fec-b6e0547c9e6d-kube-api-access-swkk5\") pod \"cinder-db-create-hldwm\" (UID: \"628e53e1-08c1-49e9-8fec-b6e0547c9e6d\") " pod="openstack/cinder-db-create-hldwm" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.123662 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-b7e2-account-create-update-5hfz5"] Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.155751 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56f602d3-e640-4248-b53e-201b3556aa6f-operator-scripts\") pod \"heat-db-create-6dj4k\" (UID: \"56f602d3-e640-4248-b53e-201b3556aa6f\") " pod="openstack/heat-db-create-6dj4k" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.155842 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9672009-e2c7-4540-91ea-737ef2418ac1-config-data\") pod \"keystone-db-sync-92nsw\" (UID: \"e9672009-e2c7-4540-91ea-737ef2418ac1\") " pod="openstack/keystone-db-sync-92nsw" Dec 02 18:36:43 crc kubenswrapper[4878]: 
I1202 18:36:43.155891 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cefde4cb-7350-4664-8f73-fabfecc591eb-operator-scripts\") pod \"heat-b7e2-account-create-update-5hfz5\" (UID: \"cefde4cb-7350-4664-8f73-fabfecc591eb\") " pod="openstack/heat-b7e2-account-create-update-5hfz5" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.155915 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr4gb\" (UniqueName: \"kubernetes.io/projected/56f602d3-e640-4248-b53e-201b3556aa6f-kube-api-access-vr4gb\") pod \"heat-db-create-6dj4k\" (UID: \"56f602d3-e640-4248-b53e-201b3556aa6f\") " pod="openstack/heat-db-create-6dj4k" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.155940 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9672009-e2c7-4540-91ea-737ef2418ac1-combined-ca-bundle\") pod \"keystone-db-sync-92nsw\" (UID: \"e9672009-e2c7-4540-91ea-737ef2418ac1\") " pod="openstack/keystone-db-sync-92nsw" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.155971 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdhdj\" (UniqueName: \"kubernetes.io/projected/cefde4cb-7350-4664-8f73-fabfecc591eb-kube-api-access-sdhdj\") pod \"heat-b7e2-account-create-update-5hfz5\" (UID: \"cefde4cb-7350-4664-8f73-fabfecc591eb\") " pod="openstack/heat-b7e2-account-create-update-5hfz5" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.156028 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhdk4\" (UniqueName: \"kubernetes.io/projected/36c2cda1-016a-4eca-9a7f-d4c2e43ccff1-kube-api-access-rhdk4\") pod \"neutron-db-create-ppjq4\" (UID: \"36c2cda1-016a-4eca-9a7f-d4c2e43ccff1\") " pod="openstack/neutron-db-create-ppjq4" Dec 02 
18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.156072 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltqn5\" (UniqueName: \"kubernetes.io/projected/e9672009-e2c7-4540-91ea-737ef2418ac1-kube-api-access-ltqn5\") pod \"keystone-db-sync-92nsw\" (UID: \"e9672009-e2c7-4540-91ea-737ef2418ac1\") " pod="openstack/keystone-db-sync-92nsw" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.156131 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36c2cda1-016a-4eca-9a7f-d4c2e43ccff1-operator-scripts\") pod \"neutron-db-create-ppjq4\" (UID: \"36c2cda1-016a-4eca-9a7f-d4c2e43ccff1\") " pod="openstack/neutron-db-create-ppjq4" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.157119 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36c2cda1-016a-4eca-9a7f-d4c2e43ccff1-operator-scripts\") pod \"neutron-db-create-ppjq4\" (UID: \"36c2cda1-016a-4eca-9a7f-d4c2e43ccff1\") " pod="openstack/neutron-db-create-ppjq4" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.157731 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56f602d3-e640-4248-b53e-201b3556aa6f-operator-scripts\") pod \"heat-db-create-6dj4k\" (UID: \"56f602d3-e640-4248-b53e-201b3556aa6f\") " pod="openstack/heat-db-create-6dj4k" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.161801 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b33b-account-create-update-5f77r"] Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.162725 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9672009-e2c7-4540-91ea-737ef2418ac1-combined-ca-bundle\") pod \"keystone-db-sync-92nsw\" (UID: 
\"e9672009-e2c7-4540-91ea-737ef2418ac1\") " pod="openstack/keystone-db-sync-92nsw" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.163452 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b33b-account-create-update-5f77r" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.163950 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9672009-e2c7-4540-91ea-737ef2418ac1-config-data\") pod \"keystone-db-sync-92nsw\" (UID: \"e9672009-e2c7-4540-91ea-737ef2418ac1\") " pod="openstack/keystone-db-sync-92nsw" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.167443 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.172095 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a601-account-create-update-2kkrr" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.206415 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhdk4\" (UniqueName: \"kubernetes.io/projected/36c2cda1-016a-4eca-9a7f-d4c2e43ccff1-kube-api-access-rhdk4\") pod \"neutron-db-create-ppjq4\" (UID: \"36c2cda1-016a-4eca-9a7f-d4c2e43ccff1\") " pod="openstack/neutron-db-create-ppjq4" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.211187 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr4gb\" (UniqueName: \"kubernetes.io/projected/56f602d3-e640-4248-b53e-201b3556aa6f-kube-api-access-vr4gb\") pod \"heat-db-create-6dj4k\" (UID: \"56f602d3-e640-4248-b53e-201b3556aa6f\") " pod="openstack/heat-db-create-6dj4k" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.217051 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltqn5\" (UniqueName: 
\"kubernetes.io/projected/e9672009-e2c7-4540-91ea-737ef2418ac1-kube-api-access-ltqn5\") pod \"keystone-db-sync-92nsw\" (UID: \"e9672009-e2c7-4540-91ea-737ef2418ac1\") " pod="openstack/keystone-db-sync-92nsw" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.217452 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-6dj4k" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.237669 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b33b-account-create-update-5f77r"] Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.242376 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-92nsw" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.259366 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdhdj\" (UniqueName: \"kubernetes.io/projected/cefde4cb-7350-4664-8f73-fabfecc591eb-kube-api-access-sdhdj\") pod \"heat-b7e2-account-create-update-5hfz5\" (UID: \"cefde4cb-7350-4664-8f73-fabfecc591eb\") " pod="openstack/heat-b7e2-account-create-update-5hfz5" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.262790 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng95q\" (UniqueName: \"kubernetes.io/projected/cae0102d-4eba-4ca3-915b-0156d65616fa-kube-api-access-ng95q\") pod \"neutron-b33b-account-create-update-5f77r\" (UID: \"cae0102d-4eba-4ca3-915b-0156d65616fa\") " pod="openstack/neutron-b33b-account-create-update-5f77r" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.262942 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae0102d-4eba-4ca3-915b-0156d65616fa-operator-scripts\") pod \"neutron-b33b-account-create-update-5f77r\" (UID: \"cae0102d-4eba-4ca3-915b-0156d65616fa\") " 
pod="openstack/neutron-b33b-account-create-update-5f77r" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.263565 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cefde4cb-7350-4664-8f73-fabfecc591eb-operator-scripts\") pod \"heat-b7e2-account-create-update-5hfz5\" (UID: \"cefde4cb-7350-4664-8f73-fabfecc591eb\") " pod="openstack/heat-b7e2-account-create-update-5hfz5" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.264579 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cefde4cb-7350-4664-8f73-fabfecc591eb-operator-scripts\") pod \"heat-b7e2-account-create-update-5hfz5\" (UID: \"cefde4cb-7350-4664-8f73-fabfecc591eb\") " pod="openstack/heat-b7e2-account-create-update-5hfz5" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.279001 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdhdj\" (UniqueName: \"kubernetes.io/projected/cefde4cb-7350-4664-8f73-fabfecc591eb-kube-api-access-sdhdj\") pod \"heat-b7e2-account-create-update-5hfz5\" (UID: \"cefde4cb-7350-4664-8f73-fabfecc591eb\") " pod="openstack/heat-b7e2-account-create-update-5hfz5" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.309804 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ll544" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.322530 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-e07d-account-create-update-7dnbk" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.365324 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdlr8\" (UniqueName: \"kubernetes.io/projected/b7d3c516-e91a-49fc-b75a-f425cd4ccd5d-kube-api-access-tdlr8\") pod \"b7d3c516-e91a-49fc-b75a-f425cd4ccd5d\" (UID: \"b7d3c516-e91a-49fc-b75a-f425cd4ccd5d\") " Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.365994 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7d3c516-e91a-49fc-b75a-f425cd4ccd5d-operator-scripts\") pod \"b7d3c516-e91a-49fc-b75a-f425cd4ccd5d\" (UID: \"b7d3c516-e91a-49fc-b75a-f425cd4ccd5d\") " Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.366075 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwvjx\" (UniqueName: \"kubernetes.io/projected/b9feb9b1-2723-4414-bf7f-747c0295cb66-kube-api-access-cwvjx\") pod \"b9feb9b1-2723-4414-bf7f-747c0295cb66\" (UID: \"b9feb9b1-2723-4414-bf7f-747c0295cb66\") " Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.366102 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9feb9b1-2723-4414-bf7f-747c0295cb66-operator-scripts\") pod \"b9feb9b1-2723-4414-bf7f-747c0295cb66\" (UID: \"b9feb9b1-2723-4414-bf7f-747c0295cb66\") " Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.366483 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng95q\" (UniqueName: \"kubernetes.io/projected/cae0102d-4eba-4ca3-915b-0156d65616fa-kube-api-access-ng95q\") pod \"neutron-b33b-account-create-update-5f77r\" (UID: \"cae0102d-4eba-4ca3-915b-0156d65616fa\") " pod="openstack/neutron-b33b-account-create-update-5f77r" Dec 02 18:36:43 crc 
kubenswrapper[4878]: I1202 18:36:43.366550 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae0102d-4eba-4ca3-915b-0156d65616fa-operator-scripts\") pod \"neutron-b33b-account-create-update-5f77r\" (UID: \"cae0102d-4eba-4ca3-915b-0156d65616fa\") " pod="openstack/neutron-b33b-account-create-update-5f77r" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.367568 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9feb9b1-2723-4414-bf7f-747c0295cb66-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b9feb9b1-2723-4414-bf7f-747c0295cb66" (UID: "b9feb9b1-2723-4414-bf7f-747c0295cb66"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.368103 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7d3c516-e91a-49fc-b75a-f425cd4ccd5d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b7d3c516-e91a-49fc-b75a-f425cd4ccd5d" (UID: "b7d3c516-e91a-49fc-b75a-f425cd4ccd5d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.370496 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9feb9b1-2723-4414-bf7f-747c0295cb66-kube-api-access-cwvjx" (OuterVolumeSpecName: "kube-api-access-cwvjx") pod "b9feb9b1-2723-4414-bf7f-747c0295cb66" (UID: "b9feb9b1-2723-4414-bf7f-747c0295cb66"). InnerVolumeSpecName "kube-api-access-cwvjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.370776 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7d3c516-e91a-49fc-b75a-f425cd4ccd5d-kube-api-access-tdlr8" (OuterVolumeSpecName: "kube-api-access-tdlr8") pod "b7d3c516-e91a-49fc-b75a-f425cd4ccd5d" (UID: "b7d3c516-e91a-49fc-b75a-f425cd4ccd5d"). InnerVolumeSpecName "kube-api-access-tdlr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.372797 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ppjq4" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.375915 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae0102d-4eba-4ca3-915b-0156d65616fa-operator-scripts\") pod \"neutron-b33b-account-create-update-5f77r\" (UID: \"cae0102d-4eba-4ca3-915b-0156d65616fa\") " pod="openstack/neutron-b33b-account-create-update-5f77r" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.391064 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng95q\" (UniqueName: \"kubernetes.io/projected/cae0102d-4eba-4ca3-915b-0156d65616fa-kube-api-access-ng95q\") pod \"neutron-b33b-account-create-update-5f77r\" (UID: \"cae0102d-4eba-4ca3-915b-0156d65616fa\") " pod="openstack/neutron-b33b-account-create-update-5f77r" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.396573 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-hldwm" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.469654 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7d3c516-e91a-49fc-b75a-f425cd4ccd5d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.469685 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwvjx\" (UniqueName: \"kubernetes.io/projected/b9feb9b1-2723-4414-bf7f-747c0295cb66-kube-api-access-cwvjx\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.469697 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9feb9b1-2723-4414-bf7f-747c0295cb66-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.469709 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdlr8\" (UniqueName: \"kubernetes.io/projected/b7d3c516-e91a-49fc-b75a-f425cd4ccd5d-kube-api-access-tdlr8\") on node \"crc\" DevicePath \"\"" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.546383 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-b7e2-account-create-update-5hfz5" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.561153 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b33b-account-create-update-5f77r" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.751292 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-e07d-account-create-update-7dnbk" event={"ID":"b7d3c516-e91a-49fc-b75a-f425cd4ccd5d","Type":"ContainerDied","Data":"66f24204353dbe3e3c2fffd251715642bb382360138065fa7b81a301580888c7"} Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.751418 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66f24204353dbe3e3c2fffd251715642bb382360138065fa7b81a301580888c7" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.751532 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-e07d-account-create-update-7dnbk" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.754815 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ll544" event={"ID":"b9feb9b1-2723-4414-bf7f-747c0295cb66","Type":"ContainerDied","Data":"2468e5c8bef4009da1bea5b9cbbcf727e811fa274b6ecb002eb85a41ac4bfe88"} Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.754880 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2468e5c8bef4009da1bea5b9cbbcf727e811fa274b6ecb002eb85a41ac4bfe88" Dec 02 18:36:43 crc kubenswrapper[4878]: I1202 18:36:43.754998 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-ll544" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.346315 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-qgmlw" podUID="1f049ebe-547b-40a2-8468-932cfc5051ea" containerName="ovn-controller" probeResult="failure" output=< Dec 02 18:36:44 crc kubenswrapper[4878]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 02 18:36:44 crc kubenswrapper[4878]: > Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.430291 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.434545 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dsnc6" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.675506 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qgmlw-config-p57xc"] Dec 02 18:36:44 crc kubenswrapper[4878]: E1202 18:36:44.676584 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d3c516-e91a-49fc-b75a-f425cd4ccd5d" containerName="mariadb-account-create-update" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.676612 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d3c516-e91a-49fc-b75a-f425cd4ccd5d" containerName="mariadb-account-create-update" Dec 02 18:36:44 crc kubenswrapper[4878]: E1202 18:36:44.676674 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9feb9b1-2723-4414-bf7f-747c0295cb66" containerName="mariadb-database-create" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.676683 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9feb9b1-2723-4414-bf7f-747c0295cb66" containerName="mariadb-database-create" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.676974 4878 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="b7d3c516-e91a-49fc-b75a-f425cd4ccd5d" containerName="mariadb-account-create-update" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.676999 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9feb9b1-2723-4414-bf7f-747c0295cb66" containerName="mariadb-database-create" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.678219 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.691191 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qgmlw-config-p57xc"] Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.696642 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.727105 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-var-log-ovn\") pod \"ovn-controller-qgmlw-config-p57xc\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.727586 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-scripts\") pod \"ovn-controller-qgmlw-config-p57xc\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.727709 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-var-run\") pod \"ovn-controller-qgmlw-config-p57xc\" (UID: 
\"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.727740 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-additional-scripts\") pod \"ovn-controller-qgmlw-config-p57xc\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.727765 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-var-run-ovn\") pod \"ovn-controller-qgmlw-config-p57xc\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.727784 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nr6p\" (UniqueName: \"kubernetes.io/projected/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-kube-api-access-7nr6p\") pod \"ovn-controller-qgmlw-config-p57xc\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.830154 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-var-log-ovn\") pod \"ovn-controller-qgmlw-config-p57xc\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.830406 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-scripts\") 
pod \"ovn-controller-qgmlw-config-p57xc\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.830451 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-var-run\") pod \"ovn-controller-qgmlw-config-p57xc\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.830471 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-additional-scripts\") pod \"ovn-controller-qgmlw-config-p57xc\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.830756 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-var-run\") pod \"ovn-controller-qgmlw-config-p57xc\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.830774 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-var-log-ovn\") pod \"ovn-controller-qgmlw-config-p57xc\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.830799 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-var-run-ovn\") pod \"ovn-controller-qgmlw-config-p57xc\" (UID: 
\"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.830819 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nr6p\" (UniqueName: \"kubernetes.io/projected/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-kube-api-access-7nr6p\") pod \"ovn-controller-qgmlw-config-p57xc\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.830885 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-var-run-ovn\") pod \"ovn-controller-qgmlw-config-p57xc\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.832333 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-additional-scripts\") pod \"ovn-controller-qgmlw-config-p57xc\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.832819 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-scripts\") pod \"ovn-controller-qgmlw-config-p57xc\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:36:44 crc kubenswrapper[4878]: I1202 18:36:44.849493 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nr6p\" (UniqueName: \"kubernetes.io/projected/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-kube-api-access-7nr6p\") pod \"ovn-controller-qgmlw-config-p57xc\" (UID: 
\"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:36:45 crc kubenswrapper[4878]: I1202 18:36:45.015211 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:36:46 crc kubenswrapper[4878]: I1202 18:36:46.199296 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Dec 02 18:36:46 crc kubenswrapper[4878]: I1202 18:36:46.201971 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 02 18:36:46 crc kubenswrapper[4878]: I1202 18:36:46.210596 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 02 18:36:46 crc kubenswrapper[4878]: I1202 18:36:46.220463 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Dec 02 18:36:46 crc kubenswrapper[4878]: I1202 18:36:46.268991 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5cf94af-3f03-4d2f-9f7b-bb91d0322c54-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"a5cf94af-3f03-4d2f-9f7b-bb91d0322c54\") " pod="openstack/mysqld-exporter-0" Dec 02 18:36:46 crc kubenswrapper[4878]: I1202 18:36:46.269502 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk6r5\" (UniqueName: \"kubernetes.io/projected/a5cf94af-3f03-4d2f-9f7b-bb91d0322c54-kube-api-access-wk6r5\") pod \"mysqld-exporter-0\" (UID: \"a5cf94af-3f03-4d2f-9f7b-bb91d0322c54\") " pod="openstack/mysqld-exporter-0" Dec 02 18:36:46 crc kubenswrapper[4878]: I1202 18:36:46.269714 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5cf94af-3f03-4d2f-9f7b-bb91d0322c54-config-data\") pod 
\"mysqld-exporter-0\" (UID: \"a5cf94af-3f03-4d2f-9f7b-bb91d0322c54\") " pod="openstack/mysqld-exporter-0" Dec 02 18:36:46 crc kubenswrapper[4878]: I1202 18:36:46.372684 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5cf94af-3f03-4d2f-9f7b-bb91d0322c54-config-data\") pod \"mysqld-exporter-0\" (UID: \"a5cf94af-3f03-4d2f-9f7b-bb91d0322c54\") " pod="openstack/mysqld-exporter-0" Dec 02 18:36:46 crc kubenswrapper[4878]: I1202 18:36:46.372879 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5cf94af-3f03-4d2f-9f7b-bb91d0322c54-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"a5cf94af-3f03-4d2f-9f7b-bb91d0322c54\") " pod="openstack/mysqld-exporter-0" Dec 02 18:36:46 crc kubenswrapper[4878]: I1202 18:36:46.372975 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk6r5\" (UniqueName: \"kubernetes.io/projected/a5cf94af-3f03-4d2f-9f7b-bb91d0322c54-kube-api-access-wk6r5\") pod \"mysqld-exporter-0\" (UID: \"a5cf94af-3f03-4d2f-9f7b-bb91d0322c54\") " pod="openstack/mysqld-exporter-0" Dec 02 18:36:46 crc kubenswrapper[4878]: I1202 18:36:46.380780 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5cf94af-3f03-4d2f-9f7b-bb91d0322c54-config-data\") pod \"mysqld-exporter-0\" (UID: \"a5cf94af-3f03-4d2f-9f7b-bb91d0322c54\") " pod="openstack/mysqld-exporter-0" Dec 02 18:36:46 crc kubenswrapper[4878]: I1202 18:36:46.394214 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5cf94af-3f03-4d2f-9f7b-bb91d0322c54-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"a5cf94af-3f03-4d2f-9f7b-bb91d0322c54\") " pod="openstack/mysqld-exporter-0" Dec 02 18:36:46 crc kubenswrapper[4878]: I1202 18:36:46.399817 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk6r5\" (UniqueName: \"kubernetes.io/projected/a5cf94af-3f03-4d2f-9f7b-bb91d0322c54-kube-api-access-wk6r5\") pod \"mysqld-exporter-0\" (UID: \"a5cf94af-3f03-4d2f-9f7b-bb91d0322c54\") " pod="openstack/mysqld-exporter-0"
Dec 02 18:36:46 crc kubenswrapper[4878]: I1202 18:36:46.546609 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Dec 02 18:36:49 crc kubenswrapper[4878]: I1202 18:36:49.332600 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-qgmlw" podUID="1f049ebe-547b-40a2-8468-932cfc5051ea" containerName="ovn-controller" probeResult="failure" output=<
Dec 02 18:36:49 crc kubenswrapper[4878]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Dec 02 18:36:49 crc kubenswrapper[4878]: >
Dec 02 18:36:52 crc kubenswrapper[4878]: E1202 18:36:52.555980 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified"
Dec 02 18:36:52 crc kubenswrapper[4878]: E1202 18:36:52.556834 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8twdm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-wkmpm_openstack(9c0ade7f-399e-4b2f-a250-5f6a47e90baf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 02 18:36:52 crc kubenswrapper[4878]: E1202 18:36:52.558294 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-wkmpm" podUID="9c0ade7f-399e-4b2f-a250-5f6a47e90baf"
Dec 02 18:36:52 crc kubenswrapper[4878]: E1202 18:36:52.880553 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-wkmpm" podUID="9c0ade7f-399e-4b2f-a250-5f6a47e90baf"
Dec 02 18:36:54 crc kubenswrapper[4878]: I1202 18:36:54.388772 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a601-account-create-update-2kkrr"]
Dec 02 18:36:54 crc kubenswrapper[4878]: I1202 18:36:54.431889 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-qgmlw" podUID="1f049ebe-547b-40a2-8468-932cfc5051ea" containerName="ovn-controller" probeResult="failure" output=<
Dec 02 18:36:54 crc kubenswrapper[4878]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Dec 02 18:36:54 crc kubenswrapper[4878]: >
Dec 02 18:36:54 crc kubenswrapper[4878]: W1202 18:36:54.437060 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90576306_b86d_4017_84fb_19c00f9630a0.slice/crio-7de09b95ec1ab984bad4fff9a5c62e2e7fc1795e8620da9487fa6e51ccbf0abe WatchSource:0}: Error finding container 7de09b95ec1ab984bad4fff9a5c62e2e7fc1795e8620da9487fa6e51ccbf0abe: Status 404 returned error can't find the container with id 7de09b95ec1ab984bad4fff9a5c62e2e7fc1795e8620da9487fa6e51ccbf0abe
Dec 02 18:36:54 crc kubenswrapper[4878]: I1202 18:36:54.458088 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Dec 02 18:36:54 crc kubenswrapper[4878]: I1202 18:36:54.907710 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80ade876-344b-415c-9609-6477205860c9","Type":"ContainerStarted","Data":"3469a41f69ed78a577ddde5579b2f4152902bdc85697ddfe3e36b98184a6f1b2"}
Dec 02 18:36:54 crc kubenswrapper[4878]: I1202 18:36:54.968602 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rfcql" event={"ID":"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc","Type":"ContainerStarted","Data":"993764e9fb636f39e634f34fbbf96c46ca38657d075cf76a7f8c016de0d2e947"}
Dec 02 18:36:54 crc kubenswrapper[4878]: I1202 18:36:54.972010 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a601-account-create-update-2kkrr" event={"ID":"90576306-b86d-4017-84fb-19c00f9630a0","Type":"ContainerStarted","Data":"d1b26df3c986f36a878e7d14ccf9008789835efafa93c0da17db2336fbe9f0e9"}
Dec 02 18:36:54 crc kubenswrapper[4878]: I1202 18:36:54.972081 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a601-account-create-update-2kkrr" event={"ID":"90576306-b86d-4017-84fb-19c00f9630a0","Type":"ContainerStarted","Data":"7de09b95ec1ab984bad4fff9a5c62e2e7fc1795e8620da9487fa6e51ccbf0abe"}
Dec 02 18:36:55 crc kubenswrapper[4878]: I1202 18:36:55.042192 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"]
Dec 02 18:36:55 crc kubenswrapper[4878]: I1202 18:36:55.060147 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Dec 02 18:36:55 crc kubenswrapper[4878]: I1202 18:36:55.067227 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Dec 02 18:36:55 crc kubenswrapper[4878]: I1202 18:36:55.070305 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-srxml"]
Dec 02 18:36:55 crc kubenswrapper[4878]: I1202 18:36:55.079077 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-ppjq4"]
Dec 02 18:36:55 crc kubenswrapper[4878]: I1202 18:36:55.086646 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=26.218033061 podStartE2EDuration="1m30.086619551s" podCreationTimestamp="2025-12-02 18:35:25 +0000 UTC" firstStartedPulling="2025-12-02 18:35:49.809354084 +0000 UTC m=+1259.498972965" lastFinishedPulling="2025-12-02 18:36:53.677940564 +0000 UTC m=+1323.367559455" observedRunningTime="2025-12-02 18:36:54.978538327 +0000 UTC m=+1324.668157208" watchObservedRunningTime="2025-12-02 18:36:55.086619551 +0000 UTC m=+1324.776238432"
Dec 02 18:36:55 crc kubenswrapper[4878]: I1202 18:36:55.111305 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-7e48-account-create-update-d5z4q"]
Dec 02 18:36:55 crc kubenswrapper[4878]: W1202 18:36:55.121372 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9672009_e2c7_4540_91ea_737ef2418ac1.slice/crio-d0858b5e73ee958dda067775b4d1844abadc11fd3f7dad8630f76cb6b1d3efc2 WatchSource:0}: Error finding container d0858b5e73ee958dda067775b4d1844abadc11fd3f7dad8630f76cb6b1d3efc2: Status 404 returned error can't find the container with id d0858b5e73ee958dda067775b4d1844abadc11fd3f7dad8630f76cb6b1d3efc2
Dec 02 18:36:55 crc kubenswrapper[4878]: I1202 18:36:55.129138 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-6dj4k"]
Dec 02 18:36:55 crc kubenswrapper[4878]: I1202 18:36:55.143338 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-b7e2-account-create-update-5hfz5"]
Dec 02 18:36:55 crc kubenswrapper[4878]: I1202 18:36:55.146705 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-92nsw"]
Dec 02 18:36:55 crc kubenswrapper[4878]: I1202 18:36:55.149551 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-rfcql" podStartSLOduration=9.092804957 podStartE2EDuration="25.149527972s" podCreationTimestamp="2025-12-02 18:36:30 +0000 UTC" firstStartedPulling="2025-12-02 18:36:37.510076261 +0000 UTC m=+1307.199695142" lastFinishedPulling="2025-12-02 18:36:53.566799256 +0000 UTC m=+1323.256418157" observedRunningTime="2025-12-02 18:36:55.060318514 +0000 UTC m=+1324.749937395" watchObservedRunningTime="2025-12-02 18:36:55.149527972 +0000 UTC m=+1324.839146853"
Dec 02 18:36:55 crc kubenswrapper[4878]: I1202 18:36:55.416324 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b33b-account-create-update-5f77r"]
Dec 02 18:36:55 crc kubenswrapper[4878]: I1202 18:36:55.442402 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qgmlw-config-p57xc"]
Dec 02 18:36:55 crc kubenswrapper[4878]: I1202 18:36:55.455737 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hldwm"]
Dec 02 18:36:55 crc kubenswrapper[4878]: W1202 18:36:55.525361 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdefdd6a_c268_4bf5_bf46_55d50fe2f83d.slice/crio-eb12df9744a18c4e58c9cbd726a2fd41f22e439b2e89ee5cc28ab6fc9d1bfc90 WatchSource:0}: Error finding container eb12df9744a18c4e58c9cbd726a2fd41f22e439b2e89ee5cc28ab6fc9d1bfc90: Status 404 returned error can't find the container with id eb12df9744a18c4e58c9cbd726a2fd41f22e439b2e89ee5cc28ab6fc9d1bfc90
Dec 02 18:36:55 crc kubenswrapper[4878]: I1202 18:36:55.586110 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.017392 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ppjq4" event={"ID":"36c2cda1-016a-4eca-9a7f-d4c2e43ccff1","Type":"ContainerStarted","Data":"8bb5e914d72d79ab66da3660027d8f7b63ca795c94b88fe0dfc94f55126e1ffe"}
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.018362 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ppjq4" event={"ID":"36c2cda1-016a-4eca-9a7f-d4c2e43ccff1","Type":"ContainerStarted","Data":"598437c19790ecc038ec9a52bfb743a8a94b98b533ffb98f8dcdd7971addc455"}
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.030092 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-b7e2-account-create-update-5hfz5" event={"ID":"cefde4cb-7350-4664-8f73-fabfecc591eb","Type":"ContainerStarted","Data":"f024efb68a03b29cc1f42226e55c4ef21f0ca1a387ea330c153c02ca1080382b"}
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.030163 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-b7e2-account-create-update-5hfz5" event={"ID":"cefde4cb-7350-4664-8f73-fabfecc591eb","Type":"ContainerStarted","Data":"23478727428fe22333b2c3b42ca18d9e73c24e4800c32496080ec7017bb17e78"}
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.069347 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b33b-account-create-update-5f77r" event={"ID":"cae0102d-4eba-4ca3-915b-0156d65616fa","Type":"ContainerStarted","Data":"1d313e4650de121330d0696ccdcebdc3d4b4e094adbc7d4935fa4ea7e3a03fbb"}
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.083181 4878 generic.go:334] "Generic (PLEG): container finished" podID="90576306-b86d-4017-84fb-19c00f9630a0" containerID="d1b26df3c986f36a878e7d14ccf9008789835efafa93c0da17db2336fbe9f0e9" exitCode=0
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.083283 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a601-account-create-update-2kkrr" event={"ID":"90576306-b86d-4017-84fb-19c00f9630a0","Type":"ContainerDied","Data":"d1b26df3c986f36a878e7d14ccf9008789835efafa93c0da17db2336fbe9f0e9"}
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.092614 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"a5cf94af-3f03-4d2f-9f7b-bb91d0322c54","Type":"ContainerStarted","Data":"c7df923115c99927b6e20585f5379acc66689bb69eed251c72d5631e52112ac5"}
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.096856 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-92nsw" event={"ID":"e9672009-e2c7-4540-91ea-737ef2418ac1","Type":"ContainerStarted","Data":"d0858b5e73ee958dda067775b4d1844abadc11fd3f7dad8630f76cb6b1d3efc2"}
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.097780 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-ppjq4" podStartSLOduration=14.097748876 podStartE2EDuration="14.097748876s" podCreationTimestamp="2025-12-02 18:36:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:36:56.049761387 +0000 UTC m=+1325.739380268" watchObservedRunningTime="2025-12-02 18:36:56.097748876 +0000 UTC m=+1325.787367757"
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.100343 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-srxml" event={"ID":"391eccae-595f-4859-b88b-933307305613","Type":"ContainerStarted","Data":"0c18d7b52deca67ac54e1322a747c8b8a715b13364819888fb0266810ff89f44"}
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.100669 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-srxml" event={"ID":"391eccae-595f-4859-b88b-933307305613","Type":"ContainerStarted","Data":"46bf3bc2f434fc6c582ba3ec2663e2c75e1d9c16b32215d014bbcb77a85fa904"}
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.103003 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hldwm" event={"ID":"628e53e1-08c1-49e9-8fec-b6e0547c9e6d","Type":"ContainerStarted","Data":"8f82defeb5ff0ab55028296e6bc8e9a6f12833327f479ab1e995602fed5defcf"}
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.106617 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7e48-account-create-update-d5z4q" event={"ID":"9669f740-c9d3-4c55-866d-a44471c3aa1c","Type":"ContainerStarted","Data":"cfb044178cd269dfa607bc6435f583099465f52b83abe503abfd8ff6081b84b4"}
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.106661 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7e48-account-create-update-d5z4q" event={"ID":"9669f740-c9d3-4c55-866d-a44471c3aa1c","Type":"ContainerStarted","Data":"99866242555bc9d354c60b498b896bde03332d4e1000008eb88796948be9c8af"}
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.111513 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-b7e2-account-create-update-5hfz5" podStartSLOduration=13.111476752 podStartE2EDuration="13.111476752s" podCreationTimestamp="2025-12-02 18:36:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:36:56.075110724 +0000 UTC m=+1325.764729605" watchObservedRunningTime="2025-12-02 18:36:56.111476752 +0000 UTC m=+1325.801095633"
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.117527 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-6dj4k" event={"ID":"56f602d3-e640-4248-b53e-201b3556aa6f","Type":"ContainerStarted","Data":"07501c808a51dbe70f72b4f8eb87ee2e0183629d7ca671ce9609a51bcbe616f3"}
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.129795 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qgmlw-config-p57xc" event={"ID":"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d","Type":"ContainerStarted","Data":"eb12df9744a18c4e58c9cbd726a2fd41f22e439b2e89ee5cc28ab6fc9d1bfc90"}
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.194908 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-srxml" podStartSLOduration=14.194882789 podStartE2EDuration="14.194882789s" podCreationTimestamp="2025-12-02 18:36:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:36:56.130736999 +0000 UTC m=+1325.820355880" watchObservedRunningTime="2025-12-02 18:36:56.194882789 +0000 UTC m=+1325.884501660"
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.198727 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-7e48-account-create-update-d5z4q" podStartSLOduration=14.198718337 podStartE2EDuration="14.198718337s" podCreationTimestamp="2025-12-02 18:36:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:36:56.1613893 +0000 UTC m=+1325.851008201" watchObservedRunningTime="2025-12-02 18:36:56.198718337 +0000 UTC m=+1325.888337208"
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.737684 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.738003 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Dec 02 18:36:56 crc kubenswrapper[4878]: I1202 18:36:56.744150 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.147399 4878 generic.go:334] "Generic (PLEG): container finished" podID="56f602d3-e640-4248-b53e-201b3556aa6f" containerID="c744fc7a4745cb90252fa158ae2d36fed75660311b4454064242dd13616b386c" exitCode=0
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.147723 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-6dj4k" event={"ID":"56f602d3-e640-4248-b53e-201b3556aa6f","Type":"ContainerDied","Data":"c744fc7a4745cb90252fa158ae2d36fed75660311b4454064242dd13616b386c"}
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.159603 4878 generic.go:334] "Generic (PLEG): container finished" podID="36c2cda1-016a-4eca-9a7f-d4c2e43ccff1" containerID="8bb5e914d72d79ab66da3660027d8f7b63ca795c94b88fe0dfc94f55126e1ffe" exitCode=0
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.159824 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ppjq4" event={"ID":"36c2cda1-016a-4eca-9a7f-d4c2e43ccff1","Type":"ContainerDied","Data":"8bb5e914d72d79ab66da3660027d8f7b63ca795c94b88fe0dfc94f55126e1ffe"}
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.174158 4878 generic.go:334] "Generic (PLEG): container finished" podID="cefde4cb-7350-4664-8f73-fabfecc591eb" containerID="f024efb68a03b29cc1f42226e55c4ef21f0ca1a387ea330c153c02ca1080382b" exitCode=0
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.174257 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-b7e2-account-create-update-5hfz5" event={"ID":"cefde4cb-7350-4664-8f73-fabfecc591eb","Type":"ContainerDied","Data":"f024efb68a03b29cc1f42226e55c4ef21f0ca1a387ea330c153c02ca1080382b"}
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.181046 4878 generic.go:334] "Generic (PLEG): container finished" podID="cae0102d-4eba-4ca3-915b-0156d65616fa" containerID="2904b18252f988f050784a0e760f3d41816725358db0b549960fecf3d8318ff0" exitCode=0
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.181172 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b33b-account-create-update-5f77r" event={"ID":"cae0102d-4eba-4ca3-915b-0156d65616fa","Type":"ContainerDied","Data":"2904b18252f988f050784a0e760f3d41816725358db0b549960fecf3d8318ff0"}
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.190444 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a601-account-create-update-2kkrr" event={"ID":"90576306-b86d-4017-84fb-19c00f9630a0","Type":"ContainerDied","Data":"7de09b95ec1ab984bad4fff9a5c62e2e7fc1795e8620da9487fa6e51ccbf0abe"}
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.190661 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7de09b95ec1ab984bad4fff9a5c62e2e7fc1795e8620da9487fa6e51ccbf0abe"
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.193709 4878 generic.go:334] "Generic (PLEG): container finished" podID="fdefdd6a-c268-4bf5-bf46-55d50fe2f83d" containerID="3ee110d7786b3994c541df382097cd29ed6bf3fbdb3b04c89a2e2c5b959c0266" exitCode=0
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.193938 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qgmlw-config-p57xc" event={"ID":"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d","Type":"ContainerDied","Data":"3ee110d7786b3994c541df382097cd29ed6bf3fbdb3b04c89a2e2c5b959c0266"}
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.196089 4878 generic.go:334] "Generic (PLEG): container finished" podID="391eccae-595f-4859-b88b-933307305613" containerID="0c18d7b52deca67ac54e1322a747c8b8a715b13364819888fb0266810ff89f44" exitCode=0
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.196164 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-srxml" event={"ID":"391eccae-595f-4859-b88b-933307305613","Type":"ContainerDied","Data":"0c18d7b52deca67ac54e1322a747c8b8a715b13364819888fb0266810ff89f44"}
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.201585 4878 generic.go:334] "Generic (PLEG): container finished" podID="628e53e1-08c1-49e9-8fec-b6e0547c9e6d" containerID="42691b44e01ae15ba0eccbd2f9676f2adc242b5c2b983dddaecb6272ce1fb285" exitCode=0
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.201730 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hldwm" event={"ID":"628e53e1-08c1-49e9-8fec-b6e0547c9e6d","Type":"ContainerDied","Data":"42691b44e01ae15ba0eccbd2f9676f2adc242b5c2b983dddaecb6272ce1fb285"}
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.214533 4878 generic.go:334] "Generic (PLEG): container finished" podID="9669f740-c9d3-4c55-866d-a44471c3aa1c" containerID="cfb044178cd269dfa607bc6435f583099465f52b83abe503abfd8ff6081b84b4" exitCode=0
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.215069 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7e48-account-create-update-d5z4q" event={"ID":"9669f740-c9d3-4c55-866d-a44471c3aa1c","Type":"ContainerDied","Data":"cfb044178cd269dfa607bc6435f583099465f52b83abe503abfd8ff6081b84b4"}
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.216338 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.281409 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a601-account-create-update-2kkrr"
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.380424 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90576306-b86d-4017-84fb-19c00f9630a0-operator-scripts\") pod \"90576306-b86d-4017-84fb-19c00f9630a0\" (UID: \"90576306-b86d-4017-84fb-19c00f9630a0\") "
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.380766 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55n5h\" (UniqueName: \"kubernetes.io/projected/90576306-b86d-4017-84fb-19c00f9630a0-kube-api-access-55n5h\") pod \"90576306-b86d-4017-84fb-19c00f9630a0\" (UID: \"90576306-b86d-4017-84fb-19c00f9630a0\") "
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.381927 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90576306-b86d-4017-84fb-19c00f9630a0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "90576306-b86d-4017-84fb-19c00f9630a0" (UID: "90576306-b86d-4017-84fb-19c00f9630a0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.400589 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90576306-b86d-4017-84fb-19c00f9630a0-kube-api-access-55n5h" (OuterVolumeSpecName: "kube-api-access-55n5h") pod "90576306-b86d-4017-84fb-19c00f9630a0" (UID: "90576306-b86d-4017-84fb-19c00f9630a0"). InnerVolumeSpecName "kube-api-access-55n5h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.484347 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55n5h\" (UniqueName: \"kubernetes.io/projected/90576306-b86d-4017-84fb-19c00f9630a0-kube-api-access-55n5h\") on node \"crc\" DevicePath \"\""
Dec 02 18:36:57 crc kubenswrapper[4878]: I1202 18:36:57.484386 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90576306-b86d-4017-84fb-19c00f9630a0-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 18:36:58 crc kubenswrapper[4878]: I1202 18:36:58.235354 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"a5cf94af-3f03-4d2f-9f7b-bb91d0322c54","Type":"ContainerStarted","Data":"97376565e713d1ae53c4c2732e707a792e928bf838763d84edd735934089b938"}
Dec 02 18:36:58 crc kubenswrapper[4878]: I1202 18:36:58.235592 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a601-account-create-update-2kkrr"
Dec 02 18:36:58 crc kubenswrapper[4878]: I1202 18:36:58.272886 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=10.222997721 podStartE2EDuration="12.272867118s" podCreationTimestamp="2025-12-02 18:36:46 +0000 UTC" firstStartedPulling="2025-12-02 18:36:55.016283379 +0000 UTC m=+1324.705902260" lastFinishedPulling="2025-12-02 18:36:57.066152776 +0000 UTC m=+1326.755771657" observedRunningTime="2025-12-02 18:36:58.267067538 +0000 UTC m=+1327.956686419" watchObservedRunningTime="2025-12-02 18:36:58.272867118 +0000 UTC m=+1327.962485999"
Dec 02 18:36:59 crc kubenswrapper[4878]: I1202 18:36:59.093748 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0"
Dec 02 18:36:59 crc kubenswrapper[4878]: E1202 18:36:59.093986 4878 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Dec 02 18:36:59 crc kubenswrapper[4878]: E1202 18:36:59.094310 4878 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Dec 02 18:36:59 crc kubenswrapper[4878]: E1202 18:36:59.094387 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift podName:5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f nodeName:}" failed. No retries permitted until 2025-12-02 18:37:31.09436683 +0000 UTC m=+1360.783985711 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift") pod "swift-storage-0" (UID: "5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f") : configmap "swift-ring-files" not found
Dec 02 18:36:59 crc kubenswrapper[4878]: I1202 18:36:59.463103 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-qgmlw"
Dec 02 18:37:00 crc kubenswrapper[4878]: I1202 18:37:00.509247 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 02 18:37:00 crc kubenswrapper[4878]: I1202 18:37:00.509559 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="80ade876-344b-415c-9609-6477205860c9" containerName="prometheus" containerID="cri-o://ddf160d59512ed95e7f7a3a1ff431f3e8a032dd8ea01919e40dc655a958d69be" gracePeriod=600
Dec 02 18:37:00 crc kubenswrapper[4878]: I1202 18:37:00.509622 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="80ade876-344b-415c-9609-6477205860c9" containerName="thanos-sidecar" containerID="cri-o://3469a41f69ed78a577ddde5579b2f4152902bdc85697ddfe3e36b98184a6f1b2" gracePeriod=600
Dec 02 18:37:00 crc kubenswrapper[4878]: I1202 18:37:00.509712 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="80ade876-344b-415c-9609-6477205860c9" containerName="config-reloader" containerID="cri-o://cccfecfab7b741247d179bf0e549c38f82cfb21ac8a57ef168146ff87c1823b5" gracePeriod=600
Dec 02 18:37:01 crc kubenswrapper[4878]: I1202 18:37:01.277019 4878 generic.go:334] "Generic (PLEG): container finished" podID="80ade876-344b-415c-9609-6477205860c9" containerID="3469a41f69ed78a577ddde5579b2f4152902bdc85697ddfe3e36b98184a6f1b2" exitCode=0
Dec 02 18:37:01 crc kubenswrapper[4878]: I1202 18:37:01.277426 4878 generic.go:334] "Generic (PLEG): container finished" podID="80ade876-344b-415c-9609-6477205860c9" containerID="cccfecfab7b741247d179bf0e549c38f82cfb21ac8a57ef168146ff87c1823b5" exitCode=0
Dec 02 18:37:01 crc kubenswrapper[4878]: I1202 18:37:01.277447 4878 generic.go:334] "Generic (PLEG): container finished" podID="80ade876-344b-415c-9609-6477205860c9" containerID="ddf160d59512ed95e7f7a3a1ff431f3e8a032dd8ea01919e40dc655a958d69be" exitCode=0
Dec 02 18:37:01 crc kubenswrapper[4878]: I1202 18:37:01.277091 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80ade876-344b-415c-9609-6477205860c9","Type":"ContainerDied","Data":"3469a41f69ed78a577ddde5579b2f4152902bdc85697ddfe3e36b98184a6f1b2"}
Dec 02 18:37:01 crc kubenswrapper[4878]: I1202 18:37:01.277538 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80ade876-344b-415c-9609-6477205860c9","Type":"ContainerDied","Data":"cccfecfab7b741247d179bf0e549c38f82cfb21ac8a57ef168146ff87c1823b5"}
Dec 02 18:37:01 crc kubenswrapper[4878]: I1202 18:37:01.277552 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80ade876-344b-415c-9609-6477205860c9","Type":"ContainerDied","Data":"ddf160d59512ed95e7f7a3a1ff431f3e8a032dd8ea01919e40dc655a958d69be"}
Dec 02 18:37:01 crc kubenswrapper[4878]: I1202 18:37:01.739553 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="80ade876-344b-415c-9609-6477205860c9" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.137:9090/-/ready\": dial tcp 10.217.0.137:9090: connect: connection refused"
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.347956 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b33b-account-create-update-5f77r" event={"ID":"cae0102d-4eba-4ca3-915b-0156d65616fa","Type":"ContainerDied","Data":"1d313e4650de121330d0696ccdcebdc3d4b4e094adbc7d4935fa4ea7e3a03fbb"}
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.348754 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d313e4650de121330d0696ccdcebdc3d4b4e094adbc7d4935fa4ea7e3a03fbb"
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.368225 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-6dj4k" event={"ID":"56f602d3-e640-4248-b53e-201b3556aa6f","Type":"ContainerDied","Data":"07501c808a51dbe70f72b4f8eb87ee2e0183629d7ca671ce9609a51bcbe616f3"}
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.368291 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07501c808a51dbe70f72b4f8eb87ee2e0183629d7ca671ce9609a51bcbe616f3"
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.377912 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qgmlw-config-p57xc" event={"ID":"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d","Type":"ContainerDied","Data":"eb12df9744a18c4e58c9cbd726a2fd41f22e439b2e89ee5cc28ab6fc9d1bfc90"}
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.377981 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb12df9744a18c4e58c9cbd726a2fd41f22e439b2e89ee5cc28ab6fc9d1bfc90"
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.395856 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ppjq4" event={"ID":"36c2cda1-016a-4eca-9a7f-d4c2e43ccff1","Type":"ContainerDied","Data":"598437c19790ecc038ec9a52bfb743a8a94b98b533ffb98f8dcdd7971addc455"}
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.395916 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="598437c19790ecc038ec9a52bfb743a8a94b98b533ffb98f8dcdd7971addc455"
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.407825 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-srxml" event={"ID":"391eccae-595f-4859-b88b-933307305613","Type":"ContainerDied","Data":"46bf3bc2f434fc6c582ba3ec2663e2c75e1d9c16b32215d014bbcb77a85fa904"}
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.407877 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46bf3bc2f434fc6c582ba3ec2663e2c75e1d9c16b32215d014bbcb77a85fa904"
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.426862 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hldwm" event={"ID":"628e53e1-08c1-49e9-8fec-b6e0547c9e6d","Type":"ContainerDied","Data":"8f82defeb5ff0ab55028296e6bc8e9a6f12833327f479ab1e995602fed5defcf"}
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.426944 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f82defeb5ff0ab55028296e6bc8e9a6f12833327f479ab1e995602fed5defcf"
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.436414 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7e48-account-create-update-d5z4q" event={"ID":"9669f740-c9d3-4c55-866d-a44471c3aa1c","Type":"ContainerDied","Data":"99866242555bc9d354c60b498b896bde03332d4e1000008eb88796948be9c8af"}
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.436470 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99866242555bc9d354c60b498b896bde03332d4e1000008eb88796948be9c8af"
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.436625 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hldwm"
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.445147 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-b7e2-account-create-update-5hfz5" event={"ID":"cefde4cb-7350-4664-8f73-fabfecc591eb","Type":"ContainerDied","Data":"23478727428fe22333b2c3b42ca18d9e73c24e4800c32496080ec7017bb17e78"}
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.445480 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23478727428fe22333b2c3b42ca18d9e73c24e4800c32496080ec7017bb17e78"
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.475313 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b33b-account-create-update-5f77r"
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.496903 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng95q\" (UniqueName: \"kubernetes.io/projected/cae0102d-4eba-4ca3-915b-0156d65616fa-kube-api-access-ng95q\") pod \"cae0102d-4eba-4ca3-915b-0156d65616fa\" (UID: \"cae0102d-4eba-4ca3-915b-0156d65616fa\") "
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.497659 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae0102d-4eba-4ca3-915b-0156d65616fa-operator-scripts\") pod \"cae0102d-4eba-4ca3-915b-0156d65616fa\" (UID: \"cae0102d-4eba-4ca3-915b-0156d65616fa\") "
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.497800 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/628e53e1-08c1-49e9-8fec-b6e0547c9e6d-operator-scripts\") pod \"628e53e1-08c1-49e9-8fec-b6e0547c9e6d\" (UID: \"628e53e1-08c1-49e9-8fec-b6e0547c9e6d\") "
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.497931 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swkk5\" (UniqueName: \"kubernetes.io/projected/628e53e1-08c1-49e9-8fec-b6e0547c9e6d-kube-api-access-swkk5\") pod \"628e53e1-08c1-49e9-8fec-b6e0547c9e6d\" (UID: \"628e53e1-08c1-49e9-8fec-b6e0547c9e6d\") "
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.498624 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/628e53e1-08c1-49e9-8fec-b6e0547c9e6d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "628e53e1-08c1-49e9-8fec-b6e0547c9e6d" (UID: "628e53e1-08c1-49e9-8fec-b6e0547c9e6d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.498660 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cae0102d-4eba-4ca3-915b-0156d65616fa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cae0102d-4eba-4ca3-915b-0156d65616fa" (UID: "cae0102d-4eba-4ca3-915b-0156d65616fa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.517951 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae0102d-4eba-4ca3-915b-0156d65616fa-kube-api-access-ng95q" (OuterVolumeSpecName: "kube-api-access-ng95q") pod "cae0102d-4eba-4ca3-915b-0156d65616fa" (UID: "cae0102d-4eba-4ca3-915b-0156d65616fa"). InnerVolumeSpecName "kube-api-access-ng95q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.523531 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/628e53e1-08c1-49e9-8fec-b6e0547c9e6d-kube-api-access-swkk5" (OuterVolumeSpecName: "kube-api-access-swkk5") pod "628e53e1-08c1-49e9-8fec-b6e0547c9e6d" (UID: "628e53e1-08c1-49e9-8fec-b6e0547c9e6d").
InnerVolumeSpecName "kube-api-access-swkk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.581501 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-srxml" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.603809 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/391eccae-595f-4859-b88b-933307305613-operator-scripts\") pod \"391eccae-595f-4859-b88b-933307305613\" (UID: \"391eccae-595f-4859-b88b-933307305613\") " Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.604016 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4zl9\" (UniqueName: \"kubernetes.io/projected/391eccae-595f-4859-b88b-933307305613-kube-api-access-b4zl9\") pod \"391eccae-595f-4859-b88b-933307305613\" (UID: \"391eccae-595f-4859-b88b-933307305613\") " Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.604771 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng95q\" (UniqueName: \"kubernetes.io/projected/cae0102d-4eba-4ca3-915b-0156d65616fa-kube-api-access-ng95q\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.604787 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae0102d-4eba-4ca3-915b-0156d65616fa-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.604798 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/628e53e1-08c1-49e9-8fec-b6e0547c9e6d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.604806 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swkk5\" (UniqueName: 
\"kubernetes.io/projected/628e53e1-08c1-49e9-8fec-b6e0547c9e6d-kube-api-access-swkk5\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.605891 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/391eccae-595f-4859-b88b-933307305613-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "391eccae-595f-4859-b88b-933307305613" (UID: "391eccae-595f-4859-b88b-933307305613"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.615447 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/391eccae-595f-4859-b88b-933307305613-kube-api-access-b4zl9" (OuterVolumeSpecName: "kube-api-access-b4zl9") pod "391eccae-595f-4859-b88b-933307305613" (UID: "391eccae-595f-4859-b88b-933307305613"). InnerVolumeSpecName "kube-api-access-b4zl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.655185 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ppjq4" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.688854 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.711057 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-7e48-account-create-update-d5z4q" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.749046 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36c2cda1-016a-4eca-9a7f-d4c2e43ccff1-operator-scripts\") pod \"36c2cda1-016a-4eca-9a7f-d4c2e43ccff1\" (UID: \"36c2cda1-016a-4eca-9a7f-d4c2e43ccff1\") " Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.749315 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-var-log-ovn\") pod \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.749386 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhdk4\" (UniqueName: \"kubernetes.io/projected/36c2cda1-016a-4eca-9a7f-d4c2e43ccff1-kube-api-access-rhdk4\") pod \"36c2cda1-016a-4eca-9a7f-d4c2e43ccff1\" (UID: \"36c2cda1-016a-4eca-9a7f-d4c2e43ccff1\") " Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.749495 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-scripts\") pod \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.749561 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-additional-scripts\") pod \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.749615 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-var-run\") pod \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.749654 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nr6p\" (UniqueName: \"kubernetes.io/projected/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-kube-api-access-7nr6p\") pod \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.749699 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-var-run-ovn\") pod \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\" (UID: \"fdefdd6a-c268-4bf5-bf46-55d50fe2f83d\") " Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.750672 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/391eccae-595f-4859-b88b-933307305613-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.750700 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4zl9\" (UniqueName: \"kubernetes.io/projected/391eccae-595f-4859-b88b-933307305613-kube-api-access-b4zl9\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.758396 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "fdefdd6a-c268-4bf5-bf46-55d50fe2f83d" (UID: "fdefdd6a-c268-4bf5-bf46-55d50fe2f83d"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.763825 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-var-run" (OuterVolumeSpecName: "var-run") pod "fdefdd6a-c268-4bf5-bf46-55d50fe2f83d" (UID: "fdefdd6a-c268-4bf5-bf46-55d50fe2f83d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.765215 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-scripts" (OuterVolumeSpecName: "scripts") pod "fdefdd6a-c268-4bf5-bf46-55d50fe2f83d" (UID: "fdefdd6a-c268-4bf5-bf46-55d50fe2f83d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.766700 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-b7e2-account-create-update-5hfz5" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.766981 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "fdefdd6a-c268-4bf5-bf46-55d50fe2f83d" (UID: "fdefdd6a-c268-4bf5-bf46-55d50fe2f83d"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.773566 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "fdefdd6a-c268-4bf5-bf46-55d50fe2f83d" (UID: "fdefdd6a-c268-4bf5-bf46-55d50fe2f83d"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.775433 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36c2cda1-016a-4eca-9a7f-d4c2e43ccff1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "36c2cda1-016a-4eca-9a7f-d4c2e43ccff1" (UID: "36c2cda1-016a-4eca-9a7f-d4c2e43ccff1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.775954 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-kube-api-access-7nr6p" (OuterVolumeSpecName: "kube-api-access-7nr6p") pod "fdefdd6a-c268-4bf5-bf46-55d50fe2f83d" (UID: "fdefdd6a-c268-4bf5-bf46-55d50fe2f83d"). InnerVolumeSpecName "kube-api-access-7nr6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.778629 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c2cda1-016a-4eca-9a7f-d4c2e43ccff1-kube-api-access-rhdk4" (OuterVolumeSpecName: "kube-api-access-rhdk4") pod "36c2cda1-016a-4eca-9a7f-d4c2e43ccff1" (UID: "36c2cda1-016a-4eca-9a7f-d4c2e43ccff1"). InnerVolumeSpecName "kube-api-access-rhdk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.817169 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-6dj4k" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.852629 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcmwr\" (UniqueName: \"kubernetes.io/projected/9669f740-c9d3-4c55-866d-a44471c3aa1c-kube-api-access-rcmwr\") pod \"9669f740-c9d3-4c55-866d-a44471c3aa1c\" (UID: \"9669f740-c9d3-4c55-866d-a44471c3aa1c\") " Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.852734 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9669f740-c9d3-4c55-866d-a44471c3aa1c-operator-scripts\") pod \"9669f740-c9d3-4c55-866d-a44471c3aa1c\" (UID: \"9669f740-c9d3-4c55-866d-a44471c3aa1c\") " Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.852933 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdhdj\" (UniqueName: \"kubernetes.io/projected/cefde4cb-7350-4664-8f73-fabfecc591eb-kube-api-access-sdhdj\") pod \"cefde4cb-7350-4664-8f73-fabfecc591eb\" (UID: \"cefde4cb-7350-4664-8f73-fabfecc591eb\") " Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.852984 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cefde4cb-7350-4664-8f73-fabfecc591eb-operator-scripts\") pod \"cefde4cb-7350-4664-8f73-fabfecc591eb\" (UID: \"cefde4cb-7350-4664-8f73-fabfecc591eb\") " Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.854657 4878 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.854675 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhdk4\" (UniqueName: 
\"kubernetes.io/projected/36c2cda1-016a-4eca-9a7f-d4c2e43ccff1-kube-api-access-rhdk4\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.854688 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.854697 4878 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.854707 4878 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-var-run\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.854715 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nr6p\" (UniqueName: \"kubernetes.io/projected/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-kube-api-access-7nr6p\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.854723 4878 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.854732 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36c2cda1-016a-4eca-9a7f-d4c2e43ccff1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.860429 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cefde4cb-7350-4664-8f73-fabfecc591eb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"cefde4cb-7350-4664-8f73-fabfecc591eb" (UID: "cefde4cb-7350-4664-8f73-fabfecc591eb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.860948 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9669f740-c9d3-4c55-866d-a44471c3aa1c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9669f740-c9d3-4c55-866d-a44471c3aa1c" (UID: "9669f740-c9d3-4c55-866d-a44471c3aa1c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.900138 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9669f740-c9d3-4c55-866d-a44471c3aa1c-kube-api-access-rcmwr" (OuterVolumeSpecName: "kube-api-access-rcmwr") pod "9669f740-c9d3-4c55-866d-a44471c3aa1c" (UID: "9669f740-c9d3-4c55-866d-a44471c3aa1c"). InnerVolumeSpecName "kube-api-access-rcmwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.911625 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cefde4cb-7350-4664-8f73-fabfecc591eb-kube-api-access-sdhdj" (OuterVolumeSpecName: "kube-api-access-sdhdj") pod "cefde4cb-7350-4664-8f73-fabfecc591eb" (UID: "cefde4cb-7350-4664-8f73-fabfecc591eb"). InnerVolumeSpecName "kube-api-access-sdhdj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.955946 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr4gb\" (UniqueName: \"kubernetes.io/projected/56f602d3-e640-4248-b53e-201b3556aa6f-kube-api-access-vr4gb\") pod \"56f602d3-e640-4248-b53e-201b3556aa6f\" (UID: \"56f602d3-e640-4248-b53e-201b3556aa6f\") " Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.956274 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56f602d3-e640-4248-b53e-201b3556aa6f-operator-scripts\") pod \"56f602d3-e640-4248-b53e-201b3556aa6f\" (UID: \"56f602d3-e640-4248-b53e-201b3556aa6f\") " Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.956944 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56f602d3-e640-4248-b53e-201b3556aa6f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "56f602d3-e640-4248-b53e-201b3556aa6f" (UID: "56f602d3-e640-4248-b53e-201b3556aa6f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.957390 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcmwr\" (UniqueName: \"kubernetes.io/projected/9669f740-c9d3-4c55-866d-a44471c3aa1c-kube-api-access-rcmwr\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.957414 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9669f740-c9d3-4c55-866d-a44471c3aa1c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.957430 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56f602d3-e640-4248-b53e-201b3556aa6f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.957442 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdhdj\" (UniqueName: \"kubernetes.io/projected/cefde4cb-7350-4664-8f73-fabfecc591eb-kube-api-access-sdhdj\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.957452 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cefde4cb-7350-4664-8f73-fabfecc591eb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:03 crc kubenswrapper[4878]: I1202 18:37:03.959813 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56f602d3-e640-4248-b53e-201b3556aa6f-kube-api-access-vr4gb" (OuterVolumeSpecName: "kube-api-access-vr4gb") pod "56f602d3-e640-4248-b53e-201b3556aa6f" (UID: "56f602d3-e640-4248-b53e-201b3556aa6f"). InnerVolumeSpecName "kube-api-access-vr4gb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.036809 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.059856 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr4gb\" (UniqueName: \"kubernetes.io/projected/56f602d3-e640-4248-b53e-201b3556aa6f-kube-api-access-vr4gb\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.161121 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"80ade876-344b-415c-9609-6477205860c9\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.161347 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80ade876-344b-415c-9609-6477205860c9-prometheus-metric-storage-rulefiles-0\") pod \"80ade876-344b-415c-9609-6477205860c9\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.161448 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80ade876-344b-415c-9609-6477205860c9-thanos-prometheus-http-client-file\") pod \"80ade876-344b-415c-9609-6477205860c9\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.161516 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/80ade876-344b-415c-9609-6477205860c9-config\") pod \"80ade876-344b-415c-9609-6477205860c9\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " Dec 02 
18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.161598 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80ade876-344b-415c-9609-6477205860c9-config-out\") pod \"80ade876-344b-415c-9609-6477205860c9\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.161655 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz42x\" (UniqueName: \"kubernetes.io/projected/80ade876-344b-415c-9609-6477205860c9-kube-api-access-lz42x\") pod \"80ade876-344b-415c-9609-6477205860c9\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.161694 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80ade876-344b-415c-9609-6477205860c9-tls-assets\") pod \"80ade876-344b-415c-9609-6477205860c9\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.161735 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80ade876-344b-415c-9609-6477205860c9-web-config\") pod \"80ade876-344b-415c-9609-6477205860c9\" (UID: \"80ade876-344b-415c-9609-6477205860c9\") " Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.165265 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80ade876-344b-415c-9609-6477205860c9-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "80ade876-344b-415c-9609-6477205860c9" (UID: "80ade876-344b-415c-9609-6477205860c9"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.169031 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80ade876-344b-415c-9609-6477205860c9-config-out" (OuterVolumeSpecName: "config-out") pod "80ade876-344b-415c-9609-6477205860c9" (UID: "80ade876-344b-415c-9609-6477205860c9"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.169864 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80ade876-344b-415c-9609-6477205860c9-kube-api-access-lz42x" (OuterVolumeSpecName: "kube-api-access-lz42x") pod "80ade876-344b-415c-9609-6477205860c9" (UID: "80ade876-344b-415c-9609-6477205860c9"). InnerVolumeSpecName "kube-api-access-lz42x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.175539 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80ade876-344b-415c-9609-6477205860c9-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "80ade876-344b-415c-9609-6477205860c9" (UID: "80ade876-344b-415c-9609-6477205860c9"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.177312 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80ade876-344b-415c-9609-6477205860c9-config" (OuterVolumeSpecName: "config") pod "80ade876-344b-415c-9609-6477205860c9" (UID: "80ade876-344b-415c-9609-6477205860c9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.201624 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80ade876-344b-415c-9609-6477205860c9-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "80ade876-344b-415c-9609-6477205860c9" (UID: "80ade876-344b-415c-9609-6477205860c9"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.206710 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "80ade876-344b-415c-9609-6477205860c9" (UID: "80ade876-344b-415c-9609-6477205860c9"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.242531 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80ade876-344b-415c-9609-6477205860c9-web-config" (OuterVolumeSpecName: "web-config") pod "80ade876-344b-415c-9609-6477205860c9" (UID: "80ade876-344b-415c-9609-6477205860c9"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.264400 4878 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80ade876-344b-415c-9609-6477205860c9-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.264444 4878 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80ade876-344b-415c-9609-6477205860c9-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.264455 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/80ade876-344b-415c-9609-6477205860c9-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.264464 4878 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80ade876-344b-415c-9609-6477205860c9-config-out\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.264473 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz42x\" (UniqueName: \"kubernetes.io/projected/80ade876-344b-415c-9609-6477205860c9-kube-api-access-lz42x\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.264483 4878 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80ade876-344b-415c-9609-6477205860c9-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.264492 4878 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80ade876-344b-415c-9609-6477205860c9-web-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:04 crc 
kubenswrapper[4878]: I1202 18:37:04.264527 4878 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.329251 4878 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.369544 4878 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.459991 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hldwm" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.459981 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80ade876-344b-415c-9609-6477205860c9","Type":"ContainerDied","Data":"d58073eef0db504a0dd8d7cef1e0b92706f60a8d5fc6707879c6072624f1343d"} Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.460030 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.460063 4878 scope.go:117] "RemoveContainer" containerID="3469a41f69ed78a577ddde5579b2f4152902bdc85697ddfe3e36b98184a6f1b2" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.460093 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-6dj4k" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.460125 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b33b-account-create-update-5f77r" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.460122 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-b7e2-account-create-update-5hfz5" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.460352 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ppjq4" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.460368 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qgmlw-config-p57xc" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.460374 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-srxml" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.460555 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-7e48-account-create-update-d5z4q" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.587473 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.603703 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.631830 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 18:37:04 crc kubenswrapper[4878]: E1202 18:37:04.633573 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80ade876-344b-415c-9609-6477205860c9" containerName="init-config-reloader" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.633602 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="80ade876-344b-415c-9609-6477205860c9" containerName="init-config-reloader" Dec 02 18:37:04 crc kubenswrapper[4878]: E1202 
18:37:04.633623 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56f602d3-e640-4248-b53e-201b3556aa6f" containerName="mariadb-database-create" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.633634 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f602d3-e640-4248-b53e-201b3556aa6f" containerName="mariadb-database-create" Dec 02 18:37:04 crc kubenswrapper[4878]: E1202 18:37:04.633647 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80ade876-344b-415c-9609-6477205860c9" containerName="prometheus" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.633655 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="80ade876-344b-415c-9609-6477205860c9" containerName="prometheus" Dec 02 18:37:04 crc kubenswrapper[4878]: E1202 18:37:04.633668 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="391eccae-595f-4859-b88b-933307305613" containerName="mariadb-database-create" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.633675 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="391eccae-595f-4859-b88b-933307305613" containerName="mariadb-database-create" Dec 02 18:37:04 crc kubenswrapper[4878]: E1202 18:37:04.633699 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdefdd6a-c268-4bf5-bf46-55d50fe2f83d" containerName="ovn-config" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.633712 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdefdd6a-c268-4bf5-bf46-55d50fe2f83d" containerName="ovn-config" Dec 02 18:37:04 crc kubenswrapper[4878]: E1202 18:37:04.633733 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae0102d-4eba-4ca3-915b-0156d65616fa" containerName="mariadb-account-create-update" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.633740 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae0102d-4eba-4ca3-915b-0156d65616fa" containerName="mariadb-account-create-update" Dec 02 18:37:04 crc 
kubenswrapper[4878]: E1202 18:37:04.633755 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80ade876-344b-415c-9609-6477205860c9" containerName="config-reloader" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.633773 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="80ade876-344b-415c-9609-6477205860c9" containerName="config-reloader" Dec 02 18:37:04 crc kubenswrapper[4878]: E1202 18:37:04.633789 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628e53e1-08c1-49e9-8fec-b6e0547c9e6d" containerName="mariadb-database-create" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.633794 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="628e53e1-08c1-49e9-8fec-b6e0547c9e6d" containerName="mariadb-database-create" Dec 02 18:37:04 crc kubenswrapper[4878]: E1202 18:37:04.633807 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9669f740-c9d3-4c55-866d-a44471c3aa1c" containerName="mariadb-account-create-update" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.633814 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="9669f740-c9d3-4c55-866d-a44471c3aa1c" containerName="mariadb-account-create-update" Dec 02 18:37:04 crc kubenswrapper[4878]: E1202 18:37:04.633824 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90576306-b86d-4017-84fb-19c00f9630a0" containerName="mariadb-account-create-update" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.633830 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="90576306-b86d-4017-84fb-19c00f9630a0" containerName="mariadb-account-create-update" Dec 02 18:37:04 crc kubenswrapper[4878]: E1202 18:37:04.633844 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cefde4cb-7350-4664-8f73-fabfecc591eb" containerName="mariadb-account-create-update" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.633850 4878 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cefde4cb-7350-4664-8f73-fabfecc591eb" containerName="mariadb-account-create-update" Dec 02 18:37:04 crc kubenswrapper[4878]: E1202 18:37:04.633864 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80ade876-344b-415c-9609-6477205860c9" containerName="thanos-sidecar" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.633870 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="80ade876-344b-415c-9609-6477205860c9" containerName="thanos-sidecar" Dec 02 18:37:04 crc kubenswrapper[4878]: E1202 18:37:04.633883 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c2cda1-016a-4eca-9a7f-d4c2e43ccff1" containerName="mariadb-database-create" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.633891 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c2cda1-016a-4eca-9a7f-d4c2e43ccff1" containerName="mariadb-database-create" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.634143 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="56f602d3-e640-4248-b53e-201b3556aa6f" containerName="mariadb-database-create" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.634161 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="80ade876-344b-415c-9609-6477205860c9" containerName="config-reloader" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.634176 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="36c2cda1-016a-4eca-9a7f-d4c2e43ccff1" containerName="mariadb-database-create" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.634182 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="cae0102d-4eba-4ca3-915b-0156d65616fa" containerName="mariadb-account-create-update" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.634197 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="628e53e1-08c1-49e9-8fec-b6e0547c9e6d" containerName="mariadb-database-create" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 
18:37:04.634210 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="80ade876-344b-415c-9609-6477205860c9" containerName="prometheus" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.634216 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdefdd6a-c268-4bf5-bf46-55d50fe2f83d" containerName="ovn-config" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.634223 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="9669f740-c9d3-4c55-866d-a44471c3aa1c" containerName="mariadb-account-create-update" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.634231 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="80ade876-344b-415c-9609-6477205860c9" containerName="thanos-sidecar" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.634257 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="391eccae-595f-4859-b88b-933307305613" containerName="mariadb-database-create" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.634269 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="cefde4cb-7350-4664-8f73-fabfecc591eb" containerName="mariadb-account-create-update" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.634282 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="90576306-b86d-4017-84fb-19c00f9630a0" containerName="mariadb-account-create-update" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.639511 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.642126 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.642581 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.647794 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.647844 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.650216 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-zwpt2" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.650548 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.652163 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.656427 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.782458 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.782611 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/80be1dfe-9477-4d5c-9b11-2664d4300eca-config\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.782769 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80be1dfe-9477-4d5c-9b11-2664d4300eca-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.782801 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80be1dfe-9477-4d5c-9b11-2664d4300eca-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.782919 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq8v8\" (UniqueName: \"kubernetes.io/projected/80be1dfe-9477-4d5c-9b11-2664d4300eca-kube-api-access-dq8v8\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.782966 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80be1dfe-9477-4d5c-9b11-2664d4300eca-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc 
kubenswrapper[4878]: I1202 18:37:04.783009 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/80be1dfe-9477-4d5c-9b11-2664d4300eca-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.783106 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80be1dfe-9477-4d5c-9b11-2664d4300eca-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.783197 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/80be1dfe-9477-4d5c-9b11-2664d4300eca-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.783356 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80be1dfe-9477-4d5c-9b11-2664d4300eca-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.783489 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80be1dfe-9477-4d5c-9b11-2664d4300eca-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.885633 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80be1dfe-9477-4d5c-9b11-2664d4300eca-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.885724 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80be1dfe-9477-4d5c-9b11-2664d4300eca-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.885797 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq8v8\" (UniqueName: \"kubernetes.io/projected/80be1dfe-9477-4d5c-9b11-2664d4300eca-kube-api-access-dq8v8\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.885840 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/80be1dfe-9477-4d5c-9b11-2664d4300eca-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 
18:37:04.885872 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80be1dfe-9477-4d5c-9b11-2664d4300eca-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.885936 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80be1dfe-9477-4d5c-9b11-2664d4300eca-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.885991 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/80be1dfe-9477-4d5c-9b11-2664d4300eca-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.886074 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80be1dfe-9477-4d5c-9b11-2664d4300eca-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.886152 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80be1dfe-9477-4d5c-9b11-2664d4300eca-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " 
pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.886278 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.886329 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/80be1dfe-9477-4d5c-9b11-2664d4300eca-config\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.888221 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80be1dfe-9477-4d5c-9b11-2664d4300eca-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.889264 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.899582 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/80be1dfe-9477-4d5c-9b11-2664d4300eca-config\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc 
kubenswrapper[4878]: I1202 18:37:04.899637 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80be1dfe-9477-4d5c-9b11-2664d4300eca-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.899877 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80be1dfe-9477-4d5c-9b11-2664d4300eca-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.901756 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80be1dfe-9477-4d5c-9b11-2664d4300eca-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.902068 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/80be1dfe-9477-4d5c-9b11-2664d4300eca-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.902526 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/80be1dfe-9477-4d5c-9b11-2664d4300eca-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: 
\"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.902944 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80be1dfe-9477-4d5c-9b11-2664d4300eca-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.915393 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80be1dfe-9477-4d5c-9b11-2664d4300eca-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.925190 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq8v8\" (UniqueName: \"kubernetes.io/projected/80be1dfe-9477-4d5c-9b11-2664d4300eca-kube-api-access-dq8v8\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.956034 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"prometheus-metric-storage-0\" (UID: \"80be1dfe-9477-4d5c-9b11-2664d4300eca\") " pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.966900 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80ade876-344b-415c-9609-6477205860c9" path="/var/lib/kubelet/pods/80ade876-344b-415c-9609-6477205860c9/volumes" Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.968035 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ovn-controller-qgmlw-config-p57xc"] Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.968072 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-qgmlw-config-p57xc"] Dec 02 18:37:04 crc kubenswrapper[4878]: I1202 18:37:04.971943 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:05 crc kubenswrapper[4878]: I1202 18:37:05.655843 4878 scope.go:117] "RemoveContainer" containerID="cccfecfab7b741247d179bf0e549c38f82cfb21ac8a57ef168146ff87c1823b5" Dec 02 18:37:05 crc kubenswrapper[4878]: I1202 18:37:05.689475 4878 scope.go:117] "RemoveContainer" containerID="ddf160d59512ed95e7f7a3a1ff431f3e8a032dd8ea01919e40dc655a958d69be" Dec 02 18:37:05 crc kubenswrapper[4878]: I1202 18:37:05.813643 4878 scope.go:117] "RemoveContainer" containerID="74ead525d39898e37e26b20a9217ccfdad6d91900f12790e458abb17c1b52507" Dec 02 18:37:06 crc kubenswrapper[4878]: I1202 18:37:06.206902 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 18:37:06 crc kubenswrapper[4878]: I1202 18:37:06.498316 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80be1dfe-9477-4d5c-9b11-2664d4300eca","Type":"ContainerStarted","Data":"e994d4dccf052a267fdfecf968eeb926778c128987528a3d1f28a7dcf4cca1b8"} Dec 02 18:37:06 crc kubenswrapper[4878]: I1202 18:37:06.957926 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdefdd6a-c268-4bf5-bf46-55d50fe2f83d" path="/var/lib/kubelet/pods/fdefdd6a-c268-4bf5-bf46-55d50fe2f83d/volumes" Dec 02 18:37:07 crc kubenswrapper[4878]: I1202 18:37:07.516561 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-92nsw" event={"ID":"e9672009-e2c7-4540-91ea-737ef2418ac1","Type":"ContainerStarted","Data":"a17340f3089776901697d2a759a35338cb54fa2cfc6960e53b1881784621ce1c"} Dec 02 18:37:08 crc 
kubenswrapper[4878]: I1202 18:37:08.551659 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wkmpm" event={"ID":"9c0ade7f-399e-4b2f-a250-5f6a47e90baf","Type":"ContainerStarted","Data":"2c35cc5e2f37dc948eefb0fac2bc88ed1108661dde6f89516e1c5449e16ade97"} Dec 02 18:37:08 crc kubenswrapper[4878]: I1202 18:37:08.564956 4878 generic.go:334] "Generic (PLEG): container finished" podID="0e84c0f0-1e32-4c9b-b21d-f49bb06863fc" containerID="993764e9fb636f39e634f34fbbf96c46ca38657d075cf76a7f8c016de0d2e947" exitCode=0 Dec 02 18:37:08 crc kubenswrapper[4878]: I1202 18:37:08.565542 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rfcql" event={"ID":"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc","Type":"ContainerDied","Data":"993764e9fb636f39e634f34fbbf96c46ca38657d075cf76a7f8c016de0d2e947"} Dec 02 18:37:08 crc kubenswrapper[4878]: I1202 18:37:08.597677 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-wkmpm" podStartSLOduration=2.562724915 podStartE2EDuration="39.59765278s" podCreationTimestamp="2025-12-02 18:36:29 +0000 UTC" firstStartedPulling="2025-12-02 18:36:30.563463778 +0000 UTC m=+1300.253082659" lastFinishedPulling="2025-12-02 18:37:07.598391643 +0000 UTC m=+1337.288010524" observedRunningTime="2025-12-02 18:37:08.58348137 +0000 UTC m=+1338.273100251" watchObservedRunningTime="2025-12-02 18:37:08.59765278 +0000 UTC m=+1338.287271651" Dec 02 18:37:08 crc kubenswrapper[4878]: I1202 18:37:08.625806 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-92nsw" podStartSLOduration=18.626281621 podStartE2EDuration="26.625782883s" podCreationTimestamp="2025-12-02 18:36:42 +0000 UTC" firstStartedPulling="2025-12-02 18:36:55.193170566 +0000 UTC m=+1324.882789447" lastFinishedPulling="2025-12-02 18:37:03.192671838 +0000 UTC m=+1332.882290709" observedRunningTime="2025-12-02 18:37:08.621727326 +0000 UTC 
m=+1338.311346207" watchObservedRunningTime="2025-12-02 18:37:08.625782883 +0000 UTC m=+1338.315401754" Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.038673 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.163476 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-scripts\") pod \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.163536 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbvbf\" (UniqueName: \"kubernetes.io/projected/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-kube-api-access-nbvbf\") pod \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.163559 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-etc-swift\") pod \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.163664 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-dispersionconf\") pod \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.163754 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-ring-data-devices\") pod 
\"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.163787 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-swiftconf\") pod \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.163814 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-combined-ca-bundle\") pod \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\" (UID: \"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc\") " Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.165191 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0e84c0f0-1e32-4c9b-b21d-f49bb06863fc" (UID: "0e84c0f0-1e32-4c9b-b21d-f49bb06863fc"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.165524 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0e84c0f0-1e32-4c9b-b21d-f49bb06863fc" (UID: "0e84c0f0-1e32-4c9b-b21d-f49bb06863fc"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.169977 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-kube-api-access-nbvbf" (OuterVolumeSpecName: "kube-api-access-nbvbf") pod "0e84c0f0-1e32-4c9b-b21d-f49bb06863fc" (UID: "0e84c0f0-1e32-4c9b-b21d-f49bb06863fc"). InnerVolumeSpecName "kube-api-access-nbvbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.174901 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0e84c0f0-1e32-4c9b-b21d-f49bb06863fc" (UID: "0e84c0f0-1e32-4c9b-b21d-f49bb06863fc"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.215999 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-scripts" (OuterVolumeSpecName: "scripts") pod "0e84c0f0-1e32-4c9b-b21d-f49bb06863fc" (UID: "0e84c0f0-1e32-4c9b-b21d-f49bb06863fc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.216725 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0e84c0f0-1e32-4c9b-b21d-f49bb06863fc" (UID: "0e84c0f0-1e32-4c9b-b21d-f49bb06863fc"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.246405 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e84c0f0-1e32-4c9b-b21d-f49bb06863fc" (UID: "0e84c0f0-1e32-4c9b-b21d-f49bb06863fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.268033 4878 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.268185 4878 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.268289 4878 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.268366 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.268442 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.268523 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbvbf\" (UniqueName: 
\"kubernetes.io/projected/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-kube-api-access-nbvbf\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.268599 4878 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0e84c0f0-1e32-4c9b-b21d-f49bb06863fc-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.590377 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rfcql" event={"ID":"0e84c0f0-1e32-4c9b-b21d-f49bb06863fc","Type":"ContainerDied","Data":"3a8a8fe6ebdd838ec2dc279a23bc3a5c2cb95eac276ded8b6de1b92d3cce2c17"} Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.590872 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a8a8fe6ebdd838ec2dc279a23bc3a5c2cb95eac276ded8b6de1b92d3cce2c17" Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.590395 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-rfcql" Dec 02 18:37:10 crc kubenswrapper[4878]: I1202 18:37:10.592894 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80be1dfe-9477-4d5c-9b11-2664d4300eca","Type":"ContainerStarted","Data":"760a6d4bab1115e1e6014d030b304a1edb7366161ac331026a9c0bc9c0bcf7d5"} Dec 02 18:37:12 crc kubenswrapper[4878]: I1202 18:37:12.650976 4878 generic.go:334] "Generic (PLEG): container finished" podID="e9672009-e2c7-4540-91ea-737ef2418ac1" containerID="a17340f3089776901697d2a759a35338cb54fa2cfc6960e53b1881784621ce1c" exitCode=0 Dec 02 18:37:12 crc kubenswrapper[4878]: I1202 18:37:12.651067 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-92nsw" event={"ID":"e9672009-e2c7-4540-91ea-737ef2418ac1","Type":"ContainerDied","Data":"a17340f3089776901697d2a759a35338cb54fa2cfc6960e53b1881784621ce1c"} Dec 02 18:37:14 crc kubenswrapper[4878]: I1202 18:37:14.040884 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-92nsw" Dec 02 18:37:14 crc kubenswrapper[4878]: I1202 18:37:14.163491 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9672009-e2c7-4540-91ea-737ef2418ac1-config-data\") pod \"e9672009-e2c7-4540-91ea-737ef2418ac1\" (UID: \"e9672009-e2c7-4540-91ea-737ef2418ac1\") " Dec 02 18:37:14 crc kubenswrapper[4878]: I1202 18:37:14.163730 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltqn5\" (UniqueName: \"kubernetes.io/projected/e9672009-e2c7-4540-91ea-737ef2418ac1-kube-api-access-ltqn5\") pod \"e9672009-e2c7-4540-91ea-737ef2418ac1\" (UID: \"e9672009-e2c7-4540-91ea-737ef2418ac1\") " Dec 02 18:37:14 crc kubenswrapper[4878]: I1202 18:37:14.163940 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9672009-e2c7-4540-91ea-737ef2418ac1-combined-ca-bundle\") pod \"e9672009-e2c7-4540-91ea-737ef2418ac1\" (UID: \"e9672009-e2c7-4540-91ea-737ef2418ac1\") " Dec 02 18:37:14 crc kubenswrapper[4878]: I1202 18:37:14.170855 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9672009-e2c7-4540-91ea-737ef2418ac1-kube-api-access-ltqn5" (OuterVolumeSpecName: "kube-api-access-ltqn5") pod "e9672009-e2c7-4540-91ea-737ef2418ac1" (UID: "e9672009-e2c7-4540-91ea-737ef2418ac1"). InnerVolumeSpecName "kube-api-access-ltqn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:37:14 crc kubenswrapper[4878]: I1202 18:37:14.195639 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9672009-e2c7-4540-91ea-737ef2418ac1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9672009-e2c7-4540-91ea-737ef2418ac1" (UID: "e9672009-e2c7-4540-91ea-737ef2418ac1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:14 crc kubenswrapper[4878]: I1202 18:37:14.224427 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9672009-e2c7-4540-91ea-737ef2418ac1-config-data" (OuterVolumeSpecName: "config-data") pod "e9672009-e2c7-4540-91ea-737ef2418ac1" (UID: "e9672009-e2c7-4540-91ea-737ef2418ac1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:14 crc kubenswrapper[4878]: I1202 18:37:14.267594 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltqn5\" (UniqueName: \"kubernetes.io/projected/e9672009-e2c7-4540-91ea-737ef2418ac1-kube-api-access-ltqn5\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:14 crc kubenswrapper[4878]: I1202 18:37:14.267631 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9672009-e2c7-4540-91ea-737ef2418ac1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:14 crc kubenswrapper[4878]: I1202 18:37:14.267640 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9672009-e2c7-4540-91ea-737ef2418ac1-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:14 crc kubenswrapper[4878]: I1202 18:37:14.676349 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-92nsw" event={"ID":"e9672009-e2c7-4540-91ea-737ef2418ac1","Type":"ContainerDied","Data":"d0858b5e73ee958dda067775b4d1844abadc11fd3f7dad8630f76cb6b1d3efc2"} Dec 02 18:37:14 crc kubenswrapper[4878]: I1202 18:37:14.676397 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0858b5e73ee958dda067775b4d1844abadc11fd3f7dad8630f76cb6b1d3efc2" Dec 02 18:37:14 crc kubenswrapper[4878]: I1202 18:37:14.676447 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-92nsw" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.022496 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-6wfdt"] Dec 02 18:37:15 crc kubenswrapper[4878]: E1202 18:37:15.023147 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9672009-e2c7-4540-91ea-737ef2418ac1" containerName="keystone-db-sync" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.023168 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9672009-e2c7-4540-91ea-737ef2418ac1" containerName="keystone-db-sync" Dec 02 18:37:15 crc kubenswrapper[4878]: E1202 18:37:15.023205 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e84c0f0-1e32-4c9b-b21d-f49bb06863fc" containerName="swift-ring-rebalance" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.023215 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e84c0f0-1e32-4c9b-b21d-f49bb06863fc" containerName="swift-ring-rebalance" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.023468 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9672009-e2c7-4540-91ea-737ef2418ac1" containerName="keystone-db-sync" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.023496 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e84c0f0-1e32-4c9b-b21d-f49bb06863fc" containerName="swift-ring-rebalance" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.024842 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.053354 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-6wfdt"] Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.085896 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rmddm"] Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.088212 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.129344 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.129788 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.129934 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.130099 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jjqdr" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.134509 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.197628 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rmddm"] Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.337173 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-scripts\") pod \"keystone-bootstrap-rmddm\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.337373 4878 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtp94\" (UniqueName: \"kubernetes.io/projected/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-kube-api-access-vtp94\") pod \"keystone-bootstrap-rmddm\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.337465 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-config\") pod \"dnsmasq-dns-f877ddd87-6wfdt\" (UID: \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\") " pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.337510 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-combined-ca-bundle\") pod \"keystone-bootstrap-rmddm\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.337573 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-6wfdt\" (UID: \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\") " pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.337602 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-dns-svc\") pod \"dnsmasq-dns-f877ddd87-6wfdt\" (UID: \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\") " pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 
18:37:15.337679 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-6wfdt\" (UID: \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\") " pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.337755 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szcbt\" (UniqueName: \"kubernetes.io/projected/27c73cdb-bd06-4b13-afe5-c486618cb0ec-kube-api-access-szcbt\") pod \"dnsmasq-dns-f877ddd87-6wfdt\" (UID: \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\") " pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.337826 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-config-data\") pod \"keystone-bootstrap-rmddm\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.337862 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-fernet-keys\") pod \"keystone-bootstrap-rmddm\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.338151 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-credential-keys\") pod \"keystone-bootstrap-rmddm\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:15 crc 
kubenswrapper[4878]: I1202 18:37:15.357304 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-xp7h4"] Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.362649 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-xp7h4" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.379476 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-v925n" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.379816 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.406979 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-xp7h4"] Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.447671 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-6wfdt\" (UID: \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\") " pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.447727 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-dns-svc\") pod \"dnsmasq-dns-f877ddd87-6wfdt\" (UID: \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\") " pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.447809 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-6wfdt\" (UID: \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\") " pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.447856 
4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szcbt\" (UniqueName: \"kubernetes.io/projected/27c73cdb-bd06-4b13-afe5-c486618cb0ec-kube-api-access-szcbt\") pod \"dnsmasq-dns-f877ddd87-6wfdt\" (UID: \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\") " pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.447885 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-config-data\") pod \"keystone-bootstrap-rmddm\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.447910 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-fernet-keys\") pod \"keystone-bootstrap-rmddm\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.447996 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-credential-keys\") pod \"keystone-bootstrap-rmddm\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.448047 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-scripts\") pod \"keystone-bootstrap-rmddm\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.448088 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtp94\" 
(UniqueName: \"kubernetes.io/projected/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-kube-api-access-vtp94\") pod \"keystone-bootstrap-rmddm\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.448127 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-config\") pod \"dnsmasq-dns-f877ddd87-6wfdt\" (UID: \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\") " pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.448155 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-combined-ca-bundle\") pod \"keystone-bootstrap-rmddm\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.448659 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-6wfdt\" (UID: \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\") " pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.448727 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-dns-svc\") pod \"dnsmasq-dns-f877ddd87-6wfdt\" (UID: \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\") " pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.450907 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-ovsdbserver-sb\") pod 
\"dnsmasq-dns-f877ddd87-6wfdt\" (UID: \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\") " pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.450958 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-4czgq"] Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.451839 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-config\") pod \"dnsmasq-dns-f877ddd87-6wfdt\" (UID: \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\") " pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.452542 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4czgq" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.463593 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-scripts\") pod \"keystone-bootstrap-rmddm\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.475500 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-dx549" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.475768 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.475949 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.477590 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4czgq"] Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.481224 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-credential-keys\") pod \"keystone-bootstrap-rmddm\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.484094 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-config-data\") pod \"keystone-bootstrap-rmddm\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.484344 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-combined-ca-bundle\") pod \"keystone-bootstrap-rmddm\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.489318 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szcbt\" (UniqueName: \"kubernetes.io/projected/27c73cdb-bd06-4b13-afe5-c486618cb0ec-kube-api-access-szcbt\") pod \"dnsmasq-dns-f877ddd87-6wfdt\" (UID: \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\") " pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.507364 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-fernet-keys\") pod \"keystone-bootstrap-rmddm\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.511414 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtp94\" (UniqueName: \"kubernetes.io/projected/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-kube-api-access-vtp94\") pod 
\"keystone-bootstrap-rmddm\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.512992 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-6t8ds"] Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.539921 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6t8ds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.544205 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-52ghn" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.553529 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.554133 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/727a4f99-6a27-4d95-a73f-92e9fb4b0500-combined-ca-bundle\") pod \"neutron-db-sync-4czgq\" (UID: \"727a4f99-6a27-4d95-a73f-92e9fb4b0500\") " pod="openstack/neutron-db-sync-4czgq" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.554271 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d68c\" (UniqueName: \"kubernetes.io/projected/faf8fbcd-b97a-45f4-8b17-c92fdc87d75b-kube-api-access-2d68c\") pod \"heat-db-sync-xp7h4\" (UID: \"faf8fbcd-b97a-45f4-8b17-c92fdc87d75b\") " pod="openstack/heat-db-sync-xp7h4" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.554412 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf8fbcd-b97a-45f4-8b17-c92fdc87d75b-config-data\") pod \"heat-db-sync-xp7h4\" (UID: \"faf8fbcd-b97a-45f4-8b17-c92fdc87d75b\") " pod="openstack/heat-db-sync-xp7h4" Dec 02 18:37:15 
crc kubenswrapper[4878]: I1202 18:37:15.554593 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf8fbcd-b97a-45f4-8b17-c92fdc87d75b-combined-ca-bundle\") pod \"heat-db-sync-xp7h4\" (UID: \"faf8fbcd-b97a-45f4-8b17-c92fdc87d75b\") " pod="openstack/heat-db-sync-xp7h4" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.554682 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4d8g\" (UniqueName: \"kubernetes.io/projected/727a4f99-6a27-4d95-a73f-92e9fb4b0500-kube-api-access-n4d8g\") pod \"neutron-db-sync-4czgq\" (UID: \"727a4f99-6a27-4d95-a73f-92e9fb4b0500\") " pod="openstack/neutron-db-sync-4czgq" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.554838 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/727a4f99-6a27-4d95-a73f-92e9fb4b0500-config\") pod \"neutron-db-sync-4czgq\" (UID: \"727a4f99-6a27-4d95-a73f-92e9fb4b0500\") " pod="openstack/neutron-db-sync-4czgq" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.555123 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.582047 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-h6lds"] Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.587203 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-h6lds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.591562 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4cbkh" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.597493 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.624772 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-6wfdt"] Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.637154 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.644668 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-6t8ds"] Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.656899 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf8fbcd-b97a-45f4-8b17-c92fdc87d75b-config-data\") pod \"heat-db-sync-xp7h4\" (UID: \"faf8fbcd-b97a-45f4-8b17-c92fdc87d75b\") " pod="openstack/heat-db-sync-xp7h4" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.655462 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-h6lds"] Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.657027 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf8fbcd-b97a-45f4-8b17-c92fdc87d75b-combined-ca-bundle\") pod \"heat-db-sync-xp7h4\" (UID: \"faf8fbcd-b97a-45f4-8b17-c92fdc87d75b\") " pod="openstack/heat-db-sync-xp7h4" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.657086 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4d8g\" (UniqueName: 
\"kubernetes.io/projected/727a4f99-6a27-4d95-a73f-92e9fb4b0500-kube-api-access-n4d8g\") pod \"neutron-db-sync-4czgq\" (UID: \"727a4f99-6a27-4d95-a73f-92e9fb4b0500\") " pod="openstack/neutron-db-sync-4czgq" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.657122 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c702f83c-0fa3-4ba7-b525-5eddf84355a8-combined-ca-bundle\") pod \"placement-db-sync-6t8ds\" (UID: \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\") " pod="openstack/placement-db-sync-6t8ds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.657197 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c702f83c-0fa3-4ba7-b525-5eddf84355a8-config-data\") pod \"placement-db-sync-6t8ds\" (UID: \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\") " pod="openstack/placement-db-sync-6t8ds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.657229 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/727a4f99-6a27-4d95-a73f-92e9fb4b0500-config\") pod \"neutron-db-sync-4czgq\" (UID: \"727a4f99-6a27-4d95-a73f-92e9fb4b0500\") " pod="openstack/neutron-db-sync-4czgq" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.657309 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c702f83c-0fa3-4ba7-b525-5eddf84355a8-scripts\") pod \"placement-db-sync-6t8ds\" (UID: \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\") " pod="openstack/placement-db-sync-6t8ds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.657347 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/727a4f99-6a27-4d95-a73f-92e9fb4b0500-combined-ca-bundle\") 
pod \"neutron-db-sync-4czgq\" (UID: \"727a4f99-6a27-4d95-a73f-92e9fb4b0500\") " pod="openstack/neutron-db-sync-4czgq" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.657370 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75hj8\" (UniqueName: \"kubernetes.io/projected/c702f83c-0fa3-4ba7-b525-5eddf84355a8-kube-api-access-75hj8\") pod \"placement-db-sync-6t8ds\" (UID: \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\") " pod="openstack/placement-db-sync-6t8ds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.657391 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c702f83c-0fa3-4ba7-b525-5eddf84355a8-logs\") pod \"placement-db-sync-6t8ds\" (UID: \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\") " pod="openstack/placement-db-sync-6t8ds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.657416 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d68c\" (UniqueName: \"kubernetes.io/projected/faf8fbcd-b97a-45f4-8b17-c92fdc87d75b-kube-api-access-2d68c\") pod \"heat-db-sync-xp7h4\" (UID: \"faf8fbcd-b97a-45f4-8b17-c92fdc87d75b\") " pod="openstack/heat-db-sync-xp7h4" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.663925 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf8fbcd-b97a-45f4-8b17-c92fdc87d75b-config-data\") pod \"heat-db-sync-xp7h4\" (UID: \"faf8fbcd-b97a-45f4-8b17-c92fdc87d75b\") " pod="openstack/heat-db-sync-xp7h4" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.670038 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-whmvk"] Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.671952 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.681579 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf8fbcd-b97a-45f4-8b17-c92fdc87d75b-combined-ca-bundle\") pod \"heat-db-sync-xp7h4\" (UID: \"faf8fbcd-b97a-45f4-8b17-c92fdc87d75b\") " pod="openstack/heat-db-sync-xp7h4" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.685519 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/727a4f99-6a27-4d95-a73f-92e9fb4b0500-combined-ca-bundle\") pod \"neutron-db-sync-4czgq\" (UID: \"727a4f99-6a27-4d95-a73f-92e9fb4b0500\") " pod="openstack/neutron-db-sync-4czgq" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.686329 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d68c\" (UniqueName: \"kubernetes.io/projected/faf8fbcd-b97a-45f4-8b17-c92fdc87d75b-kube-api-access-2d68c\") pod \"heat-db-sync-xp7h4\" (UID: \"faf8fbcd-b97a-45f4-8b17-c92fdc87d75b\") " pod="openstack/heat-db-sync-xp7h4" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.715515 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4d8g\" (UniqueName: \"kubernetes.io/projected/727a4f99-6a27-4d95-a73f-92e9fb4b0500-kube-api-access-n4d8g\") pod \"neutron-db-sync-4czgq\" (UID: \"727a4f99-6a27-4d95-a73f-92e9fb4b0500\") " pod="openstack/neutron-db-sync-4czgq" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.716758 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-xp7h4" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.717326 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/727a4f99-6a27-4d95-a73f-92e9fb4b0500-config\") pod \"neutron-db-sync-4czgq\" (UID: \"727a4f99-6a27-4d95-a73f-92e9fb4b0500\") " pod="openstack/neutron-db-sync-4czgq" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.724301 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-6qqjx"] Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.725785 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.763145 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c702f83c-0fa3-4ba7-b525-5eddf84355a8-combined-ca-bundle\") pod \"placement-db-sync-6t8ds\" (UID: \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\") " pod="openstack/placement-db-sync-6t8ds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.763331 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c702f83c-0fa3-4ba7-b525-5eddf84355a8-config-data\") pod \"placement-db-sync-6t8ds\" (UID: \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\") " pod="openstack/placement-db-sync-6t8ds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.763440 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0bdf1c8-9578-4295-a835-432049516d07-combined-ca-bundle\") pod \"barbican-db-sync-h6lds\" (UID: \"d0bdf1c8-9578-4295-a835-432049516d07\") " pod="openstack/barbican-db-sync-h6lds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.763514 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c702f83c-0fa3-4ba7-b525-5eddf84355a8-scripts\") pod \"placement-db-sync-6t8ds\" (UID: \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\") " pod="openstack/placement-db-sync-6t8ds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.763588 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-487jg\" (UniqueName: \"kubernetes.io/projected/d0bdf1c8-9578-4295-a835-432049516d07-kube-api-access-487jg\") pod \"barbican-db-sync-h6lds\" (UID: \"d0bdf1c8-9578-4295-a835-432049516d07\") " pod="openstack/barbican-db-sync-h6lds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.763630 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75hj8\" (UniqueName: \"kubernetes.io/projected/c702f83c-0fa3-4ba7-b525-5eddf84355a8-kube-api-access-75hj8\") pod \"placement-db-sync-6t8ds\" (UID: \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\") " pod="openstack/placement-db-sync-6t8ds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.763670 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c702f83c-0fa3-4ba7-b525-5eddf84355a8-logs\") pod \"placement-db-sync-6t8ds\" (UID: \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\") " pod="openstack/placement-db-sync-6t8ds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.766380 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d0bdf1c8-9578-4295-a835-432049516d07-db-sync-config-data\") pod \"barbican-db-sync-h6lds\" (UID: \"d0bdf1c8-9578-4295-a835-432049516d07\") " pod="openstack/barbican-db-sync-h6lds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.767057 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c702f83c-0fa3-4ba7-b525-5eddf84355a8-logs\") pod \"placement-db-sync-6t8ds\" (UID: \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\") " pod="openstack/placement-db-sync-6t8ds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.768453 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.768677 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tdkk4" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.768936 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.772821 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c702f83c-0fa3-4ba7-b525-5eddf84355a8-combined-ca-bundle\") pod \"placement-db-sync-6t8ds\" (UID: \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\") " pod="openstack/placement-db-sync-6t8ds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.774421 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-whmvk"] Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.776300 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.780142 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c702f83c-0fa3-4ba7-b525-5eddf84355a8-scripts\") pod \"placement-db-sync-6t8ds\" (UID: \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\") " pod="openstack/placement-db-sync-6t8ds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.788471 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c702f83c-0fa3-4ba7-b525-5eddf84355a8-config-data\") pod \"placement-db-sync-6t8ds\" (UID: \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\") " pod="openstack/placement-db-sync-6t8ds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.801857 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6qqjx"] Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.869265 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-config\") pod \"dnsmasq-dns-68dcc9cf6f-whmvk\" (UID: \"55303a34-d76e-40fa-ba51-676df9fb7104\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.869487 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj8s8\" (UniqueName: \"kubernetes.io/projected/d7252936-ed87-47f7-b392-7d8fe8388279-kube-api-access-qj8s8\") pod \"cinder-db-sync-6qqjx\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.869550 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-db-sync-config-data\") pod \"cinder-db-sync-6qqjx\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.869757 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7252936-ed87-47f7-b392-7d8fe8388279-etc-machine-id\") pod \"cinder-db-sync-6qqjx\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.869796 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-scripts\") pod \"cinder-db-sync-6qqjx\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.869826 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d0bdf1c8-9578-4295-a835-432049516d07-db-sync-config-data\") pod \"barbican-db-sync-h6lds\" (UID: \"d0bdf1c8-9578-4295-a835-432049516d07\") " pod="openstack/barbican-db-sync-h6lds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.869871 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-whmvk\" (UID: \"55303a34-d76e-40fa-ba51-676df9fb7104\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.869912 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-config-data\") pod \"cinder-db-sync-6qqjx\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.869946 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zsnw\" (UniqueName: \"kubernetes.io/projected/55303a34-d76e-40fa-ba51-676df9fb7104-kube-api-access-7zsnw\") pod \"dnsmasq-dns-68dcc9cf6f-whmvk\" (UID: \"55303a34-d76e-40fa-ba51-676df9fb7104\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.869973 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-whmvk\" (UID: \"55303a34-d76e-40fa-ba51-676df9fb7104\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.870010 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0bdf1c8-9578-4295-a835-432049516d07-combined-ca-bundle\") pod \"barbican-db-sync-h6lds\" (UID: \"d0bdf1c8-9578-4295-a835-432049516d07\") " pod="openstack/barbican-db-sync-h6lds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.870042 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-combined-ca-bundle\") pod \"cinder-db-sync-6qqjx\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.870065 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-whmvk\" (UID: \"55303a34-d76e-40fa-ba51-676df9fb7104\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.870082 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-487jg\" (UniqueName: \"kubernetes.io/projected/d0bdf1c8-9578-4295-a835-432049516d07-kube-api-access-487jg\") pod \"barbican-db-sync-h6lds\" (UID: \"d0bdf1c8-9578-4295-a835-432049516d07\") " pod="openstack/barbican-db-sync-h6lds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.871002 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75hj8\" (UniqueName: \"kubernetes.io/projected/c702f83c-0fa3-4ba7-b525-5eddf84355a8-kube-api-access-75hj8\") pod \"placement-db-sync-6t8ds\" (UID: \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\") " pod="openstack/placement-db-sync-6t8ds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.894260 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0bdf1c8-9578-4295-a835-432049516d07-combined-ca-bundle\") pod \"barbican-db-sync-h6lds\" (UID: \"d0bdf1c8-9578-4295-a835-432049516d07\") " pod="openstack/barbican-db-sync-h6lds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.897372 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d0bdf1c8-9578-4295-a835-432049516d07-db-sync-config-data\") pod \"barbican-db-sync-h6lds\" (UID: \"d0bdf1c8-9578-4295-a835-432049516d07\") " pod="openstack/barbican-db-sync-h6lds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.904856 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.911055 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.912299 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4czgq" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.916039 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.916405 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.939805 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6t8ds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.959378 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-487jg\" (UniqueName: \"kubernetes.io/projected/d0bdf1c8-9578-4295-a835-432049516d07-kube-api-access-487jg\") pod \"barbican-db-sync-h6lds\" (UID: \"d0bdf1c8-9578-4295-a835-432049516d07\") " pod="openstack/barbican-db-sync-h6lds" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.980628 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-combined-ca-bundle\") pod \"cinder-db-sync-6qqjx\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.980700 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-whmvk\" (UID: \"55303a34-d76e-40fa-ba51-676df9fb7104\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.980760 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-config\") pod \"dnsmasq-dns-68dcc9cf6f-whmvk\" (UID: \"55303a34-d76e-40fa-ba51-676df9fb7104\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.980798 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj8s8\" (UniqueName: \"kubernetes.io/projected/d7252936-ed87-47f7-b392-7d8fe8388279-kube-api-access-qj8s8\") pod \"cinder-db-sync-6qqjx\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.980869 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-db-sync-config-data\") pod \"cinder-db-sync-6qqjx\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.980949 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7252936-ed87-47f7-b392-7d8fe8388279-etc-machine-id\") pod \"cinder-db-sync-6qqjx\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.981011 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-scripts\") pod \"cinder-db-sync-6qqjx\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.981089 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-whmvk\" (UID: \"55303a34-d76e-40fa-ba51-676df9fb7104\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.981260 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-config-data\") pod \"cinder-db-sync-6qqjx\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.981297 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zsnw\" (UniqueName: \"kubernetes.io/projected/55303a34-d76e-40fa-ba51-676df9fb7104-kube-api-access-7zsnw\") pod \"dnsmasq-dns-68dcc9cf6f-whmvk\" (UID: \"55303a34-d76e-40fa-ba51-676df9fb7104\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.981340 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-whmvk\" (UID: \"55303a34-d76e-40fa-ba51-676df9fb7104\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.982694 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-whmvk\" (UID: \"55303a34-d76e-40fa-ba51-676df9fb7104\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.982937 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7252936-ed87-47f7-b392-7d8fe8388279-etc-machine-id\") pod 
\"cinder-db-sync-6qqjx\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.984127 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-whmvk\" (UID: \"55303a34-d76e-40fa-ba51-676df9fb7104\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.984403 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-config\") pod \"dnsmasq-dns-68dcc9cf6f-whmvk\" (UID: \"55303a34-d76e-40fa-ba51-676df9fb7104\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.985012 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-whmvk\" (UID: \"55303a34-d76e-40fa-ba51-676df9fb7104\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.988547 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-scripts\") pod \"cinder-db-sync-6qqjx\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.992125 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:37:15 crc kubenswrapper[4878]: I1202 18:37:15.992610 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-config-data\") pod \"cinder-db-sync-6qqjx\" (UID: 
\"d7252936-ed87-47f7-b392-7d8fe8388279\") " pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.007007 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-combined-ca-bundle\") pod \"cinder-db-sync-6qqjx\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.008645 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-db-sync-config-data\") pod \"cinder-db-sync-6qqjx\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.027262 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj8s8\" (UniqueName: \"kubernetes.io/projected/d7252936-ed87-47f7-b392-7d8fe8388279-kube-api-access-qj8s8\") pod \"cinder-db-sync-6qqjx\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.029850 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zsnw\" (UniqueName: \"kubernetes.io/projected/55303a34-d76e-40fa-ba51-676df9fb7104-kube-api-access-7zsnw\") pod \"dnsmasq-dns-68dcc9cf6f-whmvk\" (UID: \"55303a34-d76e-40fa-ba51-676df9fb7104\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.084078 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-scripts\") pod \"ceilometer-0\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: 
I1202 18:37:16.084177 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-config-data\") pod \"ceilometer-0\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.087650 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.087999 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7df302e4-7d89-4c00-b517-d4dce032ad3d-run-httpd\") pod \"ceilometer-0\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.088369 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7df302e4-7d89-4c00-b517-d4dce032ad3d-log-httpd\") pod \"ceilometer-0\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.088425 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gklnd\" (UniqueName: \"kubernetes.io/projected/7df302e4-7d89-4c00-b517-d4dce032ad3d-kube-api-access-gklnd\") pod \"ceilometer-0\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.088480 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.191755 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7df302e4-7d89-4c00-b517-d4dce032ad3d-log-httpd\") pod \"ceilometer-0\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.191804 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gklnd\" (UniqueName: \"kubernetes.io/projected/7df302e4-7d89-4c00-b517-d4dce032ad3d-kube-api-access-gklnd\") pod \"ceilometer-0\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.191839 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.191872 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-scripts\") pod \"ceilometer-0\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.191907 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-config-data\") pod \"ceilometer-0\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 
crc kubenswrapper[4878]: I1202 18:37:16.191972 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.191999 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7df302e4-7d89-4c00-b517-d4dce032ad3d-run-httpd\") pod \"ceilometer-0\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.192677 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7df302e4-7d89-4c00-b517-d4dce032ad3d-run-httpd\") pod \"ceilometer-0\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.192907 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7df302e4-7d89-4c00-b517-d4dce032ad3d-log-httpd\") pod \"ceilometer-0\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.215499 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-config-data\") pod \"ceilometer-0\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.220320 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gklnd\" (UniqueName: \"kubernetes.io/projected/7df302e4-7d89-4c00-b517-d4dce032ad3d-kube-api-access-gklnd\") pod \"ceilometer-0\" 
(UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.235460 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.243701 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.244430 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-h6lds" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.249490 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.267172 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-scripts\") pod \"ceilometer-0\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.303319 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.320935 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.461596 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-6wfdt"] Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.819642 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rmddm"] Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.821494 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" event={"ID":"27c73cdb-bd06-4b13-afe5-c486618cb0ec","Type":"ContainerStarted","Data":"85fc8c59e31382d9d10b07010246ccf36d2571b831a8a6dc58e6200047403187"} Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.859448 4878 generic.go:334] "Generic (PLEG): container finished" podID="80be1dfe-9477-4d5c-9b11-2664d4300eca" containerID="760a6d4bab1115e1e6014d030b304a1edb7366161ac331026a9c0bc9c0bcf7d5" exitCode=0 Dec 02 18:37:16 crc kubenswrapper[4878]: I1202 18:37:16.859544 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80be1dfe-9477-4d5c-9b11-2664d4300eca","Type":"ContainerDied","Data":"760a6d4bab1115e1e6014d030b304a1edb7366161ac331026a9c0bc9c0bcf7d5"} Dec 02 18:37:17 crc kubenswrapper[4878]: E1202 18:37:17.155174 4878 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80be1dfe_9477_4d5c_9b11_2664d4300eca.slice/crio-conmon-760a6d4bab1115e1e6014d030b304a1edb7366161ac331026a9c0bc9c0bcf7d5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80be1dfe_9477_4d5c_9b11_2664d4300eca.slice/crio-760a6d4bab1115e1e6014d030b304a1edb7366161ac331026a9c0bc9c0bcf7d5.scope\": RecentStats: unable to find data in memory cache]" Dec 02 18:37:17 crc kubenswrapper[4878]: I1202 18:37:17.202360 4878 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4czgq"] Dec 02 18:37:17 crc kubenswrapper[4878]: I1202 18:37:17.218067 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-xp7h4"] Dec 02 18:37:17 crc kubenswrapper[4878]: W1202 18:37:17.424908 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc702f83c_0fa3_4ba7_b525_5eddf84355a8.slice/crio-6909d837f3c1286ad4dec02a22c83cfe379ca9fc7ef454fa8abc821168da2e20 WatchSource:0}: Error finding container 6909d837f3c1286ad4dec02a22c83cfe379ca9fc7ef454fa8abc821168da2e20: Status 404 returned error can't find the container with id 6909d837f3c1286ad4dec02a22c83cfe379ca9fc7ef454fa8abc821168da2e20 Dec 02 18:37:17 crc kubenswrapper[4878]: I1202 18:37:17.430248 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-6t8ds"] Dec 02 18:37:17 crc kubenswrapper[4878]: I1202 18:37:17.797724 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:37:17 crc kubenswrapper[4878]: W1202 18:37:17.820135 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7df302e4_7d89_4c00_b517_d4dce032ad3d.slice/crio-a3f3846b2d02998ab6a2221ad9cc0e2923ae9144ccb90a65819494e0c3548ff5 WatchSource:0}: Error finding container a3f3846b2d02998ab6a2221ad9cc0e2923ae9144ccb90a65819494e0c3548ff5: Status 404 returned error can't find the container with id a3f3846b2d02998ab6a2221ad9cc0e2923ae9144ccb90a65819494e0c3548ff5 Dec 02 18:37:17 crc kubenswrapper[4878]: W1202 18:37:17.840319 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7252936_ed87_47f7_b392_7d8fe8388279.slice/crio-9a58340230eea76d23d52a489584d0ccb67c644ac84b3e80ddc20343bf8a6867 WatchSource:0}: Error finding container 
9a58340230eea76d23d52a489584d0ccb67c644ac84b3e80ddc20343bf8a6867: Status 404 returned error can't find the container with id 9a58340230eea76d23d52a489584d0ccb67c644ac84b3e80ddc20343bf8a6867 Dec 02 18:37:17 crc kubenswrapper[4878]: I1202 18:37:17.868847 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-h6lds"] Dec 02 18:37:17 crc kubenswrapper[4878]: W1202 18:37:17.888779 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55303a34_d76e_40fa_ba51_676df9fb7104.slice/crio-e91b26d9c5b7e8209fe04b079c9301eda88e7576fb15314d893a028b1849f859 WatchSource:0}: Error finding container e91b26d9c5b7e8209fe04b079c9301eda88e7576fb15314d893a028b1849f859: Status 404 returned error can't find the container with id e91b26d9c5b7e8209fe04b079c9301eda88e7576fb15314d893a028b1849f859 Dec 02 18:37:17 crc kubenswrapper[4878]: I1202 18:37:17.891139 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6qqjx"] Dec 02 18:37:17 crc kubenswrapper[4878]: I1202 18:37:17.910113 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-h6lds" event={"ID":"d0bdf1c8-9578-4295-a835-432049516d07","Type":"ContainerStarted","Data":"159b5b8146e0791c32c3f5feb11ee158eb4cb2f250ea5d55123b2da8e28c9f4e"} Dec 02 18:37:17 crc kubenswrapper[4878]: I1202 18:37:17.912118 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-whmvk"] Dec 02 18:37:17 crc kubenswrapper[4878]: I1202 18:37:17.915584 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7df302e4-7d89-4c00-b517-d4dce032ad3d","Type":"ContainerStarted","Data":"a3f3846b2d02998ab6a2221ad9cc0e2923ae9144ccb90a65819494e0c3548ff5"} Dec 02 18:37:17 crc kubenswrapper[4878]: I1202 18:37:17.951807 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"80be1dfe-9477-4d5c-9b11-2664d4300eca","Type":"ContainerStarted","Data":"b0a38c66902470520124427b0be13b0dc7a09e8f5ef68a8ee7d5ecd1a8e24218"} Dec 02 18:37:17 crc kubenswrapper[4878]: I1202 18:37:17.970663 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rmddm" event={"ID":"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b","Type":"ContainerStarted","Data":"2d8668b3bbd2fa38c99cd63d64d7d8f0f8e78babae8c5fa00092354dc02cb0ef"} Dec 02 18:37:17 crc kubenswrapper[4878]: I1202 18:37:17.970723 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rmddm" event={"ID":"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b","Type":"ContainerStarted","Data":"7408b6f703622412407db312a18bfcc555dd6c4e60a78f568cdabfa643f59842"} Dec 02 18:37:17 crc kubenswrapper[4878]: I1202 18:37:17.973907 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6qqjx" event={"ID":"d7252936-ed87-47f7-b392-7d8fe8388279","Type":"ContainerStarted","Data":"9a58340230eea76d23d52a489584d0ccb67c644ac84b3e80ddc20343bf8a6867"} Dec 02 18:37:18 crc kubenswrapper[4878]: I1202 18:37:18.012785 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rmddm" podStartSLOduration=3.012760424 podStartE2EDuration="3.012760424s" podCreationTimestamp="2025-12-02 18:37:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:37:18.006084727 +0000 UTC m=+1347.695703618" watchObservedRunningTime="2025-12-02 18:37:18.012760424 +0000 UTC m=+1347.702379305" Dec 02 18:37:18 crc kubenswrapper[4878]: I1202 18:37:18.017067 4878 generic.go:334] "Generic (PLEG): container finished" podID="27c73cdb-bd06-4b13-afe5-c486618cb0ec" containerID="1dda6a125409f3a8618c20247841706343f166bcd371f40dee63cc6485be3b69" exitCode=0 Dec 02 18:37:18 crc kubenswrapper[4878]: I1202 18:37:18.017195 4878 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" event={"ID":"27c73cdb-bd06-4b13-afe5-c486618cb0ec","Type":"ContainerDied","Data":"1dda6a125409f3a8618c20247841706343f166bcd371f40dee63cc6485be3b69"} Dec 02 18:37:18 crc kubenswrapper[4878]: I1202 18:37:18.049612 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-xp7h4" event={"ID":"faf8fbcd-b97a-45f4-8b17-c92fdc87d75b","Type":"ContainerStarted","Data":"742842d3ca7ed959edc210840d5427b52003e9aa17aa11f4a88acebbef947a8e"} Dec 02 18:37:18 crc kubenswrapper[4878]: I1202 18:37:18.074845 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6t8ds" event={"ID":"c702f83c-0fa3-4ba7-b525-5eddf84355a8","Type":"ContainerStarted","Data":"6909d837f3c1286ad4dec02a22c83cfe379ca9fc7ef454fa8abc821168da2e20"} Dec 02 18:37:18 crc kubenswrapper[4878]: I1202 18:37:18.079832 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4czgq" event={"ID":"727a4f99-6a27-4d95-a73f-92e9fb4b0500","Type":"ContainerStarted","Data":"0d570307ab468142cf95b07cf60a044645f60c831a219c7023be724b3072c59c"} Dec 02 18:37:18 crc kubenswrapper[4878]: I1202 18:37:18.079879 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4czgq" event={"ID":"727a4f99-6a27-4d95-a73f-92e9fb4b0500","Type":"ContainerStarted","Data":"b7328a0b0d4ba09dbd3a77922b95229b498d3950d53e6f2620c044a5358fd08e"} Dec 02 18:37:18 crc kubenswrapper[4878]: I1202 18:37:18.112448 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-4czgq" podStartSLOduration=3.112257 podStartE2EDuration="3.112257s" podCreationTimestamp="2025-12-02 18:37:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:37:18.103757226 +0000 UTC m=+1347.793376117" watchObservedRunningTime="2025-12-02 18:37:18.112257 +0000 UTC 
m=+1347.801875871" Dec 02 18:37:18 crc kubenswrapper[4878]: I1202 18:37:18.541537 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:37:18 crc kubenswrapper[4878]: I1202 18:37:18.705296 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" Dec 02 18:37:18 crc kubenswrapper[4878]: I1202 18:37:18.865785 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szcbt\" (UniqueName: \"kubernetes.io/projected/27c73cdb-bd06-4b13-afe5-c486618cb0ec-kube-api-access-szcbt\") pod \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\" (UID: \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\") " Dec 02 18:37:18 crc kubenswrapper[4878]: I1202 18:37:18.866005 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-ovsdbserver-nb\") pod \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\" (UID: \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\") " Dec 02 18:37:18 crc kubenswrapper[4878]: I1202 18:37:18.866030 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-ovsdbserver-sb\") pod \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\" (UID: \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\") " Dec 02 18:37:18 crc kubenswrapper[4878]: I1202 18:37:18.866196 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-dns-svc\") pod \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\" (UID: \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\") " Dec 02 18:37:18 crc kubenswrapper[4878]: I1202 18:37:18.866329 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-config\") pod \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\" (UID: \"27c73cdb-bd06-4b13-afe5-c486618cb0ec\") " Dec 02 18:37:18 crc kubenswrapper[4878]: I1202 18:37:18.889653 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27c73cdb-bd06-4b13-afe5-c486618cb0ec-kube-api-access-szcbt" (OuterVolumeSpecName: "kube-api-access-szcbt") pod "27c73cdb-bd06-4b13-afe5-c486618cb0ec" (UID: "27c73cdb-bd06-4b13-afe5-c486618cb0ec"). InnerVolumeSpecName "kube-api-access-szcbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:37:19 crc kubenswrapper[4878]: I1202 18:37:19.016219 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "27c73cdb-bd06-4b13-afe5-c486618cb0ec" (UID: "27c73cdb-bd06-4b13-afe5-c486618cb0ec"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:19 crc kubenswrapper[4878]: I1202 18:37:19.171887 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:19 crc kubenswrapper[4878]: I1202 18:37:19.171926 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szcbt\" (UniqueName: \"kubernetes.io/projected/27c73cdb-bd06-4b13-afe5-c486618cb0ec-kube-api-access-szcbt\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:19 crc kubenswrapper[4878]: I1202 18:37:19.180035 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "27c73cdb-bd06-4b13-afe5-c486618cb0ec" (UID: "27c73cdb-bd06-4b13-afe5-c486618cb0ec"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:19 crc kubenswrapper[4878]: I1202 18:37:19.187896 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-config" (OuterVolumeSpecName: "config") pod "27c73cdb-bd06-4b13-afe5-c486618cb0ec" (UID: "27c73cdb-bd06-4b13-afe5-c486618cb0ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:19 crc kubenswrapper[4878]: I1202 18:37:19.195943 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" Dec 02 18:37:19 crc kubenswrapper[4878]: I1202 18:37:19.196493 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "27c73cdb-bd06-4b13-afe5-c486618cb0ec" (UID: "27c73cdb-bd06-4b13-afe5-c486618cb0ec"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:19 crc kubenswrapper[4878]: I1202 18:37:19.197084 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-6wfdt" event={"ID":"27c73cdb-bd06-4b13-afe5-c486618cb0ec","Type":"ContainerDied","Data":"85fc8c59e31382d9d10b07010246ccf36d2571b831a8a6dc58e6200047403187"} Dec 02 18:37:19 crc kubenswrapper[4878]: I1202 18:37:19.197143 4878 scope.go:117] "RemoveContainer" containerID="1dda6a125409f3a8618c20247841706343f166bcd371f40dee63cc6485be3b69" Dec 02 18:37:19 crc kubenswrapper[4878]: I1202 18:37:19.217568 4878 generic.go:334] "Generic (PLEG): container finished" podID="55303a34-d76e-40fa-ba51-676df9fb7104" containerID="17ae2f3b6e85b687c0825e4dc3eb6924cf6d2f104c8c522ecfd3364fde9c8516" exitCode=0 Dec 02 18:37:19 crc kubenswrapper[4878]: I1202 18:37:19.219069 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" event={"ID":"55303a34-d76e-40fa-ba51-676df9fb7104","Type":"ContainerDied","Data":"17ae2f3b6e85b687c0825e4dc3eb6924cf6d2f104c8c522ecfd3364fde9c8516"} Dec 02 18:37:19 crc kubenswrapper[4878]: I1202 18:37:19.219142 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" event={"ID":"55303a34-d76e-40fa-ba51-676df9fb7104","Type":"ContainerStarted","Data":"e91b26d9c5b7e8209fe04b079c9301eda88e7576fb15314d893a028b1849f859"} Dec 02 18:37:19 crc kubenswrapper[4878]: I1202 18:37:19.288435 4878 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:19 crc kubenswrapper[4878]: I1202 18:37:19.288877 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:19 crc kubenswrapper[4878]: I1202 18:37:19.289052 
4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27c73cdb-bd06-4b13-afe5-c486618cb0ec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:19 crc kubenswrapper[4878]: I1202 18:37:19.917349 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-6wfdt"] Dec 02 18:37:19 crc kubenswrapper[4878]: I1202 18:37:19.939721 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-6wfdt"] Dec 02 18:37:20 crc kubenswrapper[4878]: I1202 18:37:20.240442 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:37:21 crc kubenswrapper[4878]: I1202 18:37:20.995221 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27c73cdb-bd06-4b13-afe5-c486618cb0ec" path="/var/lib/kubelet/pods/27c73cdb-bd06-4b13-afe5-c486618cb0ec/volumes" Dec 02 18:37:21 crc kubenswrapper[4878]: I1202 18:37:20.997467 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" podStartSLOduration=5.997436027 podStartE2EDuration="5.997436027s" podCreationTimestamp="2025-12-02 18:37:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:37:20.263281955 +0000 UTC m=+1349.952900836" watchObservedRunningTime="2025-12-02 18:37:20.997436027 +0000 UTC m=+1350.687054908" Dec 02 18:37:21 crc kubenswrapper[4878]: I1202 18:37:21.258580 4878 generic.go:334] "Generic (PLEG): container finished" podID="9c0ade7f-399e-4b2f-a250-5f6a47e90baf" containerID="2c35cc5e2f37dc948eefb0fac2bc88ed1108661dde6f89516e1c5449e16ade97" exitCode=0 Dec 02 18:37:21 crc kubenswrapper[4878]: I1202 18:37:21.258680 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wkmpm" 
event={"ID":"9c0ade7f-399e-4b2f-a250-5f6a47e90baf","Type":"ContainerDied","Data":"2c35cc5e2f37dc948eefb0fac2bc88ed1108661dde6f89516e1c5449e16ade97"} Dec 02 18:37:21 crc kubenswrapper[4878]: I1202 18:37:21.264815 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" event={"ID":"55303a34-d76e-40fa-ba51-676df9fb7104","Type":"ContainerStarted","Data":"c33f5928aa488145698a379f9b796661efbdf9a4b18f323276eb42f4a8997ab3"} Dec 02 18:37:21 crc kubenswrapper[4878]: I1202 18:37:21.279489 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80be1dfe-9477-4d5c-9b11-2664d4300eca","Type":"ContainerStarted","Data":"9002791b435e9cc284381f7b5996b725eceb1552fb6829d976715cbd337b57ef"} Dec 02 18:37:22 crc kubenswrapper[4878]: I1202 18:37:22.310075 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80be1dfe-9477-4d5c-9b11-2664d4300eca","Type":"ContainerStarted","Data":"12499ebfd890e85a5b1b817f9a478b52d261d8b115a2b2dbbcd352ec5ec4eae9"} Dec 02 18:37:22 crc kubenswrapper[4878]: I1202 18:37:22.349674 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.349643532 podStartE2EDuration="18.349643532s" podCreationTimestamp="2025-12-02 18:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:37:22.341743477 +0000 UTC m=+1352.031362358" watchObservedRunningTime="2025-12-02 18:37:22.349643532 +0000 UTC m=+1352.039262413" Dec 02 18:37:23 crc kubenswrapper[4878]: I1202 18:37:23.327125 4878 generic.go:334] "Generic (PLEG): container finished" podID="6604dfbe-00f3-46f2-acc8-2edeb63f9a1b" containerID="2d8668b3bbd2fa38c99cd63d64d7d8f0f8e78babae8c5fa00092354dc02cb0ef" exitCode=0 Dec 02 18:37:23 crc kubenswrapper[4878]: I1202 18:37:23.328530 4878 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rmddm" event={"ID":"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b","Type":"ContainerDied","Data":"2d8668b3bbd2fa38c99cd63d64d7d8f0f8e78babae8c5fa00092354dc02cb0ef"} Dec 02 18:37:24 crc kubenswrapper[4878]: I1202 18:37:24.972562 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:25 crc kubenswrapper[4878]: I1202 18:37:25.599753 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wkmpm" Dec 02 18:37:25 crc kubenswrapper[4878]: I1202 18:37:25.614079 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8twdm\" (UniqueName: \"kubernetes.io/projected/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-kube-api-access-8twdm\") pod \"9c0ade7f-399e-4b2f-a250-5f6a47e90baf\" (UID: \"9c0ade7f-399e-4b2f-a250-5f6a47e90baf\") " Dec 02 18:37:25 crc kubenswrapper[4878]: I1202 18:37:25.614288 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-config-data\") pod \"9c0ade7f-399e-4b2f-a250-5f6a47e90baf\" (UID: \"9c0ade7f-399e-4b2f-a250-5f6a47e90baf\") " Dec 02 18:37:25 crc kubenswrapper[4878]: I1202 18:37:25.614396 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-combined-ca-bundle\") pod \"9c0ade7f-399e-4b2f-a250-5f6a47e90baf\" (UID: \"9c0ade7f-399e-4b2f-a250-5f6a47e90baf\") " Dec 02 18:37:25 crc kubenswrapper[4878]: I1202 18:37:25.618109 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-db-sync-config-data\") pod \"9c0ade7f-399e-4b2f-a250-5f6a47e90baf\" (UID: 
\"9c0ade7f-399e-4b2f-a250-5f6a47e90baf\") " Dec 02 18:37:25 crc kubenswrapper[4878]: I1202 18:37:25.624310 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-kube-api-access-8twdm" (OuterVolumeSpecName: "kube-api-access-8twdm") pod "9c0ade7f-399e-4b2f-a250-5f6a47e90baf" (UID: "9c0ade7f-399e-4b2f-a250-5f6a47e90baf"). InnerVolumeSpecName "kube-api-access-8twdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:37:25 crc kubenswrapper[4878]: I1202 18:37:25.627763 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9c0ade7f-399e-4b2f-a250-5f6a47e90baf" (UID: "9c0ade7f-399e-4b2f-a250-5f6a47e90baf"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:25 crc kubenswrapper[4878]: I1202 18:37:25.660669 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c0ade7f-399e-4b2f-a250-5f6a47e90baf" (UID: "9c0ade7f-399e-4b2f-a250-5f6a47e90baf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:25 crc kubenswrapper[4878]: I1202 18:37:25.696293 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-config-data" (OuterVolumeSpecName: "config-data") pod "9c0ade7f-399e-4b2f-a250-5f6a47e90baf" (UID: "9c0ade7f-399e-4b2f-a250-5f6a47e90baf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:25 crc kubenswrapper[4878]: I1202 18:37:25.723662 4878 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:25 crc kubenswrapper[4878]: I1202 18:37:25.723699 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8twdm\" (UniqueName: \"kubernetes.io/projected/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-kube-api-access-8twdm\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:25 crc kubenswrapper[4878]: I1202 18:37:25.723709 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:25 crc kubenswrapper[4878]: I1202 18:37:25.723719 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0ade7f-399e-4b2f-a250-5f6a47e90baf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.246451 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.334189 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jdgcc"] Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.334515 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-jdgcc" podUID="d37e6b48-32b3-4471-aa2e-4893d1a7c329" containerName="dnsmasq-dns" containerID="cri-o://2055b89dc9cbea43acc6ba991b937ee221b33c2ada58b5a3e0dc3d18f0d16827" gracePeriod=10 Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.402519 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-rmddm" event={"ID":"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b","Type":"ContainerDied","Data":"7408b6f703622412407db312a18bfcc555dd6c4e60a78f568cdabfa643f59842"} Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.402569 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7408b6f703622412407db312a18bfcc555dd6c4e60a78f568cdabfa643f59842" Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.403638 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wkmpm" event={"ID":"9c0ade7f-399e-4b2f-a250-5f6a47e90baf","Type":"ContainerDied","Data":"51d1f619ed5d20c72019b5bc1564fa59e29ca01859f42592248b0c02ac6ec962"} Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.403691 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51d1f619ed5d20c72019b5bc1564fa59e29ca01859f42592248b0c02ac6ec962" Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.403770 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wkmpm" Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.431437 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.441343 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtp94\" (UniqueName: \"kubernetes.io/projected/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-kube-api-access-vtp94\") pod \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.441428 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-combined-ca-bundle\") pod \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.441459 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-fernet-keys\") pod \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.441513 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-credential-keys\") pod \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.441547 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-scripts\") pod \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.441642 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-config-data\") pod \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\" (UID: \"6604dfbe-00f3-46f2-acc8-2edeb63f9a1b\") " Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.454353 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-scripts" (OuterVolumeSpecName: "scripts") pod "6604dfbe-00f3-46f2-acc8-2edeb63f9a1b" (UID: "6604dfbe-00f3-46f2-acc8-2edeb63f9a1b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.454437 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6604dfbe-00f3-46f2-acc8-2edeb63f9a1b" (UID: "6604dfbe-00f3-46f2-acc8-2edeb63f9a1b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.454507 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-kube-api-access-vtp94" (OuterVolumeSpecName: "kube-api-access-vtp94") pod "6604dfbe-00f3-46f2-acc8-2edeb63f9a1b" (UID: "6604dfbe-00f3-46f2-acc8-2edeb63f9a1b"). InnerVolumeSpecName "kube-api-access-vtp94". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.454587 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6604dfbe-00f3-46f2-acc8-2edeb63f9a1b" (UID: "6604dfbe-00f3-46f2-acc8-2edeb63f9a1b"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.527452 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6604dfbe-00f3-46f2-acc8-2edeb63f9a1b" (UID: "6604dfbe-00f3-46f2-acc8-2edeb63f9a1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.542619 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-config-data" (OuterVolumeSpecName: "config-data") pod "6604dfbe-00f3-46f2-acc8-2edeb63f9a1b" (UID: "6604dfbe-00f3-46f2-acc8-2edeb63f9a1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.545736 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtp94\" (UniqueName: \"kubernetes.io/projected/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-kube-api-access-vtp94\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.545772 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.545781 4878 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.545791 4878 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-credential-keys\") on node \"crc\" 
DevicePath \"\"" Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.545800 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:26 crc kubenswrapper[4878]: I1202 18:37:26.545808 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.055490 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-x76bt"] Dec 02 18:37:27 crc kubenswrapper[4878]: E1202 18:37:27.056713 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6604dfbe-00f3-46f2-acc8-2edeb63f9a1b" containerName="keystone-bootstrap" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.056740 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="6604dfbe-00f3-46f2-acc8-2edeb63f9a1b" containerName="keystone-bootstrap" Dec 02 18:37:27 crc kubenswrapper[4878]: E1202 18:37:27.056785 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c0ade7f-399e-4b2f-a250-5f6a47e90baf" containerName="glance-db-sync" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.056794 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c0ade7f-399e-4b2f-a250-5f6a47e90baf" containerName="glance-db-sync" Dec 02 18:37:27 crc kubenswrapper[4878]: E1202 18:37:27.056816 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27c73cdb-bd06-4b13-afe5-c486618cb0ec" containerName="init" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.056825 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="27c73cdb-bd06-4b13-afe5-c486618cb0ec" containerName="init" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.057085 4878 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9c0ade7f-399e-4b2f-a250-5f6a47e90baf" containerName="glance-db-sync" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.057101 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="6604dfbe-00f3-46f2-acc8-2edeb63f9a1b" containerName="keystone-bootstrap" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.057113 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="27c73cdb-bd06-4b13-afe5-c486618cb0ec" containerName="init" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.058628 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-x76bt" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.066814 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-x76bt"] Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.161145 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx699\" (UniqueName: \"kubernetes.io/projected/df24c2f4-f414-4ece-8526-5335ffea6c9a-kube-api-access-nx699\") pod \"dnsmasq-dns-f84976bdf-x76bt\" (UID: \"df24c2f4-f414-4ece-8526-5335ffea6c9a\") " pod="openstack/dnsmasq-dns-f84976bdf-x76bt" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.161279 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-x76bt\" (UID: \"df24c2f4-f414-4ece-8526-5335ffea6c9a\") " pod="openstack/dnsmasq-dns-f84976bdf-x76bt" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.161334 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-config\") pod \"dnsmasq-dns-f84976bdf-x76bt\" (UID: \"df24c2f4-f414-4ece-8526-5335ffea6c9a\") " 
pod="openstack/dnsmasq-dns-f84976bdf-x76bt" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.161378 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-x76bt\" (UID: \"df24c2f4-f414-4ece-8526-5335ffea6c9a\") " pod="openstack/dnsmasq-dns-f84976bdf-x76bt" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.161651 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-dns-svc\") pod \"dnsmasq-dns-f84976bdf-x76bt\" (UID: \"df24c2f4-f414-4ece-8526-5335ffea6c9a\") " pod="openstack/dnsmasq-dns-f84976bdf-x76bt" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.264474 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-dns-svc\") pod \"dnsmasq-dns-f84976bdf-x76bt\" (UID: \"df24c2f4-f414-4ece-8526-5335ffea6c9a\") " pod="openstack/dnsmasq-dns-f84976bdf-x76bt" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.264570 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx699\" (UniqueName: \"kubernetes.io/projected/df24c2f4-f414-4ece-8526-5335ffea6c9a-kube-api-access-nx699\") pod \"dnsmasq-dns-f84976bdf-x76bt\" (UID: \"df24c2f4-f414-4ece-8526-5335ffea6c9a\") " pod="openstack/dnsmasq-dns-f84976bdf-x76bt" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.264644 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-x76bt\" (UID: \"df24c2f4-f414-4ece-8526-5335ffea6c9a\") " 
pod="openstack/dnsmasq-dns-f84976bdf-x76bt" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.264669 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-config\") pod \"dnsmasq-dns-f84976bdf-x76bt\" (UID: \"df24c2f4-f414-4ece-8526-5335ffea6c9a\") " pod="openstack/dnsmasq-dns-f84976bdf-x76bt" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.264707 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-x76bt\" (UID: \"df24c2f4-f414-4ece-8526-5335ffea6c9a\") " pod="openstack/dnsmasq-dns-f84976bdf-x76bt" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.265626 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-x76bt\" (UID: \"df24c2f4-f414-4ece-8526-5335ffea6c9a\") " pod="openstack/dnsmasq-dns-f84976bdf-x76bt" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.266085 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-dns-svc\") pod \"dnsmasq-dns-f84976bdf-x76bt\" (UID: \"df24c2f4-f414-4ece-8526-5335ffea6c9a\") " pod="openstack/dnsmasq-dns-f84976bdf-x76bt" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.266180 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-x76bt\" (UID: \"df24c2f4-f414-4ece-8526-5335ffea6c9a\") " pod="openstack/dnsmasq-dns-f84976bdf-x76bt" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.267008 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-config\") pod \"dnsmasq-dns-f84976bdf-x76bt\" (UID: \"df24c2f4-f414-4ece-8526-5335ffea6c9a\") " pod="openstack/dnsmasq-dns-f84976bdf-x76bt" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.308272 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx699\" (UniqueName: \"kubernetes.io/projected/df24c2f4-f414-4ece-8526-5335ffea6c9a-kube-api-access-nx699\") pod \"dnsmasq-dns-f84976bdf-x76bt\" (UID: \"df24c2f4-f414-4ece-8526-5335ffea6c9a\") " pod="openstack/dnsmasq-dns-f84976bdf-x76bt" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.409582 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-x76bt" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.436316 4878 generic.go:334] "Generic (PLEG): container finished" podID="d37e6b48-32b3-4471-aa2e-4893d1a7c329" containerID="2055b89dc9cbea43acc6ba991b937ee221b33c2ada58b5a3e0dc3d18f0d16827" exitCode=0 Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.436409 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rmddm" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.437462 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jdgcc" event={"ID":"d37e6b48-32b3-4471-aa2e-4893d1a7c329","Type":"ContainerDied","Data":"2055b89dc9cbea43acc6ba991b937ee221b33c2ada58b5a3e0dc3d18f0d16827"} Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.645337 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rmddm"] Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.657113 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rmddm"] Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.734880 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-x42mc"] Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.736862 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.740099 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.740746 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.741036 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.741473 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.746259 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jjqdr" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.780849 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-bootstrap-x42mc"] Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.885113 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-credential-keys\") pod \"keystone-bootstrap-x42mc\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.885394 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-scripts\") pod \"keystone-bootstrap-x42mc\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.885629 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-config-data\") pod \"keystone-bootstrap-x42mc\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.885753 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-combined-ca-bundle\") pod \"keystone-bootstrap-x42mc\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.885819 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-fernet-keys\") pod \"keystone-bootstrap-x42mc\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " 
pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.885996 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sctn2\" (UniqueName: \"kubernetes.io/projected/670fbeeb-ca87-4024-b8d5-ddd470241386-kube-api-access-sctn2\") pod \"keystone-bootstrap-x42mc\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.990483 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-credential-keys\") pod \"keystone-bootstrap-x42mc\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.990564 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-scripts\") pod \"keystone-bootstrap-x42mc\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.990635 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-config-data\") pod \"keystone-bootstrap-x42mc\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.990678 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-combined-ca-bundle\") pod \"keystone-bootstrap-x42mc\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:37:27 crc kubenswrapper[4878]: 
I1202 18:37:27.990701 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-fernet-keys\") pod \"keystone-bootstrap-x42mc\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.990733 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sctn2\" (UniqueName: \"kubernetes.io/projected/670fbeeb-ca87-4024-b8d5-ddd470241386-kube-api-access-sctn2\") pod \"keystone-bootstrap-x42mc\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:37:27 crc kubenswrapper[4878]: I1202 18:37:27.997180 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-scripts\") pod \"keystone-bootstrap-x42mc\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.004889 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-fernet-keys\") pod \"keystone-bootstrap-x42mc\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.011463 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-credential-keys\") pod \"keystone-bootstrap-x42mc\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.011554 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-config-data\") pod \"keystone-bootstrap-x42mc\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.014783 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sctn2\" (UniqueName: \"kubernetes.io/projected/670fbeeb-ca87-4024-b8d5-ddd470241386-kube-api-access-sctn2\") pod \"keystone-bootstrap-x42mc\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.014839 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-combined-ca-bundle\") pod \"keystone-bootstrap-x42mc\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.068581 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.095455 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.097718 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.102210 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.103139 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.103336 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-h4ktf" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.112856 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.199635 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c847569d-3cae-43c6-94df-5032c1b52ed1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.199904 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c847569d-3cae-43c6-94df-5032c1b52ed1-logs\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.199938 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c847569d-3cae-43c6-94df-5032c1b52ed1-scripts\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 
18:37:28.200030 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8dkv\" (UniqueName: \"kubernetes.io/projected/c847569d-3cae-43c6-94df-5032c1b52ed1-kube-api-access-w8dkv\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.200063 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.200105 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c847569d-3cae-43c6-94df-5032c1b52ed1-config-data\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.200145 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c847569d-3cae-43c6-94df-5032c1b52ed1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.276594 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.278736 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.281634 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.312005 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.323136 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c847569d-3cae-43c6-94df-5032c1b52ed1-config-data\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.326932 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c847569d-3cae-43c6-94df-5032c1b52ed1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.327218 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c847569d-3cae-43c6-94df-5032c1b52ed1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.327346 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c847569d-3cae-43c6-94df-5032c1b52ed1-logs\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 
18:37:28.327449 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c847569d-3cae-43c6-94df-5032c1b52ed1-scripts\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.327535 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c847569d-3cae-43c6-94df-5032c1b52ed1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.327783 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8dkv\" (UniqueName: \"kubernetes.io/projected/c847569d-3cae-43c6-94df-5032c1b52ed1-kube-api-access-w8dkv\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.327916 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.328479 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.328618 4878 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c847569d-3cae-43c6-94df-5032c1b52ed1-logs\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.332345 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c847569d-3cae-43c6-94df-5032c1b52ed1-config-data\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.335568 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c847569d-3cae-43c6-94df-5032c1b52ed1-scripts\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.347407 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c847569d-3cae-43c6-94df-5032c1b52ed1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.355887 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8dkv\" (UniqueName: \"kubernetes.io/projected/c847569d-3cae-43c6-94df-5032c1b52ed1-kube-api-access-w8dkv\") pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.367836 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") 
pod \"glance-default-external-api-0\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.430064 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4990eb3f-e7c0-459e-92d0-4774326e8f51-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.430111 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4990eb3f-e7c0-459e-92d0-4774326e8f51-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.430169 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4990eb3f-e7c0-459e-92d0-4774326e8f51-logs\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.430190 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnw4v\" (UniqueName: \"kubernetes.io/projected/4990eb3f-e7c0-459e-92d0-4774326e8f51-kube-api-access-jnw4v\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.430469 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.430705 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4990eb3f-e7c0-459e-92d0-4774326e8f51-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.430870 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4990eb3f-e7c0-459e-92d0-4774326e8f51-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.482546 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.534676 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4990eb3f-e7c0-459e-92d0-4774326e8f51-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.534767 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4990eb3f-e7c0-459e-92d0-4774326e8f51-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.534854 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4990eb3f-e7c0-459e-92d0-4774326e8f51-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.534880 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4990eb3f-e7c0-459e-92d0-4774326e8f51-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.534924 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4990eb3f-e7c0-459e-92d0-4774326e8f51-logs\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc 
kubenswrapper[4878]: I1202 18:37:28.534942 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnw4v\" (UniqueName: \"kubernetes.io/projected/4990eb3f-e7c0-459e-92d0-4774326e8f51-kube-api-access-jnw4v\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.534993 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.535155 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.535291 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4990eb3f-e7c0-459e-92d0-4774326e8f51-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.535683 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4990eb3f-e7c0-459e-92d0-4774326e8f51-logs\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.539906 4878 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4990eb3f-e7c0-459e-92d0-4774326e8f51-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.540086 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4990eb3f-e7c0-459e-92d0-4774326e8f51-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.551694 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4990eb3f-e7c0-459e-92d0-4774326e8f51-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.555703 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnw4v\" (UniqueName: \"kubernetes.io/projected/4990eb3f-e7c0-459e-92d0-4774326e8f51-kube-api-access-jnw4v\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.568098 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.605222 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 18:37:28 crc kubenswrapper[4878]: I1202 18:37:28.953735 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6604dfbe-00f3-46f2-acc8-2edeb63f9a1b" path="/var/lib/kubelet/pods/6604dfbe-00f3-46f2-acc8-2edeb63f9a1b/volumes" Dec 02 18:37:30 crc kubenswrapper[4878]: I1202 18:37:30.714269 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 18:37:30 crc kubenswrapper[4878]: I1202 18:37:30.831284 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 18:37:30 crc kubenswrapper[4878]: I1202 18:37:30.887174 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jdgcc" podUID="d37e6b48-32b3-4471-aa2e-4893d1a7c329" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.153:5353: connect: connection refused" Dec 02 18:37:31 crc kubenswrapper[4878]: I1202 18:37:31.130758 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 18:37:31 crc kubenswrapper[4878]: I1202 18:37:31.140819 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f-etc-swift\") pod \"swift-storage-0\" (UID: \"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f\") " pod="openstack/swift-storage-0" Dec 02 18:37:31 crc kubenswrapper[4878]: I1202 18:37:31.292535 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 02 18:37:34 crc kubenswrapper[4878]: I1202 18:37:34.972942 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:34 crc kubenswrapper[4878]: I1202 18:37:34.982487 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:35 crc kubenswrapper[4878]: I1202 18:37:35.539145 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 02 18:37:35 crc kubenswrapper[4878]: I1202 18:37:35.885300 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jdgcc" podUID="d37e6b48-32b3-4471-aa2e-4893d1a7c329" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.153:5353: connect: connection refused" Dec 02 18:37:39 crc kubenswrapper[4878]: I1202 18:37:39.585878 4878 generic.go:334] "Generic (PLEG): container finished" podID="727a4f99-6a27-4d95-a73f-92e9fb4b0500" containerID="0d570307ab468142cf95b07cf60a044645f60c831a219c7023be724b3072c59c" exitCode=0 Dec 02 18:37:39 crc kubenswrapper[4878]: I1202 18:37:39.585988 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4czgq" event={"ID":"727a4f99-6a27-4d95-a73f-92e9fb4b0500","Type":"ContainerDied","Data":"0d570307ab468142cf95b07cf60a044645f60c831a219c7023be724b3072c59c"} Dec 02 18:37:45 crc kubenswrapper[4878]: I1202 18:37:45.886310 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jdgcc" podUID="d37e6b48-32b3-4471-aa2e-4893d1a7c329" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.153:5353: i/o timeout" Dec 02 18:37:45 crc kubenswrapper[4878]: I1202 18:37:45.887171 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:37:47 crc 
kubenswrapper[4878]: E1202 18:37:47.238583 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 02 18:37:47 crc kubenswrapper[4878]: E1202 18:37:47.239332 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n565h594hb5h644h58fhf9h57fhd6h59dh5dfh55dh5b9h56bhb4h94h656h5bdh697h5dfh59ch557h6fh566h684h77h65hb5hbh546h5c6h95h8bq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gklnd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(7df302e4-7d89-4c00-b517-d4dce032ad3d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 18:37:47 crc kubenswrapper[4878]: E1202 18:37:47.567781 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Dec 02 18:37:47 crc kubenswrapper[4878]: E1202 18:37:47.568065 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2d68c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-xp7h4_openstack(faf8fbcd-b97a-45f4-8b17-c92fdc87d75b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 
02 18:37:47 crc kubenswrapper[4878]: E1202 18:37:47.569331 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-xp7h4" podUID="faf8fbcd-b97a-45f4-8b17-c92fdc87d75b" Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.691611 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4czgq" event={"ID":"727a4f99-6a27-4d95-a73f-92e9fb4b0500","Type":"ContainerDied","Data":"b7328a0b0d4ba09dbd3a77922b95229b498d3950d53e6f2620c044a5358fd08e"} Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.691677 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7328a0b0d4ba09dbd3a77922b95229b498d3950d53e6f2620c044a5358fd08e" Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.695703 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jdgcc" event={"ID":"d37e6b48-32b3-4471-aa2e-4893d1a7c329","Type":"ContainerDied","Data":"6336d50995211c2bf83cd1a43232bc8d16b7bd8dd32db6a31a318edd60d42961"} Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.695808 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6336d50995211c2bf83cd1a43232bc8d16b7bd8dd32db6a31a318edd60d42961" Dec 02 18:37:47 crc kubenswrapper[4878]: E1202 18:37:47.698561 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-xp7h4" podUID="faf8fbcd-b97a-45f4-8b17-c92fdc87d75b" Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.758695 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.768475 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4czgq" Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.785157 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp82h\" (UniqueName: \"kubernetes.io/projected/d37e6b48-32b3-4471-aa2e-4893d1a7c329-kube-api-access-sp82h\") pod \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\" (UID: \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\") " Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.785264 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-ovsdbserver-nb\") pod \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\" (UID: \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\") " Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.785318 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-dns-svc\") pod \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\" (UID: \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\") " Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.785384 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4d8g\" (UniqueName: \"kubernetes.io/projected/727a4f99-6a27-4d95-a73f-92e9fb4b0500-kube-api-access-n4d8g\") pod \"727a4f99-6a27-4d95-a73f-92e9fb4b0500\" (UID: \"727a4f99-6a27-4d95-a73f-92e9fb4b0500\") " Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.785422 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-config\") pod \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\" (UID: 
\"d37e6b48-32b3-4471-aa2e-4893d1a7c329\") " Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.785465 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/727a4f99-6a27-4d95-a73f-92e9fb4b0500-combined-ca-bundle\") pod \"727a4f99-6a27-4d95-a73f-92e9fb4b0500\" (UID: \"727a4f99-6a27-4d95-a73f-92e9fb4b0500\") " Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.785505 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-ovsdbserver-sb\") pod \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\" (UID: \"d37e6b48-32b3-4471-aa2e-4893d1a7c329\") " Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.785529 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/727a4f99-6a27-4d95-a73f-92e9fb4b0500-config\") pod \"727a4f99-6a27-4d95-a73f-92e9fb4b0500\" (UID: \"727a4f99-6a27-4d95-a73f-92e9fb4b0500\") " Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.803315 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d37e6b48-32b3-4471-aa2e-4893d1a7c329-kube-api-access-sp82h" (OuterVolumeSpecName: "kube-api-access-sp82h") pod "d37e6b48-32b3-4471-aa2e-4893d1a7c329" (UID: "d37e6b48-32b3-4471-aa2e-4893d1a7c329"). InnerVolumeSpecName "kube-api-access-sp82h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.816406 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/727a4f99-6a27-4d95-a73f-92e9fb4b0500-kube-api-access-n4d8g" (OuterVolumeSpecName: "kube-api-access-n4d8g") pod "727a4f99-6a27-4d95-a73f-92e9fb4b0500" (UID: "727a4f99-6a27-4d95-a73f-92e9fb4b0500"). InnerVolumeSpecName "kube-api-access-n4d8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.826352 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/727a4f99-6a27-4d95-a73f-92e9fb4b0500-config" (OuterVolumeSpecName: "config") pod "727a4f99-6a27-4d95-a73f-92e9fb4b0500" (UID: "727a4f99-6a27-4d95-a73f-92e9fb4b0500"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.857025 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/727a4f99-6a27-4d95-a73f-92e9fb4b0500-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "727a4f99-6a27-4d95-a73f-92e9fb4b0500" (UID: "727a4f99-6a27-4d95-a73f-92e9fb4b0500"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.908331 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-config" (OuterVolumeSpecName: "config") pod "d37e6b48-32b3-4471-aa2e-4893d1a7c329" (UID: "d37e6b48-32b3-4471-aa2e-4893d1a7c329"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.908655 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d37e6b48-32b3-4471-aa2e-4893d1a7c329" (UID: "d37e6b48-32b3-4471-aa2e-4893d1a7c329"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.913770 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d37e6b48-32b3-4471-aa2e-4893d1a7c329" (UID: "d37e6b48-32b3-4471-aa2e-4893d1a7c329"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.916161 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp82h\" (UniqueName: \"kubernetes.io/projected/d37e6b48-32b3-4471-aa2e-4893d1a7c329-kube-api-access-sp82h\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.916189 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.916200 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4d8g\" (UniqueName: \"kubernetes.io/projected/727a4f99-6a27-4d95-a73f-92e9fb4b0500-kube-api-access-n4d8g\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.916213 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.916223 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/727a4f99-6a27-4d95-a73f-92e9fb4b0500-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.916245 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.916255 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/727a4f99-6a27-4d95-a73f-92e9fb4b0500-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:47 crc kubenswrapper[4878]: I1202 18:37:47.918329 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d37e6b48-32b3-4471-aa2e-4893d1a7c329" (UID: "d37e6b48-32b3-4471-aa2e-4893d1a7c329"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:48 crc kubenswrapper[4878]: I1202 18:37:48.019672 4878 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d37e6b48-32b3-4471-aa2e-4893d1a7c329-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:48 crc kubenswrapper[4878]: I1202 18:37:48.181906 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-x76bt"] Dec 02 18:37:48 crc kubenswrapper[4878]: I1202 18:37:48.706105 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jdgcc" Dec 02 18:37:48 crc kubenswrapper[4878]: I1202 18:37:48.706117 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-4czgq" Dec 02 18:37:48 crc kubenswrapper[4878]: I1202 18:37:48.756981 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jdgcc"] Dec 02 18:37:48 crc kubenswrapper[4878]: I1202 18:37:48.767056 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jdgcc"] Dec 02 18:37:48 crc kubenswrapper[4878]: I1202 18:37:48.959103 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d37e6b48-32b3-4471-aa2e-4893d1a7c329" path="/var/lib/kubelet/pods/d37e6b48-32b3-4471-aa2e-4893d1a7c329/volumes" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.056897 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-x76bt"] Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.109290 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fb745b69-wpj5n"] Dec 02 18:37:49 crc kubenswrapper[4878]: E1202 18:37:49.109837 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d37e6b48-32b3-4471-aa2e-4893d1a7c329" containerName="init" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.109857 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d37e6b48-32b3-4471-aa2e-4893d1a7c329" containerName="init" Dec 02 18:37:49 crc kubenswrapper[4878]: E1202 18:37:49.109882 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d37e6b48-32b3-4471-aa2e-4893d1a7c329" containerName="dnsmasq-dns" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.109889 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d37e6b48-32b3-4471-aa2e-4893d1a7c329" containerName="dnsmasq-dns" Dec 02 18:37:49 crc kubenswrapper[4878]: E1202 18:37:49.109904 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="727a4f99-6a27-4d95-a73f-92e9fb4b0500" containerName="neutron-db-sync" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.109911 4878 
state_mem.go:107] "Deleted CPUSet assignment" podUID="727a4f99-6a27-4d95-a73f-92e9fb4b0500" containerName="neutron-db-sync" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.110156 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="727a4f99-6a27-4d95-a73f-92e9fb4b0500" containerName="neutron-db-sync" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.110191 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="d37e6b48-32b3-4471-aa2e-4893d1a7c329" containerName="dnsmasq-dns" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.111770 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.133131 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-wpj5n"] Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.159487 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-ovsdbserver-sb\") pod \"dnsmasq-dns-fb745b69-wpj5n\" (UID: \"1ac2d806-c2e2-4d10-9975-e4ae079add40\") " pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.159610 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-config\") pod \"dnsmasq-dns-fb745b69-wpj5n\" (UID: \"1ac2d806-c2e2-4d10-9975-e4ae079add40\") " pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.159681 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vktx\" (UniqueName: \"kubernetes.io/projected/1ac2d806-c2e2-4d10-9975-e4ae079add40-kube-api-access-8vktx\") pod \"dnsmasq-dns-fb745b69-wpj5n\" (UID: 
\"1ac2d806-c2e2-4d10-9975-e4ae079add40\") " pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.159731 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-ovsdbserver-nb\") pod \"dnsmasq-dns-fb745b69-wpj5n\" (UID: \"1ac2d806-c2e2-4d10-9975-e4ae079add40\") " pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.159864 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-dns-svc\") pod \"dnsmasq-dns-fb745b69-wpj5n\" (UID: \"1ac2d806-c2e2-4d10-9975-e4ae079add40\") " pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.220503 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-749d96bfb4-7zqk4"] Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.227365 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.230248 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.230995 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.232786 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.233029 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-dx549" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.239735 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-749d96bfb4-7zqk4"] Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.266334 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-ovsdbserver-sb\") pod \"dnsmasq-dns-fb745b69-wpj5n\" (UID: \"1ac2d806-c2e2-4d10-9975-e4ae079add40\") " pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.266423 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-config\") pod \"dnsmasq-dns-fb745b69-wpj5n\" (UID: \"1ac2d806-c2e2-4d10-9975-e4ae079add40\") " pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.266477 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vktx\" (UniqueName: \"kubernetes.io/projected/1ac2d806-c2e2-4d10-9975-e4ae079add40-kube-api-access-8vktx\") pod \"dnsmasq-dns-fb745b69-wpj5n\" (UID: \"1ac2d806-c2e2-4d10-9975-e4ae079add40\") " 
pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.266515 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-ovsdbserver-nb\") pod \"dnsmasq-dns-fb745b69-wpj5n\" (UID: \"1ac2d806-c2e2-4d10-9975-e4ae079add40\") " pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.266576 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-dns-svc\") pod \"dnsmasq-dns-fb745b69-wpj5n\" (UID: \"1ac2d806-c2e2-4d10-9975-e4ae079add40\") " pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.267674 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-dns-svc\") pod \"dnsmasq-dns-fb745b69-wpj5n\" (UID: \"1ac2d806-c2e2-4d10-9975-e4ae079add40\") " pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.268418 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-ovsdbserver-sb\") pod \"dnsmasq-dns-fb745b69-wpj5n\" (UID: \"1ac2d806-c2e2-4d10-9975-e4ae079add40\") " pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.269033 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-config\") pod \"dnsmasq-dns-fb745b69-wpj5n\" (UID: \"1ac2d806-c2e2-4d10-9975-e4ae079add40\") " pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.269915 4878 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-ovsdbserver-nb\") pod \"dnsmasq-dns-fb745b69-wpj5n\" (UID: \"1ac2d806-c2e2-4d10-9975-e4ae079add40\") " pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.357439 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vktx\" (UniqueName: \"kubernetes.io/projected/1ac2d806-c2e2-4d10-9975-e4ae079add40-kube-api-access-8vktx\") pod \"dnsmasq-dns-fb745b69-wpj5n\" (UID: \"1ac2d806-c2e2-4d10-9975-e4ae079add40\") " pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.369888 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-config\") pod \"neutron-749d96bfb4-7zqk4\" (UID: \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\") " pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.370005 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmkmk\" (UniqueName: \"kubernetes.io/projected/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-kube-api-access-dmkmk\") pod \"neutron-749d96bfb4-7zqk4\" (UID: \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\") " pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.370118 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-httpd-config\") pod \"neutron-749d96bfb4-7zqk4\" (UID: \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\") " pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.370140 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-ovndb-tls-certs\") pod \"neutron-749d96bfb4-7zqk4\" (UID: \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\") " pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.370187 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-combined-ca-bundle\") pod \"neutron-749d96bfb4-7zqk4\" (UID: \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\") " pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.456784 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.475845 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-httpd-config\") pod \"neutron-749d96bfb4-7zqk4\" (UID: \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\") " pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.475898 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-ovndb-tls-certs\") pod \"neutron-749d96bfb4-7zqk4\" (UID: \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\") " pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.475935 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-combined-ca-bundle\") pod \"neutron-749d96bfb4-7zqk4\" (UID: \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\") " 
pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.476019 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-config\") pod \"neutron-749d96bfb4-7zqk4\" (UID: \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\") " pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.476073 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmkmk\" (UniqueName: \"kubernetes.io/projected/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-kube-api-access-dmkmk\") pod \"neutron-749d96bfb4-7zqk4\" (UID: \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\") " pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.480710 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-ovndb-tls-certs\") pod \"neutron-749d96bfb4-7zqk4\" (UID: \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\") " pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.486469 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-combined-ca-bundle\") pod \"neutron-749d96bfb4-7zqk4\" (UID: \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\") " pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.488606 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-httpd-config\") pod \"neutron-749d96bfb4-7zqk4\" (UID: \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\") " pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.504189 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmkmk\" (UniqueName: \"kubernetes.io/projected/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-kube-api-access-dmkmk\") pod \"neutron-749d96bfb4-7zqk4\" (UID: \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\") " pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.505313 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-config\") pod \"neutron-749d96bfb4-7zqk4\" (UID: \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\") " pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:37:49 crc kubenswrapper[4878]: I1202 18:37:49.569852 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:37:50 crc kubenswrapper[4878]: I1202 18:37:50.751524 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-x76bt" event={"ID":"df24c2f4-f414-4ece-8526-5335ffea6c9a","Type":"ContainerStarted","Data":"3adc88ee1b7dfb96078a8067283fa4f12f55c083f5e8c7377d3704c02a572ba8"} Dec 02 18:37:50 crc kubenswrapper[4878]: E1202 18:37:50.759503 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 02 18:37:50 crc kubenswrapper[4878]: E1202 18:37:50.759708 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qj8s8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-6qqjx_openstack(d7252936-ed87-47f7-b392-7d8fe8388279): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 18:37:50 crc kubenswrapper[4878]: E1202 18:37:50.760984 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-6qqjx" podUID="d7252936-ed87-47f7-b392-7d8fe8388279" Dec 02 18:37:50 crc kubenswrapper[4878]: I1202 18:37:50.890949 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jdgcc" podUID="d37e6b48-32b3-4471-aa2e-4893d1a7c329" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.153:5353: i/o timeout" Dec 02 18:37:51 crc kubenswrapper[4878]: I1202 18:37:51.753362 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 18:37:51 crc kubenswrapper[4878]: I1202 18:37:51.827477 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-x76bt" event={"ID":"df24c2f4-f414-4ece-8526-5335ffea6c9a","Type":"ContainerDied","Data":"78699985b60c2858784f09615bdbcba00e598028a4e2933a16bd6f855ded3f43"} Dec 02 18:37:51 crc kubenswrapper[4878]: I1202 18:37:51.826742 4878 generic.go:334] "Generic (PLEG): container finished" podID="df24c2f4-f414-4ece-8526-5335ffea6c9a" containerID="78699985b60c2858784f09615bdbcba00e598028a4e2933a16bd6f855ded3f43" exitCode=0 Dec 02 18:37:51 crc kubenswrapper[4878]: I1202 18:37:51.837627 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6t8ds" 
event={"ID":"c702f83c-0fa3-4ba7-b525-5eddf84355a8","Type":"ContainerStarted","Data":"d13e1834896505007437ee3e8dcb1e0c0c9663ed054383ffe0ba2c61aa44ae40"} Dec 02 18:37:51 crc kubenswrapper[4878]: I1202 18:37:51.845863 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-h6lds" event={"ID":"d0bdf1c8-9578-4295-a835-432049516d07","Type":"ContainerStarted","Data":"0b5dc9ab2161e6b65edb6dc0055e1a8cb2f405a4236c70ece22a0f7aa3ac35c1"} Dec 02 18:37:51 crc kubenswrapper[4878]: I1202 18:37:51.861027 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x42mc"] Dec 02 18:37:51 crc kubenswrapper[4878]: E1202 18:37:51.882281 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-6qqjx" podUID="d7252936-ed87-47f7-b392-7d8fe8388279" Dec 02 18:37:51 crc kubenswrapper[4878]: I1202 18:37:51.891923 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-6t8ds" podStartSLOduration=6.712823658 podStartE2EDuration="36.891902777s" podCreationTimestamp="2025-12-02 18:37:15 +0000 UTC" firstStartedPulling="2025-12-02 18:37:17.429025716 +0000 UTC m=+1347.118644587" lastFinishedPulling="2025-12-02 18:37:47.608104825 +0000 UTC m=+1377.297723706" observedRunningTime="2025-12-02 18:37:51.880300697 +0000 UTC m=+1381.569919578" watchObservedRunningTime="2025-12-02 18:37:51.891902777 +0000 UTC m=+1381.581521658" Dec 02 18:37:51 crc kubenswrapper[4878]: I1202 18:37:51.899892 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 18:37:51 crc kubenswrapper[4878]: I1202 18:37:51.923358 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-h6lds" podStartSLOduration=7.18580551 
podStartE2EDuration="36.923340053s" podCreationTimestamp="2025-12-02 18:37:15 +0000 UTC" firstStartedPulling="2025-12-02 18:37:17.877637562 +0000 UTC m=+1347.567256443" lastFinishedPulling="2025-12-02 18:37:47.615172105 +0000 UTC m=+1377.304790986" observedRunningTime="2025-12-02 18:37:51.916421728 +0000 UTC m=+1381.606040609" watchObservedRunningTime="2025-12-02 18:37:51.923340053 +0000 UTC m=+1381.612958934" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.056355 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-749d96bfb4-7zqk4"] Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.110871 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-wpj5n"] Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.198781 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.222275 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9d785686f-gqxnz"] Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.227451 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.230660 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.233303 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.252350 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9d785686f-gqxnz"] Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.261907 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-combined-ca-bundle\") pod \"neutron-9d785686f-gqxnz\" (UID: \"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.262277 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-ovndb-tls-certs\") pod \"neutron-9d785686f-gqxnz\" (UID: \"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.262374 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-internal-tls-certs\") pod \"neutron-9d785686f-gqxnz\" (UID: \"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.262451 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-httpd-config\") pod \"neutron-9d785686f-gqxnz\" (UID: \"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.262530 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-config\") pod \"neutron-9d785686f-gqxnz\" (UID: \"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.262699 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmx6g\" (UniqueName: \"kubernetes.io/projected/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-kube-api-access-gmx6g\") pod \"neutron-9d785686f-gqxnz\" (UID: \"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.262865 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-public-tls-certs\") pod \"neutron-9d785686f-gqxnz\" (UID: \"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.365693 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmx6g\" (UniqueName: \"kubernetes.io/projected/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-kube-api-access-gmx6g\") pod \"neutron-9d785686f-gqxnz\" (UID: \"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.365805 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-public-tls-certs\") pod \"neutron-9d785686f-gqxnz\" (UID: \"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.365843 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-combined-ca-bundle\") pod \"neutron-9d785686f-gqxnz\" (UID: \"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.365912 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-ovndb-tls-certs\") pod \"neutron-9d785686f-gqxnz\" (UID: \"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.365940 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-internal-tls-certs\") pod \"neutron-9d785686f-gqxnz\" (UID: \"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.365963 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-httpd-config\") pod \"neutron-9d785686f-gqxnz\" (UID: \"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.365989 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-config\") pod \"neutron-9d785686f-gqxnz\" (UID: 
\"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.373907 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-internal-tls-certs\") pod \"neutron-9d785686f-gqxnz\" (UID: \"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.375736 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-ovndb-tls-certs\") pod \"neutron-9d785686f-gqxnz\" (UID: \"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.378254 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-httpd-config\") pod \"neutron-9d785686f-gqxnz\" (UID: \"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.383150 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-public-tls-certs\") pod \"neutron-9d785686f-gqxnz\" (UID: \"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.384977 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-config\") pod \"neutron-9d785686f-gqxnz\" (UID: \"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.388009 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-combined-ca-bundle\") pod \"neutron-9d785686f-gqxnz\" (UID: \"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.395029 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmx6g\" (UniqueName: \"kubernetes.io/projected/70e70d85-cdcd-43e7-b2c2-dbc6386665e3-kube-api-access-gmx6g\") pod \"neutron-9d785686f-gqxnz\" (UID: \"70e70d85-cdcd-43e7-b2c2-dbc6386665e3\") " pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.574294 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.840301 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 02 18:37:52 crc kubenswrapper[4878]: W1202 18:37:52.904462 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c2a6cb2_97ad_418b_9132_1d17e3cd9c7f.slice/crio-84933f17e620bf4419c01c81cc4806d60bb1639629b860494e548bf77083723d WatchSource:0}: Error finding container 84933f17e620bf4419c01c81cc4806d60bb1639629b860494e548bf77083723d: Status 404 returned error can't find the container with id 84933f17e620bf4419c01c81cc4806d60bb1639629b860494e548bf77083723d Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.917499 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4990eb3f-e7c0-459e-92d0-4774326e8f51","Type":"ContainerStarted","Data":"2ea3306085b02df416b0d75b24c055df1a69f81e7c958590dfe6d2d1d1c305cc"} Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.917550 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"4990eb3f-e7c0-459e-92d0-4774326e8f51","Type":"ContainerStarted","Data":"3ac14fc94df6b86ac9cbc4b70a4476b9108b91ed83233cf8094b577738da3efb"} Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.923806 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-wpj5n" event={"ID":"1ac2d806-c2e2-4d10-9975-e4ae079add40","Type":"ContainerStarted","Data":"6525d58fda2d7965c52efa47e580e9686c65f950b8332cd975ed0ab4943f7b24"} Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.927527 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-749d96bfb4-7zqk4" event={"ID":"3f5ffe5f-5be4-4103-864e-56d17ac72a2d","Type":"ContainerStarted","Data":"9d68b6ea93a79d4d01c9cf3afd8074cd911fb5f325f279a559a6f1d6a48ec5b2"} Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.930324 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c847569d-3cae-43c6-94df-5032c1b52ed1","Type":"ContainerStarted","Data":"40108cf04b3939ab5b78604fafbf5b030379f8d56153960b817a8292c76a6362"} Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.937452 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x42mc" event={"ID":"670fbeeb-ca87-4024-b8d5-ddd470241386","Type":"ContainerStarted","Data":"2b24a441c9b26e2ace023f0e8d27ed879b706b1209de208ec786997acbf8cbf5"} Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.976754 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-x76bt" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.981282 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x42mc" event={"ID":"670fbeeb-ca87-4024-b8d5-ddd470241386","Type":"ContainerStarted","Data":"2146f2f6ffb43ef11195adfa078d2b97f7353aaaacd55f26cad82b007a470bde"} Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.981318 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-x76bt" event={"ID":"df24c2f4-f414-4ece-8526-5335ffea6c9a","Type":"ContainerDied","Data":"3adc88ee1b7dfb96078a8067283fa4f12f55c083f5e8c7377d3704c02a572ba8"} Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.981349 4878 scope.go:117] "RemoveContainer" containerID="78699985b60c2858784f09615bdbcba00e598028a4e2933a16bd6f855ded3f43" Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.991285 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-dns-svc\") pod \"df24c2f4-f414-4ece-8526-5335ffea6c9a\" (UID: \"df24c2f4-f414-4ece-8526-5335ffea6c9a\") " Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.991379 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-config\") pod \"df24c2f4-f414-4ece-8526-5335ffea6c9a\" (UID: \"df24c2f4-f414-4ece-8526-5335ffea6c9a\") " Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.991432 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-ovsdbserver-nb\") pod \"df24c2f4-f414-4ece-8526-5335ffea6c9a\" (UID: \"df24c2f4-f414-4ece-8526-5335ffea6c9a\") " Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.991489 4878 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-ovsdbserver-sb\") pod \"df24c2f4-f414-4ece-8526-5335ffea6c9a\" (UID: \"df24c2f4-f414-4ece-8526-5335ffea6c9a\") " Dec 02 18:37:52 crc kubenswrapper[4878]: I1202 18:37:52.991547 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx699\" (UniqueName: \"kubernetes.io/projected/df24c2f4-f414-4ece-8526-5335ffea6c9a-kube-api-access-nx699\") pod \"df24c2f4-f414-4ece-8526-5335ffea6c9a\" (UID: \"df24c2f4-f414-4ece-8526-5335ffea6c9a\") " Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.007479 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-x42mc" podStartSLOduration=26.007457341 podStartE2EDuration="26.007457341s" podCreationTimestamp="2025-12-02 18:37:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:37:52.959809924 +0000 UTC m=+1382.649428825" watchObservedRunningTime="2025-12-02 18:37:53.007457341 +0000 UTC m=+1382.697076222" Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.014470 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df24c2f4-f414-4ece-8526-5335ffea6c9a-kube-api-access-nx699" (OuterVolumeSpecName: "kube-api-access-nx699") pod "df24c2f4-f414-4ece-8526-5335ffea6c9a" (UID: "df24c2f4-f414-4ece-8526-5335ffea6c9a"). InnerVolumeSpecName "kube-api-access-nx699". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.033567 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-config" (OuterVolumeSpecName: "config") pod "df24c2f4-f414-4ece-8526-5335ffea6c9a" (UID: "df24c2f4-f414-4ece-8526-5335ffea6c9a"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.034082 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df24c2f4-f414-4ece-8526-5335ffea6c9a" (UID: "df24c2f4-f414-4ece-8526-5335ffea6c9a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.040372 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df24c2f4-f414-4ece-8526-5335ffea6c9a" (UID: "df24c2f4-f414-4ece-8526-5335ffea6c9a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.086598 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "df24c2f4-f414-4ece-8526-5335ffea6c9a" (UID: "df24c2f4-f414-4ece-8526-5335ffea6c9a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.097454 4878 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.097490 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.097500 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.097513 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df24c2f4-f414-4ece-8526-5335ffea6c9a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.097524 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx699\" (UniqueName: \"kubernetes.io/projected/df24c2f4-f414-4ece-8526-5335ffea6c9a-kube-api-access-nx699\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.427893 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9d785686f-gqxnz"] Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.973330 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-x76bt" Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.980346 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9d785686f-gqxnz" event={"ID":"70e70d85-cdcd-43e7-b2c2-dbc6386665e3","Type":"ContainerStarted","Data":"52e2b6bef4c49b60d3f8ece032e5cc9eb4f7f4c028e745bbea99f86b27d654fc"} Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.980391 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9d785686f-gqxnz" event={"ID":"70e70d85-cdcd-43e7-b2c2-dbc6386665e3","Type":"ContainerStarted","Data":"e992de4aef08d1188676bc4db04ec71c8fbe082d3841cc5ea41d838923b5be9b"} Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.984105 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f","Type":"ContainerStarted","Data":"84933f17e620bf4419c01c81cc4806d60bb1639629b860494e548bf77083723d"} Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.986749 4878 generic.go:334] "Generic (PLEG): container finished" podID="1ac2d806-c2e2-4d10-9975-e4ae079add40" containerID="ebde465b711cdac3d58ef51ac9ebe5d9fe0b0bab517f7896b8fecf125c718975" exitCode=0 Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.986830 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-wpj5n" event={"ID":"1ac2d806-c2e2-4d10-9975-e4ae079add40","Type":"ContainerDied","Data":"ebde465b711cdac3d58ef51ac9ebe5d9fe0b0bab517f7896b8fecf125c718975"} Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.989774 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-749d96bfb4-7zqk4" event={"ID":"3f5ffe5f-5be4-4103-864e-56d17ac72a2d","Type":"ContainerStarted","Data":"8aa8c86cf7ffc36c10872506db9465be039aee7c0452b05725b1225cdbde2ecf"} Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.989818 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-749d96bfb4-7zqk4" event={"ID":"3f5ffe5f-5be4-4103-864e-56d17ac72a2d","Type":"ContainerStarted","Data":"bae9ba13045d2763864e7dda241566a33276a4444577fe476a2421c99c332b3f"} Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.991125 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:37:53 crc kubenswrapper[4878]: I1202 18:37:53.992884 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c847569d-3cae-43c6-94df-5032c1b52ed1","Type":"ContainerStarted","Data":"56851e21238cf131ed97f8531c40187665c83a4914791a96262e6a6ea4d54c63"} Dec 02 18:37:54 crc kubenswrapper[4878]: I1202 18:37:54.000073 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7df302e4-7d89-4c00-b517-d4dce032ad3d","Type":"ContainerStarted","Data":"f19f8e8a27dedec47681745735d3fe1dc415d5584ee88c8706ebe7751d504659"} Dec 02 18:37:54 crc kubenswrapper[4878]: I1202 18:37:54.080569 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-749d96bfb4-7zqk4" podStartSLOduration=5.080539978 podStartE2EDuration="5.080539978s" podCreationTimestamp="2025-12-02 18:37:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:37:54.056999678 +0000 UTC m=+1383.746618559" watchObservedRunningTime="2025-12-02 18:37:54.080539978 +0000 UTC m=+1383.770158859" Dec 02 18:37:54 crc kubenswrapper[4878]: I1202 18:37:54.129143 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-x76bt"] Dec 02 18:37:54 crc kubenswrapper[4878]: I1202 18:37:54.138792 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-x76bt"] Dec 02 18:37:54 crc kubenswrapper[4878]: I1202 18:37:54.953527 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="df24c2f4-f414-4ece-8526-5335ffea6c9a" path="/var/lib/kubelet/pods/df24c2f4-f414-4ece-8526-5335ffea6c9a/volumes" Dec 02 18:37:55 crc kubenswrapper[4878]: I1202 18:37:55.013937 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9d785686f-gqxnz" event={"ID":"70e70d85-cdcd-43e7-b2c2-dbc6386665e3","Type":"ContainerStarted","Data":"916e3c7744cbe740936e76dc5bca10705d6073b9492c71da41216ffe08f0674f"} Dec 02 18:37:55 crc kubenswrapper[4878]: I1202 18:37:55.014062 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:37:55 crc kubenswrapper[4878]: I1202 18:37:55.017411 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4990eb3f-e7c0-459e-92d0-4774326e8f51","Type":"ContainerStarted","Data":"a2d83c4f21cf0308bf9aef16c8cd4a7f3df3c70e051218112f23f105a8b0e74b"} Dec 02 18:37:55 crc kubenswrapper[4878]: I1202 18:37:55.018286 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4990eb3f-e7c0-459e-92d0-4774326e8f51" containerName="glance-log" containerID="cri-o://2ea3306085b02df416b0d75b24c055df1a69f81e7c958590dfe6d2d1d1c305cc" gracePeriod=30 Dec 02 18:37:55 crc kubenswrapper[4878]: I1202 18:37:55.018455 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4990eb3f-e7c0-459e-92d0-4774326e8f51" containerName="glance-httpd" containerID="cri-o://a2d83c4f21cf0308bf9aef16c8cd4a7f3df3c70e051218112f23f105a8b0e74b" gracePeriod=30 Dec 02 18:37:55 crc kubenswrapper[4878]: I1202 18:37:55.123497 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-9d785686f-gqxnz" podStartSLOduration=3.12346612 podStartE2EDuration="3.12346612s" podCreationTimestamp="2025-12-02 18:37:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:37:55.039703932 +0000 UTC m=+1384.729322823" watchObservedRunningTime="2025-12-02 18:37:55.12346612 +0000 UTC m=+1384.813085001" Dec 02 18:37:55 crc kubenswrapper[4878]: I1202 18:37:55.180707 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=28.180664854 podStartE2EDuration="28.180664854s" podCreationTimestamp="2025-12-02 18:37:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:37:55.1218801 +0000 UTC m=+1384.811498981" watchObservedRunningTime="2025-12-02 18:37:55.180664854 +0000 UTC m=+1384.870283735" Dec 02 18:37:56 crc kubenswrapper[4878]: I1202 18:37:56.030990 4878 generic.go:334] "Generic (PLEG): container finished" podID="4990eb3f-e7c0-459e-92d0-4774326e8f51" containerID="2ea3306085b02df416b0d75b24c055df1a69f81e7c958590dfe6d2d1d1c305cc" exitCode=143 Dec 02 18:37:56 crc kubenswrapper[4878]: I1202 18:37:56.031080 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4990eb3f-e7c0-459e-92d0-4774326e8f51","Type":"ContainerDied","Data":"2ea3306085b02df416b0d75b24c055df1a69f81e7c958590dfe6d2d1d1c305cc"} Dec 02 18:37:56 crc kubenswrapper[4878]: I1202 18:37:56.035206 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-wpj5n" event={"ID":"1ac2d806-c2e2-4d10-9975-e4ae079add40","Type":"ContainerStarted","Data":"1a4e5ade1c42b2d47276d4d0574a7fdce7c94d83c08f4b534aec1430806d1d8b"} Dec 02 18:37:56 crc kubenswrapper[4878]: I1202 18:37:56.194776 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2p669"] Dec 02 18:37:56 crc kubenswrapper[4878]: E1202 18:37:56.197121 4878 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="df24c2f4-f414-4ece-8526-5335ffea6c9a" containerName="init" Dec 02 18:37:56 crc kubenswrapper[4878]: I1202 18:37:56.197313 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="df24c2f4-f414-4ece-8526-5335ffea6c9a" containerName="init" Dec 02 18:37:56 crc kubenswrapper[4878]: I1202 18:37:56.197928 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="df24c2f4-f414-4ece-8526-5335ffea6c9a" containerName="init" Dec 02 18:37:56 crc kubenswrapper[4878]: I1202 18:37:56.203595 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2p669"] Dec 02 18:37:56 crc kubenswrapper[4878]: I1202 18:37:56.208544 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2p669" Dec 02 18:37:56 crc kubenswrapper[4878]: I1202 18:37:56.310702 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c17b3e4f-ede8-45e3-86df-c2a7813744de-utilities\") pod \"redhat-operators-2p669\" (UID: \"c17b3e4f-ede8-45e3-86df-c2a7813744de\") " pod="openshift-marketplace/redhat-operators-2p669" Dec 02 18:37:56 crc kubenswrapper[4878]: I1202 18:37:56.311406 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qxtn\" (UniqueName: \"kubernetes.io/projected/c17b3e4f-ede8-45e3-86df-c2a7813744de-kube-api-access-4qxtn\") pod \"redhat-operators-2p669\" (UID: \"c17b3e4f-ede8-45e3-86df-c2a7813744de\") " pod="openshift-marketplace/redhat-operators-2p669" Dec 02 18:37:56 crc kubenswrapper[4878]: I1202 18:37:56.311527 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c17b3e4f-ede8-45e3-86df-c2a7813744de-catalog-content\") pod \"redhat-operators-2p669\" (UID: \"c17b3e4f-ede8-45e3-86df-c2a7813744de\") " 
pod="openshift-marketplace/redhat-operators-2p669" Dec 02 18:37:56 crc kubenswrapper[4878]: I1202 18:37:56.413819 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qxtn\" (UniqueName: \"kubernetes.io/projected/c17b3e4f-ede8-45e3-86df-c2a7813744de-kube-api-access-4qxtn\") pod \"redhat-operators-2p669\" (UID: \"c17b3e4f-ede8-45e3-86df-c2a7813744de\") " pod="openshift-marketplace/redhat-operators-2p669" Dec 02 18:37:56 crc kubenswrapper[4878]: I1202 18:37:56.413877 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c17b3e4f-ede8-45e3-86df-c2a7813744de-catalog-content\") pod \"redhat-operators-2p669\" (UID: \"c17b3e4f-ede8-45e3-86df-c2a7813744de\") " pod="openshift-marketplace/redhat-operators-2p669" Dec 02 18:37:56 crc kubenswrapper[4878]: I1202 18:37:56.413990 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c17b3e4f-ede8-45e3-86df-c2a7813744de-utilities\") pod \"redhat-operators-2p669\" (UID: \"c17b3e4f-ede8-45e3-86df-c2a7813744de\") " pod="openshift-marketplace/redhat-operators-2p669" Dec 02 18:37:56 crc kubenswrapper[4878]: I1202 18:37:56.414613 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c17b3e4f-ede8-45e3-86df-c2a7813744de-utilities\") pod \"redhat-operators-2p669\" (UID: \"c17b3e4f-ede8-45e3-86df-c2a7813744de\") " pod="openshift-marketplace/redhat-operators-2p669" Dec 02 18:37:56 crc kubenswrapper[4878]: I1202 18:37:56.414841 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c17b3e4f-ede8-45e3-86df-c2a7813744de-catalog-content\") pod \"redhat-operators-2p669\" (UID: \"c17b3e4f-ede8-45e3-86df-c2a7813744de\") " pod="openshift-marketplace/redhat-operators-2p669" Dec 02 18:37:56 crc 
kubenswrapper[4878]: I1202 18:37:56.440324 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qxtn\" (UniqueName: \"kubernetes.io/projected/c17b3e4f-ede8-45e3-86df-c2a7813744de-kube-api-access-4qxtn\") pod \"redhat-operators-2p669\" (UID: \"c17b3e4f-ede8-45e3-86df-c2a7813744de\") " pod="openshift-marketplace/redhat-operators-2p669" Dec 02 18:37:56 crc kubenswrapper[4878]: I1202 18:37:56.547614 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2p669" Dec 02 18:37:57 crc kubenswrapper[4878]: I1202 18:37:57.092120 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2p669"] Dec 02 18:37:58 crc kubenswrapper[4878]: I1202 18:37:58.063307 4878 generic.go:334] "Generic (PLEG): container finished" podID="c17b3e4f-ede8-45e3-86df-c2a7813744de" containerID="4a6649b3d4853299ef9df49a694086e1a715d8d5cfd757ab54ad73ab7ac08cf6" exitCode=0 Dec 02 18:37:58 crc kubenswrapper[4878]: I1202 18:37:58.063489 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2p669" event={"ID":"c17b3e4f-ede8-45e3-86df-c2a7813744de","Type":"ContainerDied","Data":"4a6649b3d4853299ef9df49a694086e1a715d8d5cfd757ab54ad73ab7ac08cf6"} Dec 02 18:37:58 crc kubenswrapper[4878]: I1202 18:37:58.064094 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2p669" event={"ID":"c17b3e4f-ede8-45e3-86df-c2a7813744de","Type":"ContainerStarted","Data":"606db30b5c1854302fc7b101c602b8ea300784fb76fb5986284da3c9359d0a5f"} Dec 02 18:37:58 crc kubenswrapper[4878]: I1202 18:37:58.068462 4878 generic.go:334] "Generic (PLEG): container finished" podID="c702f83c-0fa3-4ba7-b525-5eddf84355a8" containerID="d13e1834896505007437ee3e8dcb1e0c0c9663ed054383ffe0ba2c61aa44ae40" exitCode=0 Dec 02 18:37:58 crc kubenswrapper[4878]: I1202 18:37:58.068563 4878 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/placement-db-sync-6t8ds" event={"ID":"c702f83c-0fa3-4ba7-b525-5eddf84355a8","Type":"ContainerDied","Data":"d13e1834896505007437ee3e8dcb1e0c0c9663ed054383ffe0ba2c61aa44ae40"} Dec 02 18:37:58 crc kubenswrapper[4878]: I1202 18:37:58.073205 4878 generic.go:334] "Generic (PLEG): container finished" podID="4990eb3f-e7c0-459e-92d0-4774326e8f51" containerID="a2d83c4f21cf0308bf9aef16c8cd4a7f3df3c70e051218112f23f105a8b0e74b" exitCode=0 Dec 02 18:37:58 crc kubenswrapper[4878]: I1202 18:37:58.073271 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4990eb3f-e7c0-459e-92d0-4774326e8f51","Type":"ContainerDied","Data":"a2d83c4f21cf0308bf9aef16c8cd4a7f3df3c70e051218112f23f105a8b0e74b"} Dec 02 18:37:58 crc kubenswrapper[4878]: I1202 18:37:58.079022 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c847569d-3cae-43c6-94df-5032c1b52ed1" containerName="glance-log" containerID="cri-o://56851e21238cf131ed97f8531c40187665c83a4914791a96262e6a6ea4d54c63" gracePeriod=30 Dec 02 18:37:58 crc kubenswrapper[4878]: I1202 18:37:58.079110 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c847569d-3cae-43c6-94df-5032c1b52ed1","Type":"ContainerStarted","Data":"f26f82c91f8c6c269dadbf982eb2cd9d57501c90e745e9dae8a78986ea2f669d"} Dec 02 18:37:58 crc kubenswrapper[4878]: I1202 18:37:58.079137 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:37:58 crc kubenswrapper[4878]: I1202 18:37:58.079197 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c847569d-3cae-43c6-94df-5032c1b52ed1" containerName="glance-httpd" containerID="cri-o://f26f82c91f8c6c269dadbf982eb2cd9d57501c90e745e9dae8a78986ea2f669d" gracePeriod=30 Dec 02 18:37:58 crc 
kubenswrapper[4878]: I1202 18:37:58.151849 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=31.151824248 podStartE2EDuration="31.151824248s" podCreationTimestamp="2025-12-02 18:37:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:37:58.138830696 +0000 UTC m=+1387.828449577" watchObservedRunningTime="2025-12-02 18:37:58.151824248 +0000 UTC m=+1387.841443129" Dec 02 18:37:58 crc kubenswrapper[4878]: I1202 18:37:58.168797 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fb745b69-wpj5n" podStartSLOduration=9.168777544 podStartE2EDuration="9.168777544s" podCreationTimestamp="2025-12-02 18:37:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:37:58.162744607 +0000 UTC m=+1387.852363488" watchObservedRunningTime="2025-12-02 18:37:58.168777544 +0000 UTC m=+1387.858396425" Dec 02 18:37:58 crc kubenswrapper[4878]: I1202 18:37:58.482836 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 18:37:58 crc kubenswrapper[4878]: I1202 18:37:58.482888 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 18:37:58 crc kubenswrapper[4878]: I1202 18:37:58.605370 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 18:37:58 crc kubenswrapper[4878]: I1202 18:37:58.605439 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.093101 4878 generic.go:334] "Generic (PLEG): container finished" podID="670fbeeb-ca87-4024-b8d5-ddd470241386" 
containerID="2b24a441c9b26e2ace023f0e8d27ed879b706b1209de208ec786997acbf8cbf5" exitCode=0 Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.093664 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x42mc" event={"ID":"670fbeeb-ca87-4024-b8d5-ddd470241386","Type":"ContainerDied","Data":"2b24a441c9b26e2ace023f0e8d27ed879b706b1209de208ec786997acbf8cbf5"} Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.097444 4878 generic.go:334] "Generic (PLEG): container finished" podID="c847569d-3cae-43c6-94df-5032c1b52ed1" containerID="f26f82c91f8c6c269dadbf982eb2cd9d57501c90e745e9dae8a78986ea2f669d" exitCode=0 Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.097481 4878 generic.go:334] "Generic (PLEG): container finished" podID="c847569d-3cae-43c6-94df-5032c1b52ed1" containerID="56851e21238cf131ed97f8531c40187665c83a4914791a96262e6a6ea4d54c63" exitCode=143 Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.097509 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c847569d-3cae-43c6-94df-5032c1b52ed1","Type":"ContainerDied","Data":"f26f82c91f8c6c269dadbf982eb2cd9d57501c90e745e9dae8a78986ea2f669d"} Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.097549 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c847569d-3cae-43c6-94df-5032c1b52ed1","Type":"ContainerDied","Data":"56851e21238cf131ed97f8531c40187665c83a4914791a96262e6a6ea4d54c63"} Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.098954 4878 generic.go:334] "Generic (PLEG): container finished" podID="d0bdf1c8-9578-4295-a835-432049516d07" containerID="0b5dc9ab2161e6b65edb6dc0055e1a8cb2f405a4236c70ece22a0f7aa3ac35c1" exitCode=0 Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.099026 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-h6lds" 
event={"ID":"d0bdf1c8-9578-4295-a835-432049516d07","Type":"ContainerDied","Data":"0b5dc9ab2161e6b65edb6dc0055e1a8cb2f405a4236c70ece22a0f7aa3ac35c1"} Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.688513 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.819571 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4990eb3f-e7c0-459e-92d0-4774326e8f51-httpd-run\") pod \"4990eb3f-e7c0-459e-92d0-4774326e8f51\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.819626 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4990eb3f-e7c0-459e-92d0-4774326e8f51-config-data\") pod \"4990eb3f-e7c0-459e-92d0-4774326e8f51\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.819674 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4990eb3f-e7c0-459e-92d0-4774326e8f51-combined-ca-bundle\") pod \"4990eb3f-e7c0-459e-92d0-4774326e8f51\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.819701 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4990eb3f-e7c0-459e-92d0-4774326e8f51-scripts\") pod \"4990eb3f-e7c0-459e-92d0-4774326e8f51\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.819786 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4990eb3f-e7c0-459e-92d0-4774326e8f51-logs\") pod 
\"4990eb3f-e7c0-459e-92d0-4774326e8f51\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.819841 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"4990eb3f-e7c0-459e-92d0-4774326e8f51\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.819869 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnw4v\" (UniqueName: \"kubernetes.io/projected/4990eb3f-e7c0-459e-92d0-4774326e8f51-kube-api-access-jnw4v\") pod \"4990eb3f-e7c0-459e-92d0-4774326e8f51\" (UID: \"4990eb3f-e7c0-459e-92d0-4774326e8f51\") " Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.821995 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4990eb3f-e7c0-459e-92d0-4774326e8f51-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4990eb3f-e7c0-459e-92d0-4774326e8f51" (UID: "4990eb3f-e7c0-459e-92d0-4774326e8f51"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.825832 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4990eb3f-e7c0-459e-92d0-4774326e8f51-logs" (OuterVolumeSpecName: "logs") pod "4990eb3f-e7c0-459e-92d0-4774326e8f51" (UID: "4990eb3f-e7c0-459e-92d0-4774326e8f51"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.827836 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "4990eb3f-e7c0-459e-92d0-4774326e8f51" (UID: "4990eb3f-e7c0-459e-92d0-4774326e8f51"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.832109 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4990eb3f-e7c0-459e-92d0-4774326e8f51-scripts" (OuterVolumeSpecName: "scripts") pod "4990eb3f-e7c0-459e-92d0-4774326e8f51" (UID: "4990eb3f-e7c0-459e-92d0-4774326e8f51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.841019 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4990eb3f-e7c0-459e-92d0-4774326e8f51-kube-api-access-jnw4v" (OuterVolumeSpecName: "kube-api-access-jnw4v") pod "4990eb3f-e7c0-459e-92d0-4774326e8f51" (UID: "4990eb3f-e7c0-459e-92d0-4774326e8f51"). InnerVolumeSpecName "kube-api-access-jnw4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.885323 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4990eb3f-e7c0-459e-92d0-4774326e8f51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4990eb3f-e7c0-459e-92d0-4774326e8f51" (UID: "4990eb3f-e7c0-459e-92d0-4774326e8f51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.896496 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4990eb3f-e7c0-459e-92d0-4774326e8f51-config-data" (OuterVolumeSpecName: "config-data") pod "4990eb3f-e7c0-459e-92d0-4774326e8f51" (UID: "4990eb3f-e7c0-459e-92d0-4774326e8f51"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.921722 4878 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4990eb3f-e7c0-459e-92d0-4774326e8f51-logs\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.921800 4878 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.921817 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnw4v\" (UniqueName: \"kubernetes.io/projected/4990eb3f-e7c0-459e-92d0-4774326e8f51-kube-api-access-jnw4v\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.921836 4878 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4990eb3f-e7c0-459e-92d0-4774326e8f51-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.921849 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4990eb3f-e7c0-459e-92d0-4774326e8f51-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.921859 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4990eb3f-e7c0-459e-92d0-4774326e8f51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.921871 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4990eb3f-e7c0-459e-92d0-4774326e8f51-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:37:59 crc kubenswrapper[4878]: I1202 18:37:59.953378 4878 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.025430 4878 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.124887 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.131830 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4990eb3f-e7c0-459e-92d0-4774326e8f51","Type":"ContainerDied","Data":"3ac14fc94df6b86ac9cbc4b70a4476b9108b91ed83233cf8094b577738da3efb"} Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.131913 4878 scope.go:117] "RemoveContainer" containerID="a2d83c4f21cf0308bf9aef16c8cd4a7f3df3c70e051218112f23f105a8b0e74b" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.175634 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.190783 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.216325 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 18:38:00 crc kubenswrapper[4878]: E1202 18:38:00.218459 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4990eb3f-e7c0-459e-92d0-4774326e8f51" containerName="glance-log" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.218493 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="4990eb3f-e7c0-459e-92d0-4774326e8f51" containerName="glance-log" Dec 02 18:38:00 crc kubenswrapper[4878]: E1202 18:38:00.218554 4878 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4990eb3f-e7c0-459e-92d0-4774326e8f51" containerName="glance-httpd" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.218564 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="4990eb3f-e7c0-459e-92d0-4774326e8f51" containerName="glance-httpd" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.229462 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="4990eb3f-e7c0-459e-92d0-4774326e8f51" containerName="glance-httpd" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.229517 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="4990eb3f-e7c0-459e-92d0-4774326e8f51" containerName="glance-log" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.230994 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.231099 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.239517 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.239791 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.341673 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.341922 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.342324 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgkv2\" (UniqueName: \"kubernetes.io/projected/213894fc-d7f0-4fd6-9c48-83b91a9b7872-kube-api-access-xgkv2\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.342605 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/213894fc-d7f0-4fd6-9c48-83b91a9b7872-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.342766 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/213894fc-d7f0-4fd6-9c48-83b91a9b7872-logs\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.342853 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-config-data\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.343172 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.343267 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-scripts\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.447880 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-scripts\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.448135 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.451295 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.451404 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgkv2\" (UniqueName: 
\"kubernetes.io/projected/213894fc-d7f0-4fd6-9c48-83b91a9b7872-kube-api-access-xgkv2\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.451425 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/213894fc-d7f0-4fd6-9c48-83b91a9b7872-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.451461 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/213894fc-d7f0-4fd6-9c48-83b91a9b7872-logs\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.451489 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-config-data\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.451599 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.451773 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.452027 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.452035 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/213894fc-d7f0-4fd6-9c48-83b91a9b7872-logs\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.452297 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/213894fc-d7f0-4fd6-9c48-83b91a9b7872-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.456125 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.456467 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-config-data\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " 
pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.456546 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.477377 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgkv2\" (UniqueName: \"kubernetes.io/projected/213894fc-d7f0-4fd6-9c48-83b91a9b7872-kube-api-access-xgkv2\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.505178 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.590008 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 18:38:00 crc kubenswrapper[4878]: I1202 18:38:00.964102 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4990eb3f-e7c0-459e-92d0-4774326e8f51" path="/var/lib/kubelet/pods/4990eb3f-e7c0-459e-92d0-4774326e8f51/volumes" Dec 02 18:38:01 crc kubenswrapper[4878]: I1202 18:38:01.564710 4878 scope.go:117] "RemoveContainer" containerID="2ea3306085b02df416b0d75b24c055df1a69f81e7c958590dfe6d2d1d1c305cc" Dec 02 18:38:01 crc kubenswrapper[4878]: I1202 18:38:01.879752 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-h6lds" Dec 02 18:38:01 crc kubenswrapper[4878]: I1202 18:38:01.923194 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6t8ds" Dec 02 18:38:01 crc kubenswrapper[4878]: I1202 18:38:01.954505 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.000305 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0bdf1c8-9578-4295-a835-432049516d07-combined-ca-bundle\") pod \"d0bdf1c8-9578-4295-a835-432049516d07\" (UID: \"d0bdf1c8-9578-4295-a835-432049516d07\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.000527 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-487jg\" (UniqueName: \"kubernetes.io/projected/d0bdf1c8-9578-4295-a835-432049516d07-kube-api-access-487jg\") pod \"d0bdf1c8-9578-4295-a835-432049516d07\" (UID: \"d0bdf1c8-9578-4295-a835-432049516d07\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.000666 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d0bdf1c8-9578-4295-a835-432049516d07-db-sync-config-data\") pod \"d0bdf1c8-9578-4295-a835-432049516d07\" (UID: \"d0bdf1c8-9578-4295-a835-432049516d07\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.016380 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0bdf1c8-9578-4295-a835-432049516d07-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d0bdf1c8-9578-4295-a835-432049516d07" (UID: "d0bdf1c8-9578-4295-a835-432049516d07"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.018623 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0bdf1c8-9578-4295-a835-432049516d07-kube-api-access-487jg" (OuterVolumeSpecName: "kube-api-access-487jg") pod "d0bdf1c8-9578-4295-a835-432049516d07" (UID: "d0bdf1c8-9578-4295-a835-432049516d07"). InnerVolumeSpecName "kube-api-access-487jg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.058316 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0bdf1c8-9578-4295-a835-432049516d07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0bdf1c8-9578-4295-a835-432049516d07" (UID: "d0bdf1c8-9578-4295-a835-432049516d07"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.103036 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-scripts\") pod \"670fbeeb-ca87-4024-b8d5-ddd470241386\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.103096 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75hj8\" (UniqueName: \"kubernetes.io/projected/c702f83c-0fa3-4ba7-b525-5eddf84355a8-kube-api-access-75hj8\") pod \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\" (UID: \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.103220 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-combined-ca-bundle\") pod \"670fbeeb-ca87-4024-b8d5-ddd470241386\" (UID: 
\"670fbeeb-ca87-4024-b8d5-ddd470241386\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.103262 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-credential-keys\") pod \"670fbeeb-ca87-4024-b8d5-ddd470241386\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.103302 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-fernet-keys\") pod \"670fbeeb-ca87-4024-b8d5-ddd470241386\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.109429 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c702f83c-0fa3-4ba7-b525-5eddf84355a8-combined-ca-bundle\") pod \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\" (UID: \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.109481 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sctn2\" (UniqueName: \"kubernetes.io/projected/670fbeeb-ca87-4024-b8d5-ddd470241386-kube-api-access-sctn2\") pod \"670fbeeb-ca87-4024-b8d5-ddd470241386\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.109590 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c702f83c-0fa3-4ba7-b525-5eddf84355a8-scripts\") pod \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\" (UID: \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.109677 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c702f83c-0fa3-4ba7-b525-5eddf84355a8-config-data\") pod \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\" (UID: \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.109708 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c702f83c-0fa3-4ba7-b525-5eddf84355a8-logs\") pod \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\" (UID: \"c702f83c-0fa3-4ba7-b525-5eddf84355a8\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.109757 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-config-data\") pod \"670fbeeb-ca87-4024-b8d5-ddd470241386\" (UID: \"670fbeeb-ca87-4024-b8d5-ddd470241386\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.111292 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c702f83c-0fa3-4ba7-b525-5eddf84355a8-kube-api-access-75hj8" (OuterVolumeSpecName: "kube-api-access-75hj8") pod "c702f83c-0fa3-4ba7-b525-5eddf84355a8" (UID: "c702f83c-0fa3-4ba7-b525-5eddf84355a8"). InnerVolumeSpecName "kube-api-access-75hj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.111737 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-scripts" (OuterVolumeSpecName: "scripts") pod "670fbeeb-ca87-4024-b8d5-ddd470241386" (UID: "670fbeeb-ca87-4024-b8d5-ddd470241386"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.111807 4878 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d0bdf1c8-9578-4295-a835-432049516d07-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.111826 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0bdf1c8-9578-4295-a835-432049516d07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.111836 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75hj8\" (UniqueName: \"kubernetes.io/projected/c702f83c-0fa3-4ba7-b525-5eddf84355a8-kube-api-access-75hj8\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.111854 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-487jg\" (UniqueName: \"kubernetes.io/projected/d0bdf1c8-9578-4295-a835-432049516d07-kube-api-access-487jg\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.112030 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "670fbeeb-ca87-4024-b8d5-ddd470241386" (UID: "670fbeeb-ca87-4024-b8d5-ddd470241386"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.113509 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c702f83c-0fa3-4ba7-b525-5eddf84355a8-logs" (OuterVolumeSpecName: "logs") pod "c702f83c-0fa3-4ba7-b525-5eddf84355a8" (UID: "c702f83c-0fa3-4ba7-b525-5eddf84355a8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.115889 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "670fbeeb-ca87-4024-b8d5-ddd470241386" (UID: "670fbeeb-ca87-4024-b8d5-ddd470241386"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.117454 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c702f83c-0fa3-4ba7-b525-5eddf84355a8-scripts" (OuterVolumeSpecName: "scripts") pod "c702f83c-0fa3-4ba7-b525-5eddf84355a8" (UID: "c702f83c-0fa3-4ba7-b525-5eddf84355a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.121973 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/670fbeeb-ca87-4024-b8d5-ddd470241386-kube-api-access-sctn2" (OuterVolumeSpecName: "kube-api-access-sctn2") pod "670fbeeb-ca87-4024-b8d5-ddd470241386" (UID: "670fbeeb-ca87-4024-b8d5-ddd470241386"). InnerVolumeSpecName "kube-api-access-sctn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.159955 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-config-data" (OuterVolumeSpecName: "config-data") pod "670fbeeb-ca87-4024-b8d5-ddd470241386" (UID: "670fbeeb-ca87-4024-b8d5-ddd470241386"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.160754 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-x42mc" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.161174 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x42mc" event={"ID":"670fbeeb-ca87-4024-b8d5-ddd470241386","Type":"ContainerDied","Data":"2146f2f6ffb43ef11195adfa078d2b97f7353aaaacd55f26cad82b007a470bde"} Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.161231 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2146f2f6ffb43ef11195adfa078d2b97f7353aaaacd55f26cad82b007a470bde" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.166485 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2p669" event={"ID":"c17b3e4f-ede8-45e3-86df-c2a7813744de","Type":"ContainerStarted","Data":"22c645c8608a270f554bc3f53bfd56a3e4b406329ae18f60ed66704b9eb801d2"} Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.173084 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6t8ds" event={"ID":"c702f83c-0fa3-4ba7-b525-5eddf84355a8","Type":"ContainerDied","Data":"6909d837f3c1286ad4dec02a22c83cfe379ca9fc7ef454fa8abc821168da2e20"} Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.173124 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6909d837f3c1286ad4dec02a22c83cfe379ca9fc7ef454fa8abc821168da2e20" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.173206 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6t8ds" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.177830 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c702f83c-0fa3-4ba7-b525-5eddf84355a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c702f83c-0fa3-4ba7-b525-5eddf84355a8" (UID: "c702f83c-0fa3-4ba7-b525-5eddf84355a8"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.182098 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f","Type":"ContainerStarted","Data":"493fa75c3c2ed1060f2fe4d56f89ab4b12fb92db746356f8251f8eb02752594f"} Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.193438 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c702f83c-0fa3-4ba7-b525-5eddf84355a8-config-data" (OuterVolumeSpecName: "config-data") pod "c702f83c-0fa3-4ba7-b525-5eddf84355a8" (UID: "c702f83c-0fa3-4ba7-b525-5eddf84355a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.193566 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-h6lds" event={"ID":"d0bdf1c8-9578-4295-a835-432049516d07","Type":"ContainerDied","Data":"159b5b8146e0791c32c3f5feb11ee158eb4cb2f250ea5d55123b2da8e28c9f4e"} Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.193603 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="159b5b8146e0791c32c3f5feb11ee158eb4cb2f250ea5d55123b2da8e28c9f4e" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.193873 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-h6lds" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.198310 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "670fbeeb-ca87-4024-b8d5-ddd470241386" (UID: "670fbeeb-ca87-4024-b8d5-ddd470241386"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.198537 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7df302e4-7d89-4c00-b517-d4dce032ad3d","Type":"ContainerStarted","Data":"8a2d380ab314e3f2d2461cba9a25983e81dce31646b910f0b5d13bb0db601bd0"} Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.205291 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.214983 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c702f83c-0fa3-4ba7-b525-5eddf84355a8-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.215182 4878 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c702f83c-0fa3-4ba7-b525-5eddf84355a8-logs\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.215219 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.215256 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.215275 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.215287 4878 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.215298 4878 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/670fbeeb-ca87-4024-b8d5-ddd470241386-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.215309 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c702f83c-0fa3-4ba7-b525-5eddf84355a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.215320 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sctn2\" (UniqueName: \"kubernetes.io/projected/670fbeeb-ca87-4024-b8d5-ddd470241386-kube-api-access-sctn2\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.215332 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c702f83c-0fa3-4ba7-b525-5eddf84355a8-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.316737 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c847569d-3cae-43c6-94df-5032c1b52ed1-combined-ca-bundle\") pod \"c847569d-3cae-43c6-94df-5032c1b52ed1\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.317153 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8dkv\" (UniqueName: \"kubernetes.io/projected/c847569d-3cae-43c6-94df-5032c1b52ed1-kube-api-access-w8dkv\") pod \"c847569d-3cae-43c6-94df-5032c1b52ed1\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.317275 4878 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c847569d-3cae-43c6-94df-5032c1b52ed1-logs\") pod \"c847569d-3cae-43c6-94df-5032c1b52ed1\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.317339 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c847569d-3cae-43c6-94df-5032c1b52ed1-scripts\") pod \"c847569d-3cae-43c6-94df-5032c1b52ed1\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.317449 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c847569d-3cae-43c6-94df-5032c1b52ed1-httpd-run\") pod \"c847569d-3cae-43c6-94df-5032c1b52ed1\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.317482 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"c847569d-3cae-43c6-94df-5032c1b52ed1\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.317639 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c847569d-3cae-43c6-94df-5032c1b52ed1-config-data\") pod \"c847569d-3cae-43c6-94df-5032c1b52ed1\" (UID: \"c847569d-3cae-43c6-94df-5032c1b52ed1\") " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.323996 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c847569d-3cae-43c6-94df-5032c1b52ed1-scripts" (OuterVolumeSpecName: "scripts") pod "c847569d-3cae-43c6-94df-5032c1b52ed1" (UID: "c847569d-3cae-43c6-94df-5032c1b52ed1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.328686 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c847569d-3cae-43c6-94df-5032c1b52ed1-logs" (OuterVolumeSpecName: "logs") pod "c847569d-3cae-43c6-94df-5032c1b52ed1" (UID: "c847569d-3cae-43c6-94df-5032c1b52ed1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.328731 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c847569d-3cae-43c6-94df-5032c1b52ed1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c847569d-3cae-43c6-94df-5032c1b52ed1" (UID: "c847569d-3cae-43c6-94df-5032c1b52ed1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.329095 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "c847569d-3cae-43c6-94df-5032c1b52ed1" (UID: "c847569d-3cae-43c6-94df-5032c1b52ed1"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.329764 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c847569d-3cae-43c6-94df-5032c1b52ed1-kube-api-access-w8dkv" (OuterVolumeSpecName: "kube-api-access-w8dkv") pod "c847569d-3cae-43c6-94df-5032c1b52ed1" (UID: "c847569d-3cae-43c6-94df-5032c1b52ed1"). InnerVolumeSpecName "kube-api-access-w8dkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.359916 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c847569d-3cae-43c6-94df-5032c1b52ed1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c847569d-3cae-43c6-94df-5032c1b52ed1" (UID: "c847569d-3cae-43c6-94df-5032c1b52ed1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.398337 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c847569d-3cae-43c6-94df-5032c1b52ed1-config-data" (OuterVolumeSpecName: "config-data") pod "c847569d-3cae-43c6-94df-5032c1b52ed1" (UID: "c847569d-3cae-43c6-94df-5032c1b52ed1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.417822 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.420433 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c847569d-3cae-43c6-94df-5032c1b52ed1-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.420544 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c847569d-3cae-43c6-94df-5032c1b52ed1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.420639 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8dkv\" (UniqueName: \"kubernetes.io/projected/c847569d-3cae-43c6-94df-5032c1b52ed1-kube-api-access-w8dkv\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.420718 4878 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c847569d-3cae-43c6-94df-5032c1b52ed1-logs\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.420794 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c847569d-3cae-43c6-94df-5032c1b52ed1-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.420997 4878 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c847569d-3cae-43c6-94df-5032c1b52ed1-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.421107 4878 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.459068 4878 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 02 18:38:02 crc kubenswrapper[4878]: I1202 18:38:02.523618 4878 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.152771 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-54c9cb88c9-bzlhb"] Dec 02 18:38:03 crc kubenswrapper[4878]: E1202 18:38:03.154353 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c702f83c-0fa3-4ba7-b525-5eddf84355a8" containerName="placement-db-sync" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.154374 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="c702f83c-0fa3-4ba7-b525-5eddf84355a8" containerName="placement-db-sync" Dec 02 18:38:03 
crc kubenswrapper[4878]: E1202 18:38:03.154421 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c847569d-3cae-43c6-94df-5032c1b52ed1" containerName="glance-log" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.154427 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="c847569d-3cae-43c6-94df-5032c1b52ed1" containerName="glance-log" Dec 02 18:38:03 crc kubenswrapper[4878]: E1202 18:38:03.154459 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="670fbeeb-ca87-4024-b8d5-ddd470241386" containerName="keystone-bootstrap" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.154466 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="670fbeeb-ca87-4024-b8d5-ddd470241386" containerName="keystone-bootstrap" Dec 02 18:38:03 crc kubenswrapper[4878]: E1202 18:38:03.154481 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c847569d-3cae-43c6-94df-5032c1b52ed1" containerName="glance-httpd" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.154486 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="c847569d-3cae-43c6-94df-5032c1b52ed1" containerName="glance-httpd" Dec 02 18:38:03 crc kubenswrapper[4878]: E1202 18:38:03.154506 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0bdf1c8-9578-4295-a835-432049516d07" containerName="barbican-db-sync" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.154512 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bdf1c8-9578-4295-a835-432049516d07" containerName="barbican-db-sync" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.167496 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="c847569d-3cae-43c6-94df-5032c1b52ed1" containerName="glance-log" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.167545 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="670fbeeb-ca87-4024-b8d5-ddd470241386" containerName="keystone-bootstrap" Dec 02 18:38:03 crc 
kubenswrapper[4878]: I1202 18:38:03.167566 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="c702f83c-0fa3-4ba7-b525-5eddf84355a8" containerName="placement-db-sync" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.167595 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0bdf1c8-9578-4295-a835-432049516d07" containerName="barbican-db-sync" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.167615 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="c847569d-3cae-43c6-94df-5032c1b52ed1" containerName="glance-httpd" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.170289 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-54c9cb88c9-bzlhb" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.191557 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4cbkh" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.192517 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.195370 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.195523 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-54c9cb88c9-bzlhb"] Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.262092 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f","Type":"ContainerStarted","Data":"da4f625707e47fe3e10e555e2c8fb97e5da7bfca4d01dec059df1d5a74609aaa"} Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.278275 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5bbd45c784-zz4hz"] Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.283384 4878 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.285890 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c847569d-3cae-43c6-94df-5032c1b52ed1","Type":"ContainerDied","Data":"40108cf04b3939ab5b78604fafbf5b030379f8d56153960b817a8292c76a6362"} Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.285968 4878 scope.go:117] "RemoveContainer" containerID="f26f82c91f8c6c269dadbf982eb2cd9d57501c90e745e9dae8a78986ea2f669d" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.286227 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.290003 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.290274 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.290437 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.290464 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-52ghn" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.290564 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.291464 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c992f2a3-c18e-470d-b4dc-168a9dcd8528-logs\") pod \"barbican-worker-54c9cb88c9-bzlhb\" (UID: \"c992f2a3-c18e-470d-b4dc-168a9dcd8528\") " 
pod="openstack/barbican-worker-54c9cb88c9-bzlhb" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.291664 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-69bfdb774b-284fr"] Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.291650 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xmmk\" (UniqueName: \"kubernetes.io/projected/c992f2a3-c18e-470d-b4dc-168a9dcd8528-kube-api-access-5xmmk\") pod \"barbican-worker-54c9cb88c9-bzlhb\" (UID: \"c992f2a3-c18e-470d-b4dc-168a9dcd8528\") " pod="openstack/barbican-worker-54c9cb88c9-bzlhb" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.292805 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c992f2a3-c18e-470d-b4dc-168a9dcd8528-config-data\") pod \"barbican-worker-54c9cb88c9-bzlhb\" (UID: \"c992f2a3-c18e-470d-b4dc-168a9dcd8528\") " pod="openstack/barbican-worker-54c9cb88c9-bzlhb" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.293166 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c992f2a3-c18e-470d-b4dc-168a9dcd8528-combined-ca-bundle\") pod \"barbican-worker-54c9cb88c9-bzlhb\" (UID: \"c992f2a3-c18e-470d-b4dc-168a9dcd8528\") " pod="openstack/barbican-worker-54c9cb88c9-bzlhb" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.293291 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c992f2a3-c18e-470d-b4dc-168a9dcd8528-config-data-custom\") pod \"barbican-worker-54c9cb88c9-bzlhb\" (UID: \"c992f2a3-c18e-470d-b4dc-168a9dcd8528\") " pod="openstack/barbican-worker-54c9cb88c9-bzlhb" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.303043 4878 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"213894fc-d7f0-4fd6-9c48-83b91a9b7872","Type":"ContainerStarted","Data":"d672f4df5eb8415ce5abde765082b8f64ae7302ce3cdfbe036d952df66a342ae"} Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.303217 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.307742 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5bbd45c784-zz4hz"] Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.321007 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-69bfdb774b-284fr"] Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.321193 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.363190 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-wpj5n"] Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.363578 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fb745b69-wpj5n" podUID="1ac2d806-c2e2-4d10-9975-e4ae079add40" containerName="dnsmasq-dns" containerID="cri-o://1a4e5ade1c42b2d47276d4d0574a7fdce7c94d83c08f4b534aec1430806d1d8b" gracePeriod=10 Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.371556 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.379254 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-69687b58c4-bbvhw"] Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.395192 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c992f2a3-c18e-470d-b4dc-168a9dcd8528-logs\") pod \"barbican-worker-54c9cb88c9-bzlhb\" (UID: \"c992f2a3-c18e-470d-b4dc-168a9dcd8528\") " pod="openstack/barbican-worker-54c9cb88c9-bzlhb" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.398639 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.411813 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c992f2a3-c18e-470d-b4dc-168a9dcd8528-logs\") pod \"barbican-worker-54c9cb88c9-bzlhb\" (UID: \"c992f2a3-c18e-470d-b4dc-168a9dcd8528\") " pod="openstack/barbican-worker-54c9cb88c9-bzlhb" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.412455 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfnrm\" (UniqueName: \"kubernetes.io/projected/5d63745b-034f-4f6f-b2f7-abeca299930b-kube-api-access-rfnrm\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.412621 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d63745b-034f-4f6f-b2f7-abeca299930b-public-tls-certs\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.412648 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d63745b-034f-4f6f-b2f7-abeca299930b-internal-tls-certs\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" 
Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.412714 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d63745b-034f-4f6f-b2f7-abeca299930b-config-data\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.412791 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xmmk\" (UniqueName: \"kubernetes.io/projected/c992f2a3-c18e-470d-b4dc-168a9dcd8528-kube-api-access-5xmmk\") pod \"barbican-worker-54c9cb88c9-bzlhb\" (UID: \"c992f2a3-c18e-470d-b4dc-168a9dcd8528\") " pod="openstack/barbican-worker-54c9cb88c9-bzlhb" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.412859 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d63745b-034f-4f6f-b2f7-abeca299930b-combined-ca-bundle\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.412927 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d63745b-034f-4f6f-b2f7-abeca299930b-logs\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.413063 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c992f2a3-c18e-470d-b4dc-168a9dcd8528-config-data\") pod \"barbican-worker-54c9cb88c9-bzlhb\" (UID: \"c992f2a3-c18e-470d-b4dc-168a9dcd8528\") " pod="openstack/barbican-worker-54c9cb88c9-bzlhb" Dec 02 
18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.413086 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d63745b-034f-4f6f-b2f7-abeca299930b-scripts\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.413159 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c992f2a3-c18e-470d-b4dc-168a9dcd8528-combined-ca-bundle\") pod \"barbican-worker-54c9cb88c9-bzlhb\" (UID: \"c992f2a3-c18e-470d-b4dc-168a9dcd8528\") " pod="openstack/barbican-worker-54c9cb88c9-bzlhb" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.413288 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c992f2a3-c18e-470d-b4dc-168a9dcd8528-config-data-custom\") pod \"barbican-worker-54c9cb88c9-bzlhb\" (UID: \"c992f2a3-c18e-470d-b4dc-168a9dcd8528\") " pod="openstack/barbican-worker-54c9cb88c9-bzlhb" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.417574 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.436699 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c992f2a3-c18e-470d-b4dc-168a9dcd8528-config-data\") pod \"barbican-worker-54c9cb88c9-bzlhb\" (UID: \"c992f2a3-c18e-470d-b4dc-168a9dcd8528\") " pod="openstack/barbican-worker-54c9cb88c9-bzlhb" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.436784 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69687b58c4-bbvhw"] Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.446096 4878 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c992f2a3-c18e-470d-b4dc-168a9dcd8528-combined-ca-bundle\") pod \"barbican-worker-54c9cb88c9-bzlhb\" (UID: \"c992f2a3-c18e-470d-b4dc-168a9dcd8528\") " pod="openstack/barbican-worker-54c9cb88c9-bzlhb" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.454453 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c992f2a3-c18e-470d-b4dc-168a9dcd8528-config-data-custom\") pod \"barbican-worker-54c9cb88c9-bzlhb\" (UID: \"c992f2a3-c18e-470d-b4dc-168a9dcd8528\") " pod="openstack/barbican-worker-54c9cb88c9-bzlhb" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.474514 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-8k5pc"] Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.476221 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xmmk\" (UniqueName: \"kubernetes.io/projected/c992f2a3-c18e-470d-b4dc-168a9dcd8528-kube-api-access-5xmmk\") pod \"barbican-worker-54c9cb88c9-bzlhb\" (UID: \"c992f2a3-c18e-470d-b4dc-168a9dcd8528\") " pod="openstack/barbican-worker-54c9cb88c9-bzlhb" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.478772 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.511774 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-54c9cb88c9-bzlhb" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.517126 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4828d24-fa12-4cf6-9e5b-8864d62c8536-config-data-custom\") pod \"barbican-keystone-listener-69bfdb774b-284fr\" (UID: \"e4828d24-fa12-4cf6-9e5b-8864d62c8536\") " pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.517192 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4828d24-fa12-4cf6-9e5b-8864d62c8536-logs\") pod \"barbican-keystone-listener-69bfdb774b-284fr\" (UID: \"e4828d24-fa12-4cf6-9e5b-8864d62c8536\") " pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.517227 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-ovsdbserver-nb\") pod \"dnsmasq-dns-7d649d8c65-8k5pc\" (UID: \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\") " pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.519351 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09c6c49a-fb99-4649-9a4e-a290070d77e7-logs\") pod \"barbican-api-69687b58c4-bbvhw\" (UID: \"09c6c49a-fb99-4649-9a4e-a290070d77e7\") " pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.519439 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e4828d24-fa12-4cf6-9e5b-8864d62c8536-combined-ca-bundle\") pod \"barbican-keystone-listener-69bfdb774b-284fr\" (UID: \"e4828d24-fa12-4cf6-9e5b-8864d62c8536\") " pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.519477 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09c6c49a-fb99-4649-9a4e-a290070d77e7-config-data\") pod \"barbican-api-69687b58c4-bbvhw\" (UID: \"09c6c49a-fb99-4649-9a4e-a290070d77e7\") " pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.519528 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfnrm\" (UniqueName: \"kubernetes.io/projected/5d63745b-034f-4f6f-b2f7-abeca299930b-kube-api-access-rfnrm\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.519554 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdq6d\" (UniqueName: \"kubernetes.io/projected/8efbd07a-9303-4d37-8aa6-f660a919d2fd-kube-api-access-xdq6d\") pod \"dnsmasq-dns-7d649d8c65-8k5pc\" (UID: \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\") " pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.519585 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d63745b-034f-4f6f-b2f7-abeca299930b-public-tls-certs\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.519602 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d63745b-034f-4f6f-b2f7-abeca299930b-internal-tls-certs\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.519631 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t25jh\" (UniqueName: \"kubernetes.io/projected/09c6c49a-fb99-4649-9a4e-a290070d77e7-kube-api-access-t25jh\") pod \"barbican-api-69687b58c4-bbvhw\" (UID: \"09c6c49a-fb99-4649-9a4e-a290070d77e7\") " pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.519655 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d63745b-034f-4f6f-b2f7-abeca299930b-config-data\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.519688 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-config\") pod \"dnsmasq-dns-7d649d8c65-8k5pc\" (UID: \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\") " pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.519744 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d63745b-034f-4f6f-b2f7-abeca299930b-combined-ca-bundle\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.519780 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ggf7f\" (UniqueName: \"kubernetes.io/projected/e4828d24-fa12-4cf6-9e5b-8864d62c8536-kube-api-access-ggf7f\") pod \"barbican-keystone-listener-69bfdb774b-284fr\" (UID: \"e4828d24-fa12-4cf6-9e5b-8864d62c8536\") " pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.519819 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4828d24-fa12-4cf6-9e5b-8864d62c8536-config-data\") pod \"barbican-keystone-listener-69bfdb774b-284fr\" (UID: \"e4828d24-fa12-4cf6-9e5b-8864d62c8536\") " pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.519855 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d63745b-034f-4f6f-b2f7-abeca299930b-logs\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.519873 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-dns-svc\") pod \"dnsmasq-dns-7d649d8c65-8k5pc\" (UID: \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\") " pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.519928 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d63745b-034f-4f6f-b2f7-abeca299930b-scripts\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.519954 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09c6c49a-fb99-4649-9a4e-a290070d77e7-config-data-custom\") pod \"barbican-api-69687b58c4-bbvhw\" (UID: \"09c6c49a-fb99-4649-9a4e-a290070d77e7\") " pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.519982 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09c6c49a-fb99-4649-9a4e-a290070d77e7-combined-ca-bundle\") pod \"barbican-api-69687b58c4-bbvhw\" (UID: \"09c6c49a-fb99-4649-9a4e-a290070d77e7\") " pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.520012 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-ovsdbserver-sb\") pod \"dnsmasq-dns-7d649d8c65-8k5pc\" (UID: \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\") " pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.532018 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d63745b-034f-4f6f-b2f7-abeca299930b-public-tls-certs\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.532354 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d63745b-034f-4f6f-b2f7-abeca299930b-logs\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.532660 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 
18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.536373 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d63745b-034f-4f6f-b2f7-abeca299930b-scripts\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.537356 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d63745b-034f-4f6f-b2f7-abeca299930b-config-data\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.544334 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.555327 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d63745b-034f-4f6f-b2f7-abeca299930b-internal-tls-certs\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.556147 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfnrm\" (UniqueName: \"kubernetes.io/projected/5d63745b-034f-4f6f-b2f7-abeca299930b-kube-api-access-rfnrm\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.559685 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-67bf9d8f54-s7vnk"] Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.561601 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5d63745b-034f-4f6f-b2f7-abeca299930b-combined-ca-bundle\") pod \"placement-5bbd45c784-zz4hz\" (UID: \"5d63745b-034f-4f6f-b2f7-abeca299930b\") " pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.563930 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.570674 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.571617 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.573184 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.573486 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.575843 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.576062 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jjqdr" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.581615 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-8k5pc"] Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.607737 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.609862 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.610649 4878 scope.go:117] "RemoveContainer" containerID="56851e21238cf131ed97f8531c40187665c83a4914791a96262e6a6ea4d54c63" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.615070 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.615639 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.629832 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4828d24-fa12-4cf6-9e5b-8864d62c8536-combined-ca-bundle\") pod \"barbican-keystone-listener-69bfdb774b-284fr\" (UID: \"e4828d24-fa12-4cf6-9e5b-8864d62c8536\") " pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.686950 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09c6c49a-fb99-4649-9a4e-a290070d77e7-config-data\") pod \"barbican-api-69687b58c4-bbvhw\" (UID: \"09c6c49a-fb99-4649-9a4e-a290070d77e7\") " pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.660269 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4828d24-fa12-4cf6-9e5b-8864d62c8536-combined-ca-bundle\") pod \"barbican-keystone-listener-69bfdb774b-284fr\" (UID: \"e4828d24-fa12-4cf6-9e5b-8864d62c8536\") " pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.646305 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.709833 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdq6d\" (UniqueName: \"kubernetes.io/projected/8efbd07a-9303-4d37-8aa6-f660a919d2fd-kube-api-access-xdq6d\") pod \"dnsmasq-dns-7d649d8c65-8k5pc\" (UID: \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\") " pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.709983 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t25jh\" (UniqueName: \"kubernetes.io/projected/09c6c49a-fb99-4649-9a4e-a290070d77e7-kube-api-access-t25jh\") pod \"barbican-api-69687b58c4-bbvhw\" (UID: \"09c6c49a-fb99-4649-9a4e-a290070d77e7\") " pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.710023 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-config\") pod \"dnsmasq-dns-7d649d8c65-8k5pc\" (UID: \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\") " pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.710119 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggf7f\" (UniqueName: \"kubernetes.io/projected/e4828d24-fa12-4cf6-9e5b-8864d62c8536-kube-api-access-ggf7f\") pod \"barbican-keystone-listener-69bfdb774b-284fr\" (UID: \"e4828d24-fa12-4cf6-9e5b-8864d62c8536\") " pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.710150 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4828d24-fa12-4cf6-9e5b-8864d62c8536-config-data\") pod \"barbican-keystone-listener-69bfdb774b-284fr\" (UID: 
\"e4828d24-fa12-4cf6-9e5b-8864d62c8536\") " pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.710195 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-dns-svc\") pod \"dnsmasq-dns-7d649d8c65-8k5pc\" (UID: \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\") " pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.710342 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09c6c49a-fb99-4649-9a4e-a290070d77e7-config-data-custom\") pod \"barbican-api-69687b58c4-bbvhw\" (UID: \"09c6c49a-fb99-4649-9a4e-a290070d77e7\") " pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.711306 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09c6c49a-fb99-4649-9a4e-a290070d77e7-combined-ca-bundle\") pod \"barbican-api-69687b58c4-bbvhw\" (UID: \"09c6c49a-fb99-4649-9a4e-a290070d77e7\") " pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.711363 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-ovsdbserver-sb\") pod \"dnsmasq-dns-7d649d8c65-8k5pc\" (UID: \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\") " pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.711411 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4828d24-fa12-4cf6-9e5b-8864d62c8536-config-data-custom\") pod \"barbican-keystone-listener-69bfdb774b-284fr\" (UID: 
\"e4828d24-fa12-4cf6-9e5b-8864d62c8536\") " pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.711467 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4828d24-fa12-4cf6-9e5b-8864d62c8536-logs\") pod \"barbican-keystone-listener-69bfdb774b-284fr\" (UID: \"e4828d24-fa12-4cf6-9e5b-8864d62c8536\") " pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.711507 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-ovsdbserver-nb\") pod \"dnsmasq-dns-7d649d8c65-8k5pc\" (UID: \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\") " pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.711531 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09c6c49a-fb99-4649-9a4e-a290070d77e7-logs\") pod \"barbican-api-69687b58c4-bbvhw\" (UID: \"09c6c49a-fb99-4649-9a4e-a290070d77e7\") " pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.715045 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4828d24-fa12-4cf6-9e5b-8864d62c8536-logs\") pod \"barbican-keystone-listener-69bfdb774b-284fr\" (UID: \"e4828d24-fa12-4cf6-9e5b-8864d62c8536\") " pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.717358 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-ovsdbserver-nb\") pod \"dnsmasq-dns-7d649d8c65-8k5pc\" (UID: \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\") " 
pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.717390 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-ovsdbserver-sb\") pod \"dnsmasq-dns-7d649d8c65-8k5pc\" (UID: \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\") " pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.731618 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09c6c49a-fb99-4649-9a4e-a290070d77e7-config-data\") pod \"barbican-api-69687b58c4-bbvhw\" (UID: \"09c6c49a-fb99-4649-9a4e-a290070d77e7\") " pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.734942 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4828d24-fa12-4cf6-9e5b-8864d62c8536-config-data-custom\") pod \"barbican-keystone-listener-69bfdb774b-284fr\" (UID: \"e4828d24-fa12-4cf6-9e5b-8864d62c8536\") " pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.737813 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggf7f\" (UniqueName: \"kubernetes.io/projected/e4828d24-fa12-4cf6-9e5b-8864d62c8536-kube-api-access-ggf7f\") pod \"barbican-keystone-listener-69bfdb774b-284fr\" (UID: \"e4828d24-fa12-4cf6-9e5b-8864d62c8536\") " pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.741968 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4828d24-fa12-4cf6-9e5b-8864d62c8536-config-data\") pod \"barbican-keystone-listener-69bfdb774b-284fr\" (UID: \"e4828d24-fa12-4cf6-9e5b-8864d62c8536\") " 
pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.750925 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09c6c49a-fb99-4649-9a4e-a290070d77e7-config-data-custom\") pod \"barbican-api-69687b58c4-bbvhw\" (UID: \"09c6c49a-fb99-4649-9a4e-a290070d77e7\") " pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.765405 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09c6c49a-fb99-4649-9a4e-a290070d77e7-logs\") pod \"barbican-api-69687b58c4-bbvhw\" (UID: \"09c6c49a-fb99-4649-9a4e-a290070d77e7\") " pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.768937 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdq6d\" (UniqueName: \"kubernetes.io/projected/8efbd07a-9303-4d37-8aa6-f660a919d2fd-kube-api-access-xdq6d\") pod \"dnsmasq-dns-7d649d8c65-8k5pc\" (UID: \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\") " pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.769421 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-dns-svc\") pod \"dnsmasq-dns-7d649d8c65-8k5pc\" (UID: \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\") " pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.770552 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-config\") pod \"dnsmasq-dns-7d649d8c65-8k5pc\" (UID: \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\") " pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 
18:38:03.771881 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t25jh\" (UniqueName: \"kubernetes.io/projected/09c6c49a-fb99-4649-9a4e-a290070d77e7-kube-api-access-t25jh\") pod \"barbican-api-69687b58c4-bbvhw\" (UID: \"09c6c49a-fb99-4649-9a4e-a290070d77e7\") " pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.774821 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09c6c49a-fb99-4649-9a4e-a290070d77e7-combined-ca-bundle\") pod \"barbican-api-69687b58c4-bbvhw\" (UID: \"09c6c49a-fb99-4649-9a4e-a290070d77e7\") " pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.790295 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-67bf9d8f54-s7vnk"] Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.817324 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.817459 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7db5f\" (UniqueName: \"kubernetes.io/projected/e4cdcc03-9890-4704-a31b-e8f8858140a5-kube-api-access-7db5f\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.817519 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-fernet-keys\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.817592 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gcbf\" (UniqueName: \"kubernetes.io/projected/917539f3-4a78-4c46-a2e3-0b95342fe994-kube-api-access-7gcbf\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.817627 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-credential-keys\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.817695 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-scripts\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.817786 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.817863 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-internal-tls-certs\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.817898 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-combined-ca-bundle\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.817941 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-config-data\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.818023 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4cdcc03-9890-4704-a31b-e8f8858140a5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.818092 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-scripts\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.818166 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.818216 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-config-data\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.820761 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-public-tls-certs\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.820902 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4cdcc03-9890-4704-a31b-e8f8858140a5-logs\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.861044 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.923413 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7db5f\" (UniqueName: \"kubernetes.io/projected/e4cdcc03-9890-4704-a31b-e8f8858140a5-kube-api-access-7db5f\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc 
kubenswrapper[4878]: I1202 18:38:03.923479 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-fernet-keys\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.923526 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gcbf\" (UniqueName: \"kubernetes.io/projected/917539f3-4a78-4c46-a2e3-0b95342fe994-kube-api-access-7gcbf\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.923572 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-credential-keys\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.923622 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-scripts\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.923706 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.923757 4878 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-internal-tls-certs\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.923781 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-combined-ca-bundle\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.923813 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-config-data\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.923849 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4cdcc03-9890-4704-a31b-e8f8858140a5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.923887 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-scripts\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.923938 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.923968 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-config-data\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.923995 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-public-tls-certs\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.924076 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4cdcc03-9890-4704-a31b-e8f8858140a5-logs\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.924114 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.930430 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.935660 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.943784 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.944676 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4cdcc03-9890-4704-a31b-e8f8858140a5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.947958 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-config-data\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.950121 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-public-tls-certs\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.954977 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.956080 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.956838 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4cdcc03-9890-4704-a31b-e8f8858140a5-logs\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.958232 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-scripts\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.959221 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7db5f\" (UniqueName: \"kubernetes.io/projected/e4cdcc03-9890-4704-a31b-e8f8858140a5-kube-api-access-7db5f\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:03 crc kubenswrapper[4878]: I1202 18:38:03.981316 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" Dec 02 18:38:04 crc kubenswrapper[4878]: I1202 18:38:04.042556 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-credential-keys\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:04 crc kubenswrapper[4878]: I1202 18:38:04.047874 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-config-data\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:04 crc kubenswrapper[4878]: I1202 18:38:04.053679 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-scripts\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:04 crc kubenswrapper[4878]: I1202 18:38:04.054646 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-fernet-keys\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:04 crc kubenswrapper[4878]: I1202 18:38:04.082393 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gcbf\" (UniqueName: \"kubernetes.io/projected/917539f3-4a78-4c46-a2e3-0b95342fe994-kube-api-access-7gcbf\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:04 crc kubenswrapper[4878]: I1202 
18:38:04.085252 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:04 crc kubenswrapper[4878]: I1202 18:38:04.133729 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-combined-ca-bundle\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:04 crc kubenswrapper[4878]: I1202 18:38:04.134214 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/917539f3-4a78-4c46-a2e3-0b95342fe994-internal-tls-certs\") pod \"keystone-67bf9d8f54-s7vnk\" (UID: \"917539f3-4a78-4c46-a2e3-0b95342fe994\") " pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:04 crc kubenswrapper[4878]: I1202 18:38:04.361042 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:04 crc kubenswrapper[4878]: I1202 18:38:04.373151 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 18:38:04 crc kubenswrapper[4878]: I1202 18:38:04.399384 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f","Type":"ContainerStarted","Data":"9ca58ece3cb7172c75001849aefd22ddb2ccb59406f19e8816012520e2d17e25"} Dec 02 18:38:04 crc kubenswrapper[4878]: I1202 18:38:04.466003 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-fb745b69-wpj5n" podUID="1ac2d806-c2e2-4d10-9975-e4ae079add40" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.185:5353: connect: connection refused" Dec 02 18:38:04 crc kubenswrapper[4878]: I1202 18:38:04.519627 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5bbd45c784-zz4hz"] Dec 02 18:38:04 crc kubenswrapper[4878]: I1202 18:38:04.755086 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-54c9cb88c9-bzlhb"] Dec 02 18:38:04 crc kubenswrapper[4878]: I1202 18:38:04.885085 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-8k5pc"] Dec 02 18:38:04 crc kubenswrapper[4878]: I1202 18:38:04.983462 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c847569d-3cae-43c6-94df-5032c1b52ed1" path="/var/lib/kubelet/pods/c847569d-3cae-43c6-94df-5032c1b52ed1/volumes" Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.031719 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69687b58c4-bbvhw"] Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.130193 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-69bfdb774b-284fr"] Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.153209 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-67bf9d8f54-s7vnk"] Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 
18:38:05.453534 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" event={"ID":"e4828d24-fa12-4cf6-9e5b-8864d62c8536","Type":"ContainerStarted","Data":"0f66c9bb0bb364324c0854b9893568d091e4babbd562612760f675594826fe65"} Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.485005 4878 generic.go:334] "Generic (PLEG): container finished" podID="c17b3e4f-ede8-45e3-86df-c2a7813744de" containerID="22c645c8608a270f554bc3f53bfd56a3e4b406329ae18f60ed66704b9eb801d2" exitCode=0 Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.485823 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2p669" event={"ID":"c17b3e4f-ede8-45e3-86df-c2a7813744de","Type":"ContainerDied","Data":"22c645c8608a270f554bc3f53bfd56a3e4b406329ae18f60ed66704b9eb801d2"} Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.491926 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f","Type":"ContainerStarted","Data":"98fd77f3b471edd0646cb4eb8eeb181744f7931a2a1adc8ff993b2392f6d23ed"} Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.498213 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" event={"ID":"8efbd07a-9303-4d37-8aa6-f660a919d2fd","Type":"ContainerStarted","Data":"98ee5722fc8ebe45fac8f7bc9e540f68be347f67bbd311241af2f393676579db"} Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.502304 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-67bf9d8f54-s7vnk" event={"ID":"917539f3-4a78-4c46-a2e3-0b95342fe994","Type":"ContainerStarted","Data":"9a7beace52976b9384636cb9f522a4eb8966a99732911a57b0a309ee469e799e"} Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.536140 4878 generic.go:334] "Generic (PLEG): container finished" podID="1ac2d806-c2e2-4d10-9975-e4ae079add40" 
containerID="1a4e5ade1c42b2d47276d4d0574a7fdce7c94d83c08f4b534aec1430806d1d8b" exitCode=0 Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.536308 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-wpj5n" event={"ID":"1ac2d806-c2e2-4d10-9975-e4ae079add40","Type":"ContainerDied","Data":"1a4e5ade1c42b2d47276d4d0574a7fdce7c94d83c08f4b534aec1430806d1d8b"} Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.541167 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bbd45c784-zz4hz" event={"ID":"5d63745b-034f-4f6f-b2f7-abeca299930b","Type":"ContainerStarted","Data":"18fa7bc3138a1d291a265e103c5a9f6377b0d562c7b6813f60bb25e082470546"} Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.541215 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bbd45c784-zz4hz" event={"ID":"5d63745b-034f-4f6f-b2f7-abeca299930b","Type":"ContainerStarted","Data":"4a375c6309e052cfb31b0de2fc4a47cf7dabea90a9d05c77f3c5672238a14190"} Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.560130 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69687b58c4-bbvhw" event={"ID":"09c6c49a-fb99-4649-9a4e-a290070d77e7","Type":"ContainerStarted","Data":"8f7c0523d134294a19cbace6c37fdbea5577f28bb7167abf78da01a71cc2bf36"} Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.569740 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54c9cb88c9-bzlhb" event={"ID":"c992f2a3-c18e-470d-b4dc-168a9dcd8528","Type":"ContainerStarted","Data":"d1179e46270bb432c99d60558025cecd608f5de21c43e82ae0e39a947adf2588"} Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.580965 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"213894fc-d7f0-4fd6-9c48-83b91a9b7872","Type":"ContainerStarted","Data":"3e9fa0c715b7de0418b77e8b9910cebd2ba5173155f015d58b3ffe66bc448d5c"} Dec 02 18:38:05 crc 
kubenswrapper[4878]: I1202 18:38:05.693956 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.830888 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-59qgr"] Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.836147 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-59qgr" Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.904326 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-59qgr"] Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.969300 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt4n8\" (UniqueName: \"kubernetes.io/projected/ba4a8419-5172-47bf-a836-b88e734e919b-kube-api-access-kt4n8\") pod \"redhat-marketplace-59qgr\" (UID: \"ba4a8419-5172-47bf-a836-b88e734e919b\") " pod="openshift-marketplace/redhat-marketplace-59qgr" Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.969410 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba4a8419-5172-47bf-a836-b88e734e919b-catalog-content\") pod \"redhat-marketplace-59qgr\" (UID: \"ba4a8419-5172-47bf-a836-b88e734e919b\") " pod="openshift-marketplace/redhat-marketplace-59qgr" Dec 02 18:38:05 crc kubenswrapper[4878]: I1202 18:38:05.969488 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba4a8419-5172-47bf-a836-b88e734e919b-utilities\") pod \"redhat-marketplace-59qgr\" (UID: \"ba4a8419-5172-47bf-a836-b88e734e919b\") " pod="openshift-marketplace/redhat-marketplace-59qgr" Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.072133 4878 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba4a8419-5172-47bf-a836-b88e734e919b-catalog-content\") pod \"redhat-marketplace-59qgr\" (UID: \"ba4a8419-5172-47bf-a836-b88e734e919b\") " pod="openshift-marketplace/redhat-marketplace-59qgr" Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.072653 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba4a8419-5172-47bf-a836-b88e734e919b-utilities\") pod \"redhat-marketplace-59qgr\" (UID: \"ba4a8419-5172-47bf-a836-b88e734e919b\") " pod="openshift-marketplace/redhat-marketplace-59qgr" Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.072867 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt4n8\" (UniqueName: \"kubernetes.io/projected/ba4a8419-5172-47bf-a836-b88e734e919b-kube-api-access-kt4n8\") pod \"redhat-marketplace-59qgr\" (UID: \"ba4a8419-5172-47bf-a836-b88e734e919b\") " pod="openshift-marketplace/redhat-marketplace-59qgr" Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.073779 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba4a8419-5172-47bf-a836-b88e734e919b-catalog-content\") pod \"redhat-marketplace-59qgr\" (UID: \"ba4a8419-5172-47bf-a836-b88e734e919b\") " pod="openshift-marketplace/redhat-marketplace-59qgr" Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.077164 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba4a8419-5172-47bf-a836-b88e734e919b-utilities\") pod \"redhat-marketplace-59qgr\" (UID: \"ba4a8419-5172-47bf-a836-b88e734e919b\") " pod="openshift-marketplace/redhat-marketplace-59qgr" Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.106036 4878 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kt4n8\" (UniqueName: \"kubernetes.io/projected/ba4a8419-5172-47bf-a836-b88e734e919b-kube-api-access-kt4n8\") pod \"redhat-marketplace-59qgr\" (UID: \"ba4a8419-5172-47bf-a836-b88e734e919b\") " pod="openshift-marketplace/redhat-marketplace-59qgr" Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.190436 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-59qgr" Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.253172 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.379097 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-ovsdbserver-nb\") pod \"1ac2d806-c2e2-4d10-9975-e4ae079add40\" (UID: \"1ac2d806-c2e2-4d10-9975-e4ae079add40\") " Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.379601 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-dns-svc\") pod \"1ac2d806-c2e2-4d10-9975-e4ae079add40\" (UID: \"1ac2d806-c2e2-4d10-9975-e4ae079add40\") " Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.379638 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vktx\" (UniqueName: \"kubernetes.io/projected/1ac2d806-c2e2-4d10-9975-e4ae079add40-kube-api-access-8vktx\") pod \"1ac2d806-c2e2-4d10-9975-e4ae079add40\" (UID: \"1ac2d806-c2e2-4d10-9975-e4ae079add40\") " Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.379777 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-ovsdbserver-sb\") pod 
\"1ac2d806-c2e2-4d10-9975-e4ae079add40\" (UID: \"1ac2d806-c2e2-4d10-9975-e4ae079add40\") " Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.379883 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-config\") pod \"1ac2d806-c2e2-4d10-9975-e4ae079add40\" (UID: \"1ac2d806-c2e2-4d10-9975-e4ae079add40\") " Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.392975 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ac2d806-c2e2-4d10-9975-e4ae079add40-kube-api-access-8vktx" (OuterVolumeSpecName: "kube-api-access-8vktx") pod "1ac2d806-c2e2-4d10-9975-e4ae079add40" (UID: "1ac2d806-c2e2-4d10-9975-e4ae079add40"). InnerVolumeSpecName "kube-api-access-8vktx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.493362 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vktx\" (UniqueName: \"kubernetes.io/projected/1ac2d806-c2e2-4d10-9975-e4ae079add40-kube-api-access-8vktx\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.655767 4878 generic.go:334] "Generic (PLEG): container finished" podID="8efbd07a-9303-4d37-8aa6-f660a919d2fd" containerID="4d91e62096749d9285edb5e212deb8cd1e8db4b0d47782e31dad560146ba8b07" exitCode=0 Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.655874 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" event={"ID":"8efbd07a-9303-4d37-8aa6-f660a919d2fd","Type":"ContainerDied","Data":"4d91e62096749d9285edb5e212deb8cd1e8db4b0d47782e31dad560146ba8b07"} Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.682985 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"e4cdcc03-9890-4704-a31b-e8f8858140a5","Type":"ContainerStarted","Data":"d95c43b1a41fd9c3d82e36574b76c9d86153286d8ab56ef929caec39788920a5"} Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.708127 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69687b58c4-bbvhw" event={"ID":"09c6c49a-fb99-4649-9a4e-a290070d77e7","Type":"ContainerStarted","Data":"272226cef86fad277da895ef1012a6483832788598d7f3469ada7ff9abd42f49"} Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.720895 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb745b69-wpj5n" event={"ID":"1ac2d806-c2e2-4d10-9975-e4ae079add40","Type":"ContainerDied","Data":"6525d58fda2d7965c52efa47e580e9686c65f950b8332cd975ed0ab4943f7b24"} Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.721168 4878 scope.go:117] "RemoveContainer" containerID="1a4e5ade1c42b2d47276d4d0574a7fdce7c94d83c08f4b534aec1430806d1d8b" Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.721498 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb745b69-wpj5n" Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.938694 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1ac2d806-c2e2-4d10-9975-e4ae079add40" (UID: "1ac2d806-c2e2-4d10-9975-e4ae079add40"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.941798 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1ac2d806-c2e2-4d10-9975-e4ae079add40" (UID: "1ac2d806-c2e2-4d10-9975-e4ae079add40"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:38:06 crc kubenswrapper[4878]: I1202 18:38:06.947738 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1ac2d806-c2e2-4d10-9975-e4ae079add40" (UID: "1ac2d806-c2e2-4d10-9975-e4ae079add40"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.001939 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-config" (OuterVolumeSpecName: "config") pod "1ac2d806-c2e2-4d10-9975-e4ae079add40" (UID: "1ac2d806-c2e2-4d10-9975-e4ae079add40"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.024968 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.034021 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.034069 4878 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.034085 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ac2d806-c2e2-4d10-9975-e4ae079add40-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:07 crc kubenswrapper[4878]: 
I1202 18:38:07.176886 4878 scope.go:117] "RemoveContainer" containerID="ebde465b711cdac3d58ef51ac9ebe5d9fe0b0bab517f7896b8fecf125c718975" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.202389 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-wpj5n"] Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.218171 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fb745b69-wpj5n"] Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.229415 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-59qgr"] Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.632589 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-58bf7d9584-2nldp"] Dec 02 18:38:07 crc kubenswrapper[4878]: E1202 18:38:07.633396 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ac2d806-c2e2-4d10-9975-e4ae079add40" containerName="dnsmasq-dns" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.633411 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac2d806-c2e2-4d10-9975-e4ae079add40" containerName="dnsmasq-dns" Dec 02 18:38:07 crc kubenswrapper[4878]: E1202 18:38:07.633471 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ac2d806-c2e2-4d10-9975-e4ae079add40" containerName="init" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.633477 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac2d806-c2e2-4d10-9975-e4ae079add40" containerName="init" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.633692 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ac2d806-c2e2-4d10-9975-e4ae079add40" containerName="dnsmasq-dns" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.634979 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.650710 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.650973 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.658641 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58bf7d9584-2nldp"] Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.763694 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e21a97ae-d28a-4c3c-b669-bd186e06a311-internal-tls-certs\") pod \"barbican-api-58bf7d9584-2nldp\" (UID: \"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.763733 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e21a97ae-d28a-4c3c-b669-bd186e06a311-config-data\") pod \"barbican-api-58bf7d9584-2nldp\" (UID: \"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.763822 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e21a97ae-d28a-4c3c-b669-bd186e06a311-config-data-custom\") pod \"barbican-api-58bf7d9584-2nldp\" (UID: \"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.763865 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/e21a97ae-d28a-4c3c-b669-bd186e06a311-public-tls-certs\") pod \"barbican-api-58bf7d9584-2nldp\" (UID: \"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.763910 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e21a97ae-d28a-4c3c-b669-bd186e06a311-logs\") pod \"barbican-api-58bf7d9584-2nldp\" (UID: \"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.763972 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcq6q\" (UniqueName: \"kubernetes.io/projected/e21a97ae-d28a-4c3c-b669-bd186e06a311-kube-api-access-bcq6q\") pod \"barbican-api-58bf7d9584-2nldp\" (UID: \"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.764016 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e21a97ae-d28a-4c3c-b669-bd186e06a311-combined-ca-bundle\") pod \"barbican-api-58bf7d9584-2nldp\" (UID: \"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.783626 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2p669" event={"ID":"c17b3e4f-ede8-45e3-86df-c2a7813744de","Type":"ContainerStarted","Data":"36b8fea47be3af25f707b736f0d4408fdf797adac7e0043bfc6e153d70cec08e"} Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.807546 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bbd45c784-zz4hz" 
event={"ID":"5d63745b-034f-4f6f-b2f7-abeca299930b","Type":"ContainerStarted","Data":"7d192acd30b1015a70cbf3beeb15a5b75651b261004e9d018769e2ac0ecebd48"} Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.808229 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.813944 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-xp7h4" event={"ID":"faf8fbcd-b97a-45f4-8b17-c92fdc87d75b","Type":"ContainerStarted","Data":"0e484753605b4722b782d98e2b810d1219c36e2dc5f8eff75ce4626630c57bb5"} Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.816548 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" event={"ID":"8efbd07a-9303-4d37-8aa6-f660a919d2fd","Type":"ContainerStarted","Data":"024e0ef1c974f2d743fd83e5813cd9757bc33b2bd4060d4672fc7b8a9ecfc86a"} Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.816839 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.831772 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2p669" podStartSLOduration=5.049347461 podStartE2EDuration="11.831746688s" podCreationTimestamp="2025-12-02 18:37:56 +0000 UTC" firstStartedPulling="2025-12-02 18:37:59.554926474 +0000 UTC m=+1389.244545355" lastFinishedPulling="2025-12-02 18:38:06.337325701 +0000 UTC m=+1396.026944582" observedRunningTime="2025-12-02 18:38:07.802756679 +0000 UTC m=+1397.492375560" watchObservedRunningTime="2025-12-02 18:38:07.831746688 +0000 UTC m=+1397.521365569" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.833994 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"e4cdcc03-9890-4704-a31b-e8f8858140a5","Type":"ContainerStarted","Data":"d41dad766c59745a147f4023eefa4a896098b8792ec0eb5d76efa27e7f4db6e3"} Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.848392 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69687b58c4-bbvhw" event={"ID":"09c6c49a-fb99-4649-9a4e-a290070d77e7","Type":"ContainerStarted","Data":"0e9175d8ee94260782e8e953950083dfb524d40c3b7f64936234f48001f85ff9"} Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.849710 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.849748 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.857018 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5bbd45c784-zz4hz" podStartSLOduration=4.854626267 podStartE2EDuration="4.854626267s" podCreationTimestamp="2025-12-02 18:38:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:38:07.834702449 +0000 UTC m=+1397.524321340" watchObservedRunningTime="2025-12-02 18:38:07.854626267 +0000 UTC m=+1397.544245148" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.858154 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"213894fc-d7f0-4fd6-9c48-83b91a9b7872","Type":"ContainerStarted","Data":"2cc03287f80e41bc214a8dfa1431602a67e1e5b9e390c57def58e942b9732ed0"} Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.866013 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e21a97ae-d28a-4c3c-b669-bd186e06a311-config-data\") pod \"barbican-api-58bf7d9584-2nldp\" (UID: 
\"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.866059 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e21a97ae-d28a-4c3c-b669-bd186e06a311-internal-tls-certs\") pod \"barbican-api-58bf7d9584-2nldp\" (UID: \"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.868747 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e21a97ae-d28a-4c3c-b669-bd186e06a311-config-data-custom\") pod \"barbican-api-58bf7d9584-2nldp\" (UID: \"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.868892 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e21a97ae-d28a-4c3c-b669-bd186e06a311-public-tls-certs\") pod \"barbican-api-58bf7d9584-2nldp\" (UID: \"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.869004 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e21a97ae-d28a-4c3c-b669-bd186e06a311-logs\") pod \"barbican-api-58bf7d9584-2nldp\" (UID: \"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.869153 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcq6q\" (UniqueName: \"kubernetes.io/projected/e21a97ae-d28a-4c3c-b669-bd186e06a311-kube-api-access-bcq6q\") pod \"barbican-api-58bf7d9584-2nldp\" (UID: 
\"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.869248 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e21a97ae-d28a-4c3c-b669-bd186e06a311-combined-ca-bundle\") pod \"barbican-api-58bf7d9584-2nldp\" (UID: \"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.873708 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e21a97ae-d28a-4c3c-b669-bd186e06a311-logs\") pod \"barbican-api-58bf7d9584-2nldp\" (UID: \"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.879327 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-67bf9d8f54-s7vnk" event={"ID":"917539f3-4a78-4c46-a2e3-0b95342fe994","Type":"ContainerStarted","Data":"d3e92907607c084e78f5e2cf3573a62fd3a7811d4963e685df995c0dbb60c898"} Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.879352 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e21a97ae-d28a-4c3c-b669-bd186e06a311-public-tls-certs\") pod \"barbican-api-58bf7d9584-2nldp\" (UID: \"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.879923 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.881445 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e21a97ae-d28a-4c3c-b669-bd186e06a311-config-data\") pod \"barbican-api-58bf7d9584-2nldp\" 
(UID: \"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.888517 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e21a97ae-d28a-4c3c-b669-bd186e06a311-config-data-custom\") pod \"barbican-api-58bf7d9584-2nldp\" (UID: \"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.896702 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e21a97ae-d28a-4c3c-b669-bd186e06a311-internal-tls-certs\") pod \"barbican-api-58bf7d9584-2nldp\" (UID: \"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.904952 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-59qgr" event={"ID":"ba4a8419-5172-47bf-a836-b88e734e919b","Type":"ContainerStarted","Data":"74c717657858c80380ba3fd2c7f3922a0a643269a490c42d7c86d621e3b37915"} Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.909371 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" podStartSLOduration=4.909347115 podStartE2EDuration="4.909347115s" podCreationTimestamp="2025-12-02 18:38:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:38:07.897616371 +0000 UTC m=+1397.587235252" watchObservedRunningTime="2025-12-02 18:38:07.909347115 +0000 UTC m=+1397.598965996" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.915027 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcq6q\" (UniqueName: 
\"kubernetes.io/projected/e21a97ae-d28a-4c3c-b669-bd186e06a311-kube-api-access-bcq6q\") pod \"barbican-api-58bf7d9584-2nldp\" (UID: \"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.915495 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e21a97ae-d28a-4c3c-b669-bd186e06a311-combined-ca-bundle\") pod \"barbican-api-58bf7d9584-2nldp\" (UID: \"e21a97ae-d28a-4c3c-b669-bd186e06a311\") " pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.915925 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-xp7h4" podStartSLOduration=4.730596819 podStartE2EDuration="52.915905198s" podCreationTimestamp="2025-12-02 18:37:15 +0000 UTC" firstStartedPulling="2025-12-02 18:37:17.218454014 +0000 UTC m=+1346.908072895" lastFinishedPulling="2025-12-02 18:38:05.403762393 +0000 UTC m=+1395.093381274" observedRunningTime="2025-12-02 18:38:07.874095641 +0000 UTC m=+1397.563714522" watchObservedRunningTime="2025-12-02 18:38:07.915905198 +0000 UTC m=+1397.605524079" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.952482 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-67bf9d8f54-s7vnk" podStartSLOduration=4.952460372 podStartE2EDuration="4.952460372s" podCreationTimestamp="2025-12-02 18:38:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:38:07.924368321 +0000 UTC m=+1397.613987202" watchObservedRunningTime="2025-12-02 18:38:07.952460372 +0000 UTC m=+1397.642079253" Dec 02 18:38:07 crc kubenswrapper[4878]: I1202 18:38:07.966604 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-69687b58c4-bbvhw" 
podStartSLOduration=4.96658003 podStartE2EDuration="4.96658003s" podCreationTimestamp="2025-12-02 18:38:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:38:07.951783211 +0000 UTC m=+1397.641402092" watchObservedRunningTime="2025-12-02 18:38:07.96658003 +0000 UTC m=+1397.656198911" Dec 02 18:38:08 crc kubenswrapper[4878]: I1202 18:38:07.998887 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.998857212 podStartE2EDuration="7.998857212s" podCreationTimestamp="2025-12-02 18:38:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:38:07.989676547 +0000 UTC m=+1397.679295428" watchObservedRunningTime="2025-12-02 18:38:07.998857212 +0000 UTC m=+1397.688476093" Dec 02 18:38:08 crc kubenswrapper[4878]: I1202 18:38:08.075072 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:08 crc kubenswrapper[4878]: I1202 18:38:08.932396 4878 generic.go:334] "Generic (PLEG): container finished" podID="ba4a8419-5172-47bf-a836-b88e734e919b" containerID="f6a3d190e80818be1dd5f2dd65ddaf1f87cafa16ce13243f0e33cb2f3d8a68ba" exitCode=0 Dec 02 18:38:08 crc kubenswrapper[4878]: I1202 18:38:08.932945 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-59qgr" event={"ID":"ba4a8419-5172-47bf-a836-b88e734e919b","Type":"ContainerDied","Data":"f6a3d190e80818be1dd5f2dd65ddaf1f87cafa16ce13243f0e33cb2f3d8a68ba"} Dec 02 18:38:08 crc kubenswrapper[4878]: I1202 18:38:08.934590 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:08 crc kubenswrapper[4878]: I1202 18:38:08.970737 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ac2d806-c2e2-4d10-9975-e4ae079add40" path="/var/lib/kubelet/pods/1ac2d806-c2e2-4d10-9975-e4ae079add40/volumes" Dec 02 18:38:10 crc kubenswrapper[4878]: I1202 18:38:10.591126 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 18:38:10 crc kubenswrapper[4878]: I1202 18:38:10.591522 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 18:38:10 crc kubenswrapper[4878]: I1202 18:38:10.641461 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 18:38:10 crc kubenswrapper[4878]: I1202 18:38:10.671823 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 18:38:10 crc kubenswrapper[4878]: I1202 18:38:10.956402 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 
18:38:10 crc kubenswrapper[4878]: I1202 18:38:10.956442 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 18:38:11 crc kubenswrapper[4878]: I1202 18:38:11.977572 4878 generic.go:334] "Generic (PLEG): container finished" podID="faf8fbcd-b97a-45f4-8b17-c92fdc87d75b" containerID="0e484753605b4722b782d98e2b810d1219c36e2dc5f8eff75ce4626630c57bb5" exitCode=0 Dec 02 18:38:11 crc kubenswrapper[4878]: I1202 18:38:11.977650 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-xp7h4" event={"ID":"faf8fbcd-b97a-45f4-8b17-c92fdc87d75b","Type":"ContainerDied","Data":"0e484753605b4722b782d98e2b810d1219c36e2dc5f8eff75ce4626630c57bb5"} Dec 02 18:38:13 crc kubenswrapper[4878]: I1202 18:38:13.957506 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:14 crc kubenswrapper[4878]: I1202 18:38:14.028289 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-whmvk"] Dec 02 18:38:14 crc kubenswrapper[4878]: I1202 18:38:14.028607 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" podUID="55303a34-d76e-40fa-ba51-676df9fb7104" containerName="dnsmasq-dns" containerID="cri-o://c33f5928aa488145698a379f9b796661efbdf9a4b18f323276eb42f4a8997ab3" gracePeriod=10 Dec 02 18:38:14 crc kubenswrapper[4878]: I1202 18:38:14.062630 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 18:38:14 crc kubenswrapper[4878]: I1202 18:38:14.314209 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-xp7h4" Dec 02 18:38:14 crc kubenswrapper[4878]: I1202 18:38:14.374496 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d68c\" (UniqueName: \"kubernetes.io/projected/faf8fbcd-b97a-45f4-8b17-c92fdc87d75b-kube-api-access-2d68c\") pod \"faf8fbcd-b97a-45f4-8b17-c92fdc87d75b\" (UID: \"faf8fbcd-b97a-45f4-8b17-c92fdc87d75b\") " Dec 02 18:38:14 crc kubenswrapper[4878]: I1202 18:38:14.374914 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf8fbcd-b97a-45f4-8b17-c92fdc87d75b-combined-ca-bundle\") pod \"faf8fbcd-b97a-45f4-8b17-c92fdc87d75b\" (UID: \"faf8fbcd-b97a-45f4-8b17-c92fdc87d75b\") " Dec 02 18:38:14 crc kubenswrapper[4878]: I1202 18:38:14.374973 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf8fbcd-b97a-45f4-8b17-c92fdc87d75b-config-data\") pod \"faf8fbcd-b97a-45f4-8b17-c92fdc87d75b\" (UID: \"faf8fbcd-b97a-45f4-8b17-c92fdc87d75b\") " Dec 02 18:38:14 crc kubenswrapper[4878]: I1202 18:38:14.399670 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faf8fbcd-b97a-45f4-8b17-c92fdc87d75b-kube-api-access-2d68c" (OuterVolumeSpecName: "kube-api-access-2d68c") pod "faf8fbcd-b97a-45f4-8b17-c92fdc87d75b" (UID: "faf8fbcd-b97a-45f4-8b17-c92fdc87d75b"). InnerVolumeSpecName "kube-api-access-2d68c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:38:14 crc kubenswrapper[4878]: I1202 18:38:14.470565 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faf8fbcd-b97a-45f4-8b17-c92fdc87d75b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "faf8fbcd-b97a-45f4-8b17-c92fdc87d75b" (UID: "faf8fbcd-b97a-45f4-8b17-c92fdc87d75b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:14 crc kubenswrapper[4878]: I1202 18:38:14.478444 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf8fbcd-b97a-45f4-8b17-c92fdc87d75b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:14 crc kubenswrapper[4878]: I1202 18:38:14.478551 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d68c\" (UniqueName: \"kubernetes.io/projected/faf8fbcd-b97a-45f4-8b17-c92fdc87d75b-kube-api-access-2d68c\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:14 crc kubenswrapper[4878]: I1202 18:38:14.634420 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faf8fbcd-b97a-45f4-8b17-c92fdc87d75b-config-data" (OuterVolumeSpecName: "config-data") pod "faf8fbcd-b97a-45f4-8b17-c92fdc87d75b" (UID: "faf8fbcd-b97a-45f4-8b17-c92fdc87d75b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:14 crc kubenswrapper[4878]: I1202 18:38:14.688184 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf8fbcd-b97a-45f4-8b17-c92fdc87d75b-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:15 crc kubenswrapper[4878]: I1202 18:38:15.026881 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-xp7h4" event={"ID":"faf8fbcd-b97a-45f4-8b17-c92fdc87d75b","Type":"ContainerDied","Data":"742842d3ca7ed959edc210840d5427b52003e9aa17aa11f4a88acebbef947a8e"} Dec 02 18:38:15 crc kubenswrapper[4878]: I1202 18:38:15.026951 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="742842d3ca7ed959edc210840d5427b52003e9aa17aa11f4a88acebbef947a8e" Dec 02 18:38:15 crc kubenswrapper[4878]: I1202 18:38:15.026891 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-xp7h4" Dec 02 18:38:15 crc kubenswrapper[4878]: I1202 18:38:15.030898 4878 generic.go:334] "Generic (PLEG): container finished" podID="55303a34-d76e-40fa-ba51-676df9fb7104" containerID="c33f5928aa488145698a379f9b796661efbdf9a4b18f323276eb42f4a8997ab3" exitCode=0 Dec 02 18:38:15 crc kubenswrapper[4878]: I1202 18:38:15.030967 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" event={"ID":"55303a34-d76e-40fa-ba51-676df9fb7104","Type":"ContainerDied","Data":"c33f5928aa488145698a379f9b796661efbdf9a4b18f323276eb42f4a8997ab3"} Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.317516 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.384124 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.386921 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-dns-svc\") pod \"55303a34-d76e-40fa-ba51-676df9fb7104\" (UID: \"55303a34-d76e-40fa-ba51-676df9fb7104\") " Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.387019 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-ovsdbserver-nb\") pod \"55303a34-d76e-40fa-ba51-676df9fb7104\" (UID: \"55303a34-d76e-40fa-ba51-676df9fb7104\") " Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.387149 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zsnw\" (UniqueName: \"kubernetes.io/projected/55303a34-d76e-40fa-ba51-676df9fb7104-kube-api-access-7zsnw\") pod 
\"55303a34-d76e-40fa-ba51-676df9fb7104\" (UID: \"55303a34-d76e-40fa-ba51-676df9fb7104\") " Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.387187 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-config\") pod \"55303a34-d76e-40fa-ba51-676df9fb7104\" (UID: \"55303a34-d76e-40fa-ba51-676df9fb7104\") " Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.387319 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-ovsdbserver-sb\") pod \"55303a34-d76e-40fa-ba51-676df9fb7104\" (UID: \"55303a34-d76e-40fa-ba51-676df9fb7104\") " Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.443171 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55303a34-d76e-40fa-ba51-676df9fb7104-kube-api-access-7zsnw" (OuterVolumeSpecName: "kube-api-access-7zsnw") pod "55303a34-d76e-40fa-ba51-676df9fb7104" (UID: "55303a34-d76e-40fa-ba51-676df9fb7104"). InnerVolumeSpecName "kube-api-access-7zsnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.518891 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zsnw\" (UniqueName: \"kubernetes.io/projected/55303a34-d76e-40fa-ba51-676df9fb7104-kube-api-access-7zsnw\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.559037 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2p669" Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.559697 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2p669" Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.575789 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "55303a34-d76e-40fa-ba51-676df9fb7104" (UID: "55303a34-d76e-40fa-ba51-676df9fb7104"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.611555 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-config" (OuterVolumeSpecName: "config") pod "55303a34-d76e-40fa-ba51-676df9fb7104" (UID: "55303a34-d76e-40fa-ba51-676df9fb7104"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.611628 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "55303a34-d76e-40fa-ba51-676df9fb7104" (UID: "55303a34-d76e-40fa-ba51-676df9fb7104"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.622451 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.622731 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.622743 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.636814 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "55303a34-d76e-40fa-ba51-676df9fb7104" (UID: "55303a34-d76e-40fa-ba51-676df9fb7104"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.725697 4878 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55303a34-d76e-40fa-ba51-676df9fb7104-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.879021 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:16 crc kubenswrapper[4878]: I1202 18:38:16.881617 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:17 crc kubenswrapper[4878]: I1202 18:38:17.108774 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" event={"ID":"55303a34-d76e-40fa-ba51-676df9fb7104","Type":"ContainerDied","Data":"e91b26d9c5b7e8209fe04b079c9301eda88e7576fb15314d893a028b1849f859"} Dec 02 18:38:17 crc kubenswrapper[4878]: I1202 18:38:17.108849 4878 scope.go:117] "RemoveContainer" containerID="c33f5928aa488145698a379f9b796661efbdf9a4b18f323276eb42f4a8997ab3" Dec 02 18:38:17 crc kubenswrapper[4878]: I1202 18:38:17.108919 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" Dec 02 18:38:17 crc kubenswrapper[4878]: I1202 18:38:17.141755 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-whmvk"] Dec 02 18:38:17 crc kubenswrapper[4878]: I1202 18:38:17.152100 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-whmvk"] Dec 02 18:38:17 crc kubenswrapper[4878]: I1202 18:38:17.457293 4878 scope.go:117] "RemoveContainer" containerID="17ae2f3b6e85b687c0825e4dc3eb6924cf6d2f104c8c522ecfd3364fde9c8516" Dec 02 18:38:17 crc kubenswrapper[4878]: I1202 18:38:17.645526 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2p669" podUID="c17b3e4f-ede8-45e3-86df-c2a7813744de" containerName="registry-server" probeResult="failure" output=< Dec 02 18:38:17 crc kubenswrapper[4878]: timeout: failed to connect service ":50051" within 1s Dec 02 18:38:17 crc kubenswrapper[4878]: > Dec 02 18:38:18 crc kubenswrapper[4878]: I1202 18:38:18.054331 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58bf7d9584-2nldp"] Dec 02 18:38:18 crc kubenswrapper[4878]: I1202 18:38:18.192529 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f","Type":"ContainerStarted","Data":"3587203bf2bf05eaff37f1c704ec302312598c5be87a6141199a31adbf59a4f4"} Dec 02 18:38:18 crc kubenswrapper[4878]: I1202 18:38:18.195697 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-59qgr" event={"ID":"ba4a8419-5172-47bf-a836-b88e734e919b","Type":"ContainerStarted","Data":"c377fa2779ac3fdf18155ea193548d81162747a390e6d2aee86e829be4adeb60"} Dec 02 18:38:18 crc kubenswrapper[4878]: I1202 18:38:18.224332 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58bf7d9584-2nldp" 
event={"ID":"e21a97ae-d28a-4c3c-b669-bd186e06a311","Type":"ContainerStarted","Data":"4d8a0ffd7a1a94bf4d34b59f5000c5333b79d113e09172cd994be1c53c1294b8"} Dec 02 18:38:18 crc kubenswrapper[4878]: I1202 18:38:18.260755 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54c9cb88c9-bzlhb" event={"ID":"c992f2a3-c18e-470d-b4dc-168a9dcd8528","Type":"ContainerStarted","Data":"7a4a8d5f5b83c3b13dd00c9aac03088fdc083d3b1d615403d7fdc5007405bbba"} Dec 02 18:38:18 crc kubenswrapper[4878]: E1202 18:38:18.771326 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="7df302e4-7d89-4c00-b517-d4dce032ad3d" Dec 02 18:38:18 crc kubenswrapper[4878]: I1202 18:38:18.982457 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55303a34-d76e-40fa-ba51-676df9fb7104" path="/var/lib/kubelet/pods/55303a34-d76e-40fa-ba51-676df9fb7104/volumes" Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.292307 4878 generic.go:334] "Generic (PLEG): container finished" podID="ba4a8419-5172-47bf-a836-b88e734e919b" containerID="c377fa2779ac3fdf18155ea193548d81162747a390e6d2aee86e829be4adeb60" exitCode=0 Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.292399 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-59qgr" event={"ID":"ba4a8419-5172-47bf-a836-b88e734e919b","Type":"ContainerDied","Data":"c377fa2779ac3fdf18155ea193548d81162747a390e6d2aee86e829be4adeb60"} Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.294473 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6qqjx" event={"ID":"d7252936-ed87-47f7-b392-7d8fe8388279","Type":"ContainerStarted","Data":"bea3d4d9b0e638bd1c0e1ca7227f69f3d655d8f3beaeabc0bd0056f9a2690148"} Dec 02 18:38:19 crc kubenswrapper[4878]: 
I1202 18:38:19.311044 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58bf7d9584-2nldp" event={"ID":"e21a97ae-d28a-4c3c-b669-bd186e06a311","Type":"ContainerStarted","Data":"d57dcfa66c8c6000dce0ab47eddee0130ee5e691dbcbea4e734ee2eab87d803d"} Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.311110 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58bf7d9584-2nldp" event={"ID":"e21a97ae-d28a-4c3c-b669-bd186e06a311","Type":"ContainerStarted","Data":"bb2b05e0f44417f366ad0b63c11a594bc4727c6bd7c71469578cb973753c9679"} Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.311595 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.311720 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.318064 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7df302e4-7d89-4c00-b517-d4dce032ad3d","Type":"ContainerStarted","Data":"b1bf248f7736fc0e2a16f6369df8f9d6c3471c00c8d13baab8694b7da2bb40df"} Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.321690 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" event={"ID":"e4828d24-fa12-4cf6-9e5b-8864d62c8536","Type":"ContainerStarted","Data":"6848c77cd3ce5d373fc0803569b06990b13f710f63e7941e66b5c07ea6326932"} Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.321841 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" event={"ID":"e4828d24-fa12-4cf6-9e5b-8864d62c8536","Type":"ContainerStarted","Data":"ad1fe7313a184efbb69cc3e5a6c5cdac6c2600f757cbeb305bf8a52be04c6b60"} Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.326101 4878 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7df302e4-7d89-4c00-b517-d4dce032ad3d" containerName="ceilometer-notification-agent" containerID="cri-o://f19f8e8a27dedec47681745735d3fe1dc415d5584ee88c8706ebe7751d504659" gracePeriod=30 Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.326367 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.326434 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7df302e4-7d89-4c00-b517-d4dce032ad3d" containerName="sg-core" containerID="cri-o://8a2d380ab314e3f2d2461cba9a25983e81dce31646b910f0b5d13bb0db601bd0" gracePeriod=30 Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.326486 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7df302e4-7d89-4c00-b517-d4dce032ad3d" containerName="proxy-httpd" containerID="cri-o://b1bf248f7736fc0e2a16f6369df8f9d6c3471c00c8d13baab8694b7da2bb40df" gracePeriod=30 Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.359890 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e4cdcc03-9890-4704-a31b-e8f8858140a5","Type":"ContainerStarted","Data":"3719d30fd4b727fd8781ab06800fe1878b068f02080b2ae83116f881d227dc6d"} Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.375649 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54c9cb88c9-bzlhb" event={"ID":"c992f2a3-c18e-470d-b4dc-168a9dcd8528","Type":"ContainerStarted","Data":"cdd2488d1a08558738ffe35249f4f78d70901b47101afa565e5a805f1554e6d1"} Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.444769 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-69bfdb774b-284fr" podStartSLOduration=4.313975647 
podStartE2EDuration="16.4447429s" podCreationTimestamp="2025-12-02 18:38:03 +0000 UTC" firstStartedPulling="2025-12-02 18:38:05.326761044 +0000 UTC m=+1395.016379925" lastFinishedPulling="2025-12-02 18:38:17.457528297 +0000 UTC m=+1407.147147178" observedRunningTime="2025-12-02 18:38:19.412123928 +0000 UTC m=+1409.101742809" watchObservedRunningTime="2025-12-02 18:38:19.4447429 +0000 UTC m=+1409.134361781" Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.455562 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-58bf7d9584-2nldp" podStartSLOduration=12.455544255 podStartE2EDuration="12.455544255s" podCreationTimestamp="2025-12-02 18:38:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:38:19.373950954 +0000 UTC m=+1409.063569835" watchObservedRunningTime="2025-12-02 18:38:19.455544255 +0000 UTC m=+1409.145163136" Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.470762 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f","Type":"ContainerStarted","Data":"b486cd69b18c1df6177f2beea410b5c89050e941469268093eb37fd697080e9d"} Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.470828 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f","Type":"ContainerStarted","Data":"81ac7b392ecb7fc11d1d605726ee83a3187ec6080c06514d304268aab333c1cf"} Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.499993 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-6qqjx" podStartSLOduration=4.798152516 podStartE2EDuration="1m4.499973403s" podCreationTimestamp="2025-12-02 18:37:15 +0000 UTC" firstStartedPulling="2025-12-02 18:37:17.868622643 +0000 UTC m=+1347.558241524" lastFinishedPulling="2025-12-02 
18:38:17.57044353 +0000 UTC m=+1407.260062411" observedRunningTime="2025-12-02 18:38:19.489448336 +0000 UTC m=+1409.179067217" watchObservedRunningTime="2025-12-02 18:38:19.499973403 +0000 UTC m=+1409.189592284" Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.522626 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=16.522609335 podStartE2EDuration="16.522609335s" podCreationTimestamp="2025-12-02 18:38:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:38:19.515689611 +0000 UTC m=+1409.205308492" watchObservedRunningTime="2025-12-02 18:38:19.522609335 +0000 UTC m=+1409.212228206" Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.561100 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-54c9cb88c9-bzlhb" podStartSLOduration=5.109505645 podStartE2EDuration="16.561068019s" podCreationTimestamp="2025-12-02 18:38:03 +0000 UTC" firstStartedPulling="2025-12-02 18:38:04.730264251 +0000 UTC m=+1394.419883132" lastFinishedPulling="2025-12-02 18:38:16.181826625 +0000 UTC m=+1405.871445506" observedRunningTime="2025-12-02 18:38:19.551829212 +0000 UTC m=+1409.241448093" watchObservedRunningTime="2025-12-02 18:38:19.561068019 +0000 UTC m=+1409.250686900" Dec 02 18:38:19 crc kubenswrapper[4878]: I1202 18:38:19.600493 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:38:20 crc kubenswrapper[4878]: I1202 18:38:20.534126 4878 generic.go:334] "Generic (PLEG): container finished" podID="7df302e4-7d89-4c00-b517-d4dce032ad3d" containerID="b1bf248f7736fc0e2a16f6369df8f9d6c3471c00c8d13baab8694b7da2bb40df" exitCode=0 Dec 02 18:38:20 crc kubenswrapper[4878]: I1202 18:38:20.534659 4878 generic.go:334] "Generic (PLEG): container finished" 
podID="7df302e4-7d89-4c00-b517-d4dce032ad3d" containerID="8a2d380ab314e3f2d2461cba9a25983e81dce31646b910f0b5d13bb0db601bd0" exitCode=2 Dec 02 18:38:20 crc kubenswrapper[4878]: I1202 18:38:20.534671 4878 generic.go:334] "Generic (PLEG): container finished" podID="7df302e4-7d89-4c00-b517-d4dce032ad3d" containerID="f19f8e8a27dedec47681745735d3fe1dc415d5584ee88c8706ebe7751d504659" exitCode=0 Dec 02 18:38:20 crc kubenswrapper[4878]: I1202 18:38:20.534372 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7df302e4-7d89-4c00-b517-d4dce032ad3d","Type":"ContainerDied","Data":"b1bf248f7736fc0e2a16f6369df8f9d6c3471c00c8d13baab8694b7da2bb40df"} Dec 02 18:38:20 crc kubenswrapper[4878]: I1202 18:38:20.534747 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7df302e4-7d89-4c00-b517-d4dce032ad3d","Type":"ContainerDied","Data":"8a2d380ab314e3f2d2461cba9a25983e81dce31646b910f0b5d13bb0db601bd0"} Dec 02 18:38:20 crc kubenswrapper[4878]: I1202 18:38:20.534765 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7df302e4-7d89-4c00-b517-d4dce032ad3d","Type":"ContainerDied","Data":"f19f8e8a27dedec47681745735d3fe1dc415d5584ee88c8706ebe7751d504659"} Dec 02 18:38:20 crc kubenswrapper[4878]: I1202 18:38:20.549331 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f","Type":"ContainerStarted","Data":"0b151117f00145a299df29e73d10bc9a1d539afdd39e437c3a0457d7fcfc092c"} Dec 02 18:38:20 crc kubenswrapper[4878]: I1202 18:38:20.587734 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-59qgr" event={"ID":"ba4a8419-5172-47bf-a836-b88e734e919b","Type":"ContainerStarted","Data":"962cbf614f06a7f72cda21e5e20fc8673de1d36c0eba601c35a6237c3b1b8a6a"} Dec 02 18:38:20 crc kubenswrapper[4878]: I1202 18:38:20.629222 4878 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-59qgr" podStartSLOduration=4.778621508 podStartE2EDuration="15.629196371s" podCreationTimestamp="2025-12-02 18:38:05 +0000 UTC" firstStartedPulling="2025-12-02 18:38:08.995840727 +0000 UTC m=+1398.685459608" lastFinishedPulling="2025-12-02 18:38:19.8464156 +0000 UTC m=+1409.536034471" observedRunningTime="2025-12-02 18:38:20.614763773 +0000 UTC m=+1410.304382654" watchObservedRunningTime="2025-12-02 18:38:20.629196371 +0000 UTC m=+1410.318815252" Dec 02 18:38:20 crc kubenswrapper[4878]: I1202 18:38:20.974806 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.113581 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-sg-core-conf-yaml\") pod \"7df302e4-7d89-4c00-b517-d4dce032ad3d\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.113706 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-config-data\") pod \"7df302e4-7d89-4c00-b517-d4dce032ad3d\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.113731 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-combined-ca-bundle\") pod \"7df302e4-7d89-4c00-b517-d4dce032ad3d\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.113800 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gklnd\" (UniqueName: 
\"kubernetes.io/projected/7df302e4-7d89-4c00-b517-d4dce032ad3d-kube-api-access-gklnd\") pod \"7df302e4-7d89-4c00-b517-d4dce032ad3d\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.113877 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7df302e4-7d89-4c00-b517-d4dce032ad3d-run-httpd\") pod \"7df302e4-7d89-4c00-b517-d4dce032ad3d\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.113993 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-scripts\") pod \"7df302e4-7d89-4c00-b517-d4dce032ad3d\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.114339 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7df302e4-7d89-4c00-b517-d4dce032ad3d-log-httpd\") pod \"7df302e4-7d89-4c00-b517-d4dce032ad3d\" (UID: \"7df302e4-7d89-4c00-b517-d4dce032ad3d\") " Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.115539 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7df302e4-7d89-4c00-b517-d4dce032ad3d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7df302e4-7d89-4c00-b517-d4dce032ad3d" (UID: "7df302e4-7d89-4c00-b517-d4dce032ad3d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.116769 4878 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7df302e4-7d89-4c00-b517-d4dce032ad3d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.117112 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7df302e4-7d89-4c00-b517-d4dce032ad3d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7df302e4-7d89-4c00-b517-d4dce032ad3d" (UID: "7df302e4-7d89-4c00-b517-d4dce032ad3d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.122516 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7df302e4-7d89-4c00-b517-d4dce032ad3d-kube-api-access-gklnd" (OuterVolumeSpecName: "kube-api-access-gklnd") pod "7df302e4-7d89-4c00-b517-d4dce032ad3d" (UID: "7df302e4-7d89-4c00-b517-d4dce032ad3d"). InnerVolumeSpecName "kube-api-access-gklnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.136423 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-scripts" (OuterVolumeSpecName: "scripts") pod "7df302e4-7d89-4c00-b517-d4dce032ad3d" (UID: "7df302e4-7d89-4c00-b517-d4dce032ad3d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.190854 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7df302e4-7d89-4c00-b517-d4dce032ad3d" (UID: "7df302e4-7d89-4c00-b517-d4dce032ad3d"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.212048 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-config-data" (OuterVolumeSpecName: "config-data") pod "7df302e4-7d89-4c00-b517-d4dce032ad3d" (UID: "7df302e4-7d89-4c00-b517-d4dce032ad3d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.220000 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.220061 4878 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7df302e4-7d89-4c00-b517-d4dce032ad3d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.220072 4878 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.220082 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.220092 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gklnd\" (UniqueName: \"kubernetes.io/projected/7df302e4-7d89-4c00-b517-d4dce032ad3d-kube-api-access-gklnd\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.244914 4878 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/dnsmasq-dns-68dcc9cf6f-whmvk" podUID="55303a34-d76e-40fa-ba51-676df9fb7104" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.178:5353: i/o timeout" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.244984 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7df302e4-7d89-4c00-b517-d4dce032ad3d" (UID: "7df302e4-7d89-4c00-b517-d4dce032ad3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.323293 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df302e4-7d89-4c00-b517-d4dce032ad3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.608166 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.608262 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7df302e4-7d89-4c00-b517-d4dce032ad3d","Type":"ContainerDied","Data":"a3f3846b2d02998ab6a2221ad9cc0e2923ae9144ccb90a65819494e0c3548ff5"} Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.610729 4878 scope.go:117] "RemoveContainer" containerID="b1bf248f7736fc0e2a16f6369df8f9d6c3471c00c8d13baab8694b7da2bb40df" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.656367 4878 scope.go:117] "RemoveContainer" containerID="8a2d380ab314e3f2d2461cba9a25983e81dce31646b910f0b5d13bb0db601bd0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.691271 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.711683 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.733450 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:38:21 crc kubenswrapper[4878]: E1202 18:38:21.734211 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df302e4-7d89-4c00-b517-d4dce032ad3d" containerName="sg-core" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.734262 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df302e4-7d89-4c00-b517-d4dce032ad3d" containerName="sg-core" Dec 02 18:38:21 crc kubenswrapper[4878]: E1202 18:38:21.734281 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55303a34-d76e-40fa-ba51-676df9fb7104" containerName="dnsmasq-dns" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.734291 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="55303a34-d76e-40fa-ba51-676df9fb7104" containerName="dnsmasq-dns" Dec 02 18:38:21 crc kubenswrapper[4878]: E1202 18:38:21.734304 4878 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faf8fbcd-b97a-45f4-8b17-c92fdc87d75b" containerName="heat-db-sync" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.734313 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="faf8fbcd-b97a-45f4-8b17-c92fdc87d75b" containerName="heat-db-sync" Dec 02 18:38:21 crc kubenswrapper[4878]: E1202 18:38:21.734330 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df302e4-7d89-4c00-b517-d4dce032ad3d" containerName="ceilometer-notification-agent" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.734337 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df302e4-7d89-4c00-b517-d4dce032ad3d" containerName="ceilometer-notification-agent" Dec 02 18:38:21 crc kubenswrapper[4878]: E1202 18:38:21.734373 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df302e4-7d89-4c00-b517-d4dce032ad3d" containerName="proxy-httpd" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.734381 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df302e4-7d89-4c00-b517-d4dce032ad3d" containerName="proxy-httpd" Dec 02 18:38:21 crc kubenswrapper[4878]: E1202 18:38:21.734428 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55303a34-d76e-40fa-ba51-676df9fb7104" containerName="init" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.734437 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="55303a34-d76e-40fa-ba51-676df9fb7104" containerName="init" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.734698 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df302e4-7d89-4c00-b517-d4dce032ad3d" containerName="ceilometer-notification-agent" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.734715 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df302e4-7d89-4c00-b517-d4dce032ad3d" containerName="sg-core" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.734732 4878 
memory_manager.go:354] "RemoveStaleState removing state" podUID="55303a34-d76e-40fa-ba51-676df9fb7104" containerName="dnsmasq-dns" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.734761 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="faf8fbcd-b97a-45f4-8b17-c92fdc87d75b" containerName="heat-db-sync" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.734777 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df302e4-7d89-4c00-b517-d4dce032ad3d" containerName="proxy-httpd" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.739014 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.741910 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.742197 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.752368 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.779163 4878 scope.go:117] "RemoveContainer" containerID="f19f8e8a27dedec47681745735d3fe1dc415d5584ee88c8706ebe7751d504659" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.867893 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkfc8\" (UniqueName: \"kubernetes.io/projected/381b845a-80e0-4848-a5b6-f125d9d0cc60-kube-api-access-pkfc8\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.868005 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-config-data\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.868049 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/381b845a-80e0-4848-a5b6-f125d9d0cc60-run-httpd\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.868084 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.868125 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.868153 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/381b845a-80e0-4848-a5b6-f125d9d0cc60-log-httpd\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.868168 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-scripts\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " 
pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.971892 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/381b845a-80e0-4848-a5b6-f125d9d0cc60-log-httpd\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.972433 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-scripts\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.972582 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/381b845a-80e0-4848-a5b6-f125d9d0cc60-log-httpd\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.972854 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkfc8\" (UniqueName: \"kubernetes.io/projected/381b845a-80e0-4848-a5b6-f125d9d0cc60-kube-api-access-pkfc8\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.973189 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-config-data\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.973252 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/381b845a-80e0-4848-a5b6-f125d9d0cc60-run-httpd\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.973357 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.973392 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.974038 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/381b845a-80e0-4848-a5b6-f125d9d0cc60-run-httpd\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.983635 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-config-data\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.983711 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.984721 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.986757 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-scripts\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " pod="openstack/ceilometer-0" Dec 02 18:38:21 crc kubenswrapper[4878]: I1202 18:38:21.998787 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkfc8\" (UniqueName: \"kubernetes.io/projected/381b845a-80e0-4848-a5b6-f125d9d0cc60-kube-api-access-pkfc8\") pod \"ceilometer-0\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " pod="openstack/ceilometer-0" Dec 02 18:38:22 crc kubenswrapper[4878]: I1202 18:38:22.075759 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:38:22 crc kubenswrapper[4878]: I1202 18:38:22.600229 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-9d785686f-gqxnz" Dec 02 18:38:22 crc kubenswrapper[4878]: I1202 18:38:22.709941 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:38:22 crc kubenswrapper[4878]: I1202 18:38:22.733831 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-749d96bfb4-7zqk4"] Dec 02 18:38:22 crc kubenswrapper[4878]: I1202 18:38:22.734539 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-749d96bfb4-7zqk4" podUID="3f5ffe5f-5be4-4103-864e-56d17ac72a2d" containerName="neutron-api" containerID="cri-o://bae9ba13045d2763864e7dda241566a33276a4444577fe476a2421c99c332b3f" gracePeriod=30 Dec 02 18:38:22 crc kubenswrapper[4878]: I1202 18:38:22.734892 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-749d96bfb4-7zqk4" podUID="3f5ffe5f-5be4-4103-864e-56d17ac72a2d" containerName="neutron-httpd" containerID="cri-o://8aa8c86cf7ffc36c10872506db9465be039aee7c0452b05725b1225cdbde2ecf" gracePeriod=30 Dec 02 18:38:22 crc kubenswrapper[4878]: I1202 18:38:22.766441 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f","Type":"ContainerStarted","Data":"afba054b9a3ad6541018bc250c53af415794bc81dafd5d4ab5e7740038bd1da8"} Dec 02 18:38:22 crc kubenswrapper[4878]: I1202 18:38:22.766513 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f","Type":"ContainerStarted","Data":"e7c3c93bed1acc90a9c87b54051273e3f95750a0a56b288ad1f315812bae37e2"} Dec 02 18:38:22 crc kubenswrapper[4878]: I1202 18:38:22.962780 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7df302e4-7d89-4c00-b517-d4dce032ad3d" path="/var/lib/kubelet/pods/7df302e4-7d89-4c00-b517-d4dce032ad3d/volumes" Dec 02 18:38:23 crc kubenswrapper[4878]: I1202 18:38:23.746350 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:38:23 crc kubenswrapper[4878]: I1202 18:38:23.746715 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:38:23 crc kubenswrapper[4878]: I1202 18:38:23.841791 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f","Type":"ContainerStarted","Data":"17bae97c622d296fce7feb800bbc2ced356faee0419bee76b77f9548cc188a5e"} Dec 02 18:38:23 crc kubenswrapper[4878]: I1202 18:38:23.841847 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f","Type":"ContainerStarted","Data":"dce824dd5504d11ab3de88fa12aba4416db28d7bcc623dc39976ebc3eb5b5e0c"} Dec 02 18:38:23 crc kubenswrapper[4878]: I1202 18:38:23.841860 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f","Type":"ContainerStarted","Data":"de3ee6132e775341824de5fac4521af18d34c0427220b6cdf397812be55b0ac1"} Dec 02 18:38:23 crc kubenswrapper[4878]: I1202 18:38:23.841870 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f","Type":"ContainerStarted","Data":"57455186e2ba73da6294f65c2a9099c6b931a9b6f8541ca07d42f37a5219ca8b"} Dec 02 18:38:23 crc kubenswrapper[4878]: I1202 18:38:23.844960 4878 generic.go:334] "Generic (PLEG): container finished" podID="3f5ffe5f-5be4-4103-864e-56d17ac72a2d" containerID="8aa8c86cf7ffc36c10872506db9465be039aee7c0452b05725b1225cdbde2ecf" exitCode=0 Dec 02 18:38:23 crc kubenswrapper[4878]: I1202 18:38:23.845021 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-749d96bfb4-7zqk4" event={"ID":"3f5ffe5f-5be4-4103-864e-56d17ac72a2d","Type":"ContainerDied","Data":"8aa8c86cf7ffc36c10872506db9465be039aee7c0452b05725b1225cdbde2ecf"} Dec 02 18:38:23 crc kubenswrapper[4878]: I1202 18:38:23.848840 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"381b845a-80e0-4848-a5b6-f125d9d0cc60","Type":"ContainerStarted","Data":"cba5c992af90a860edb1e71507dee2592ba3ec291c6acf5af10c0d5d056f5420"} Dec 02 18:38:23 crc kubenswrapper[4878]: I1202 18:38:23.848902 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"381b845a-80e0-4848-a5b6-f125d9d0cc60","Type":"ContainerStarted","Data":"6c28dfa458319d251cadfd33c4b01d054ce2ffee1e8f055d419ea725107cee9a"} Dec 02 18:38:24 crc kubenswrapper[4878]: I1202 18:38:24.374536 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 18:38:24 crc kubenswrapper[4878]: I1202 18:38:24.374889 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 18:38:24 crc kubenswrapper[4878]: I1202 18:38:24.464003 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 18:38:24 crc kubenswrapper[4878]: I1202 18:38:24.470021 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/glance-default-external-api-0" Dec 02 18:38:24 crc kubenswrapper[4878]: I1202 18:38:24.863157 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"381b845a-80e0-4848-a5b6-f125d9d0cc60","Type":"ContainerStarted","Data":"ffa5d6769d2f8e9ae8403272366b710f29b469605473dc2b96d03758fbf74656"} Dec 02 18:38:24 crc kubenswrapper[4878]: I1202 18:38:24.872748 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f","Type":"ContainerStarted","Data":"dbe1fb795eb8d54e05d18239b5ec2bb999d6de07bcb06cddb58b5bb866d1dc6a"} Dec 02 18:38:24 crc kubenswrapper[4878]: I1202 18:38:24.873607 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 18:38:24 crc kubenswrapper[4878]: I1202 18:38:24.873669 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 18:38:24 crc kubenswrapper[4878]: I1202 18:38:24.923009 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=91.227713023 podStartE2EDuration="1m59.922980923s" podCreationTimestamp="2025-12-02 18:36:25 +0000 UTC" firstStartedPulling="2025-12-02 18:37:52.935166569 +0000 UTC m=+1382.624785450" lastFinishedPulling="2025-12-02 18:38:21.630434469 +0000 UTC m=+1411.320053350" observedRunningTime="2025-12-02 18:38:24.918153864 +0000 UTC m=+1414.607772755" watchObservedRunningTime="2025-12-02 18:38:24.922980923 +0000 UTC m=+1414.612599814" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.262981 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64667c4f57-zgjd6"] Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.293586 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.306425 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.332970 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64667c4f57-zgjd6"] Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.346521 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb46x\" (UniqueName: \"kubernetes.io/projected/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-kube-api-access-cb46x\") pod \"dnsmasq-dns-64667c4f57-zgjd6\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.347073 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-dns-swift-storage-0\") pod \"dnsmasq-dns-64667c4f57-zgjd6\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.347472 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-dns-svc\") pod \"dnsmasq-dns-64667c4f57-zgjd6\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.347598 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-ovsdbserver-nb\") pod \"dnsmasq-dns-64667c4f57-zgjd6\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " 
pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.347704 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-config\") pod \"dnsmasq-dns-64667c4f57-zgjd6\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.347813 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-ovsdbserver-sb\") pod \"dnsmasq-dns-64667c4f57-zgjd6\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.347884 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.450660 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb46x\" (UniqueName: \"kubernetes.io/projected/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-kube-api-access-cb46x\") pod \"dnsmasq-dns-64667c4f57-zgjd6\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.451164 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-dns-swift-storage-0\") pod \"dnsmasq-dns-64667c4f57-zgjd6\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.451392 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-dns-svc\") pod \"dnsmasq-dns-64667c4f57-zgjd6\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.451491 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-ovsdbserver-nb\") pod \"dnsmasq-dns-64667c4f57-zgjd6\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.451617 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-config\") pod \"dnsmasq-dns-64667c4f57-zgjd6\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.451728 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-ovsdbserver-sb\") pod \"dnsmasq-dns-64667c4f57-zgjd6\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.453065 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-ovsdbserver-sb\") pod \"dnsmasq-dns-64667c4f57-zgjd6\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.453108 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-dns-svc\") pod 
\"dnsmasq-dns-64667c4f57-zgjd6\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.453313 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-config\") pod \"dnsmasq-dns-64667c4f57-zgjd6\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.454026 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-ovsdbserver-nb\") pod \"dnsmasq-dns-64667c4f57-zgjd6\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.457853 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-dns-swift-storage-0\") pod \"dnsmasq-dns-64667c4f57-zgjd6\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.476069 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb46x\" (UniqueName: \"kubernetes.io/projected/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-kube-api-access-cb46x\") pod \"dnsmasq-dns-64667c4f57-zgjd6\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.688320 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:25 crc kubenswrapper[4878]: I1202 18:38:25.937640 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"381b845a-80e0-4848-a5b6-f125d9d0cc60","Type":"ContainerStarted","Data":"5bb8357ad01c4521f791b9314dca2054ded3eb44444c45f02438ff7a1cc618e9"} Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.191401 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-59qgr" Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.194931 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-59qgr" Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.265695 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-59qgr" Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.289954 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64667c4f57-zgjd6"] Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.701687 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.797329 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-config\") pod \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\" (UID: \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\") " Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.797919 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmkmk\" (UniqueName: \"kubernetes.io/projected/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-kube-api-access-dmkmk\") pod \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\" (UID: \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\") " Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.797971 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-combined-ca-bundle\") pod \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\" (UID: \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\") " Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.798275 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-ovndb-tls-certs\") pod \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\" (UID: \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\") " Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.798479 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-httpd-config\") pod \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\" (UID: \"3f5ffe5f-5be4-4103-864e-56d17ac72a2d\") " Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.826789 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-kube-api-access-dmkmk" (OuterVolumeSpecName: "kube-api-access-dmkmk") pod "3f5ffe5f-5be4-4103-864e-56d17ac72a2d" (UID: "3f5ffe5f-5be4-4103-864e-56d17ac72a2d"). InnerVolumeSpecName "kube-api-access-dmkmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.834664 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3f5ffe5f-5be4-4103-864e-56d17ac72a2d" (UID: "3f5ffe5f-5be4-4103-864e-56d17ac72a2d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.906987 4878 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.907261 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmkmk\" (UniqueName: \"kubernetes.io/projected/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-kube-api-access-dmkmk\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.925579 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f5ffe5f-5be4-4103-864e-56d17ac72a2d" (UID: "3f5ffe5f-5be4-4103-864e-56d17ac72a2d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.935808 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "3f5ffe5f-5be4-4103-864e-56d17ac72a2d" (UID: "3f5ffe5f-5be4-4103-864e-56d17ac72a2d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.956823 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-config" (OuterVolumeSpecName: "config") pod "3f5ffe5f-5be4-4103-864e-56d17ac72a2d" (UID: "3f5ffe5f-5be4-4103-864e-56d17ac72a2d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.975118 4878 generic.go:334] "Generic (PLEG): container finished" podID="3f5ffe5f-5be4-4103-864e-56d17ac72a2d" containerID="bae9ba13045d2763864e7dda241566a33276a4444577fe476a2421c99c332b3f" exitCode=0 Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.975252 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-749d96bfb4-7zqk4" Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.976996 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" event={"ID":"9aca3209-3fd0-41cb-8bf6-5fe729d0547a","Type":"ContainerStarted","Data":"65d24f5b84fc2da61e46c3e0e84e4bc9c3eb3ef2d7cde58a4f3fba67febcc93f"} Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.977124 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-749d96bfb4-7zqk4" event={"ID":"3f5ffe5f-5be4-4103-864e-56d17ac72a2d","Type":"ContainerDied","Data":"bae9ba13045d2763864e7dda241566a33276a4444577fe476a2421c99c332b3f"} Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.977256 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-749d96bfb4-7zqk4" event={"ID":"3f5ffe5f-5be4-4103-864e-56d17ac72a2d","Type":"ContainerDied","Data":"9d68b6ea93a79d4d01c9cf3afd8074cd911fb5f325f279a559a6f1d6a48ec5b2"} Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.977342 4878 scope.go:117] "RemoveContainer" containerID="8aa8c86cf7ffc36c10872506db9465be039aee7c0452b05725b1225cdbde2ecf" Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.979827 4878 generic.go:334] "Generic (PLEG): container finished" podID="d7252936-ed87-47f7-b392-7d8fe8388279" containerID="bea3d4d9b0e638bd1c0e1ca7227f69f3d655d8f3beaeabc0bd0056f9a2690148" exitCode=0 Dec 02 18:38:26 crc kubenswrapper[4878]: I1202 18:38:26.980344 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6qqjx" event={"ID":"d7252936-ed87-47f7-b392-7d8fe8388279","Type":"ContainerDied","Data":"bea3d4d9b0e638bd1c0e1ca7227f69f3d655d8f3beaeabc0bd0056f9a2690148"} Dec 02 18:38:27 crc kubenswrapper[4878]: I1202 18:38:27.018746 4878 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-ovndb-tls-certs\") on node \"crc\" DevicePath 
\"\"" Dec 02 18:38:27 crc kubenswrapper[4878]: I1202 18:38:27.018782 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:27 crc kubenswrapper[4878]: I1202 18:38:27.018794 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f5ffe5f-5be4-4103-864e-56d17ac72a2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:27 crc kubenswrapper[4878]: I1202 18:38:27.050586 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-749d96bfb4-7zqk4"] Dec 02 18:38:27 crc kubenswrapper[4878]: I1202 18:38:27.054569 4878 scope.go:117] "RemoveContainer" containerID="bae9ba13045d2763864e7dda241566a33276a4444577fe476a2421c99c332b3f" Dec 02 18:38:27 crc kubenswrapper[4878]: I1202 18:38:27.054713 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-59qgr" Dec 02 18:38:27 crc kubenswrapper[4878]: I1202 18:38:27.060649 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-749d96bfb4-7zqk4"] Dec 02 18:38:27 crc kubenswrapper[4878]: I1202 18:38:27.103306 4878 scope.go:117] "RemoveContainer" containerID="8aa8c86cf7ffc36c10872506db9465be039aee7c0452b05725b1225cdbde2ecf" Dec 02 18:38:27 crc kubenswrapper[4878]: E1202 18:38:27.105547 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aa8c86cf7ffc36c10872506db9465be039aee7c0452b05725b1225cdbde2ecf\": container with ID starting with 8aa8c86cf7ffc36c10872506db9465be039aee7c0452b05725b1225cdbde2ecf not found: ID does not exist" containerID="8aa8c86cf7ffc36c10872506db9465be039aee7c0452b05725b1225cdbde2ecf" Dec 02 18:38:27 crc kubenswrapper[4878]: I1202 18:38:27.105591 4878 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8aa8c86cf7ffc36c10872506db9465be039aee7c0452b05725b1225cdbde2ecf"} err="failed to get container status \"8aa8c86cf7ffc36c10872506db9465be039aee7c0452b05725b1225cdbde2ecf\": rpc error: code = NotFound desc = could not find container \"8aa8c86cf7ffc36c10872506db9465be039aee7c0452b05725b1225cdbde2ecf\": container with ID starting with 8aa8c86cf7ffc36c10872506db9465be039aee7c0452b05725b1225cdbde2ecf not found: ID does not exist" Dec 02 18:38:27 crc kubenswrapper[4878]: I1202 18:38:27.105620 4878 scope.go:117] "RemoveContainer" containerID="bae9ba13045d2763864e7dda241566a33276a4444577fe476a2421c99c332b3f" Dec 02 18:38:27 crc kubenswrapper[4878]: E1202 18:38:27.106387 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bae9ba13045d2763864e7dda241566a33276a4444577fe476a2421c99c332b3f\": container with ID starting with bae9ba13045d2763864e7dda241566a33276a4444577fe476a2421c99c332b3f not found: ID does not exist" containerID="bae9ba13045d2763864e7dda241566a33276a4444577fe476a2421c99c332b3f" Dec 02 18:38:27 crc kubenswrapper[4878]: I1202 18:38:27.106417 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bae9ba13045d2763864e7dda241566a33276a4444577fe476a2421c99c332b3f"} err="failed to get container status \"bae9ba13045d2763864e7dda241566a33276a4444577fe476a2421c99c332b3f\": rpc error: code = NotFound desc = could not find container \"bae9ba13045d2763864e7dda241566a33276a4444577fe476a2421c99c332b3f\": container with ID starting with bae9ba13045d2763864e7dda241566a33276a4444577fe476a2421c99c332b3f not found: ID does not exist" Dec 02 18:38:27 crc kubenswrapper[4878]: I1202 18:38:27.326203 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-59qgr"] Dec 02 18:38:27 crc kubenswrapper[4878]: I1202 18:38:27.614800 4878 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-2p669" podUID="c17b3e4f-ede8-45e3-86df-c2a7813744de" containerName="registry-server" probeResult="failure" output=< Dec 02 18:38:27 crc kubenswrapper[4878]: timeout: failed to connect service ":50051" within 1s Dec 02 18:38:27 crc kubenswrapper[4878]: > Dec 02 18:38:27 crc kubenswrapper[4878]: I1202 18:38:27.752872 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 18:38:27 crc kubenswrapper[4878]: I1202 18:38:27.753000 4878 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 18:38:27 crc kubenswrapper[4878]: I1202 18:38:27.764697 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 18:38:27 crc kubenswrapper[4878]: I1202 18:38:27.877896 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58bf7d9584-2nldp" Dec 02 18:38:27 crc kubenswrapper[4878]: I1202 18:38:27.981297 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-69687b58c4-bbvhw"] Dec 02 18:38:27 crc kubenswrapper[4878]: I1202 18:38:27.982385 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-69687b58c4-bbvhw" podUID="09c6c49a-fb99-4649-9a4e-a290070d77e7" containerName="barbican-api-log" containerID="cri-o://272226cef86fad277da895ef1012a6483832788598d7f3469ada7ff9abd42f49" gracePeriod=30 Dec 02 18:38:27 crc kubenswrapper[4878]: I1202 18:38:27.982587 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-69687b58c4-bbvhw" podUID="09c6c49a-fb99-4649-9a4e-a290070d77e7" containerName="barbican-api" containerID="cri-o://0e9175d8ee94260782e8e953950083dfb524d40c3b7f64936234f48001f85ff9" gracePeriod=30 Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.046869 4878 generic.go:334] "Generic (PLEG): container 
finished" podID="9aca3209-3fd0-41cb-8bf6-5fe729d0547a" containerID="60430abc92b052e2774f1619fe0cef30870571e0b0864eebef93b01fe57ec0c8" exitCode=0 Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.046994 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" event={"ID":"9aca3209-3fd0-41cb-8bf6-5fe729d0547a","Type":"ContainerDied","Data":"60430abc92b052e2774f1619fe0cef30870571e0b0864eebef93b01fe57ec0c8"} Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.057851 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"381b845a-80e0-4848-a5b6-f125d9d0cc60","Type":"ContainerStarted","Data":"2549639c96776d6df03159db76377922bf774a26eef4efaeb13efbf42de94983"} Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.059194 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.127562 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.055063812 podStartE2EDuration="7.127521208s" podCreationTimestamp="2025-12-02 18:38:21 +0000 UTC" firstStartedPulling="2025-12-02 18:38:22.674469556 +0000 UTC m=+1412.364088437" lastFinishedPulling="2025-12-02 18:38:26.746926952 +0000 UTC m=+1416.436545833" observedRunningTime="2025-12-02 18:38:28.108041554 +0000 UTC m=+1417.797660435" watchObservedRunningTime="2025-12-02 18:38:28.127521208 +0000 UTC m=+1417.817140089" Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.644067 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.775481 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj8s8\" (UniqueName: \"kubernetes.io/projected/d7252936-ed87-47f7-b392-7d8fe8388279-kube-api-access-qj8s8\") pod \"d7252936-ed87-47f7-b392-7d8fe8388279\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.775696 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-config-data\") pod \"d7252936-ed87-47f7-b392-7d8fe8388279\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.775860 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-scripts\") pod \"d7252936-ed87-47f7-b392-7d8fe8388279\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.775895 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-db-sync-config-data\") pod \"d7252936-ed87-47f7-b392-7d8fe8388279\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.775925 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-combined-ca-bundle\") pod \"d7252936-ed87-47f7-b392-7d8fe8388279\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.775960 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/d7252936-ed87-47f7-b392-7d8fe8388279-etc-machine-id\") pod \"d7252936-ed87-47f7-b392-7d8fe8388279\" (UID: \"d7252936-ed87-47f7-b392-7d8fe8388279\") " Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.776575 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d7252936-ed87-47f7-b392-7d8fe8388279-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d7252936-ed87-47f7-b392-7d8fe8388279" (UID: "d7252936-ed87-47f7-b392-7d8fe8388279"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.780918 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d7252936-ed87-47f7-b392-7d8fe8388279" (UID: "d7252936-ed87-47f7-b392-7d8fe8388279"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.782044 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-scripts" (OuterVolumeSpecName: "scripts") pod "d7252936-ed87-47f7-b392-7d8fe8388279" (UID: "d7252936-ed87-47f7-b392-7d8fe8388279"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.782342 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7252936-ed87-47f7-b392-7d8fe8388279-kube-api-access-qj8s8" (OuterVolumeSpecName: "kube-api-access-qj8s8") pod "d7252936-ed87-47f7-b392-7d8fe8388279" (UID: "d7252936-ed87-47f7-b392-7d8fe8388279"). InnerVolumeSpecName "kube-api-access-qj8s8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.825346 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7252936-ed87-47f7-b392-7d8fe8388279" (UID: "d7252936-ed87-47f7-b392-7d8fe8388279"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.846617 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-config-data" (OuterVolumeSpecName: "config-data") pod "d7252936-ed87-47f7-b392-7d8fe8388279" (UID: "d7252936-ed87-47f7-b392-7d8fe8388279"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.881706 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj8s8\" (UniqueName: \"kubernetes.io/projected/d7252936-ed87-47f7-b392-7d8fe8388279-kube-api-access-qj8s8\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.881745 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.881757 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.881767 4878 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-db-sync-config-data\") on node \"crc\" DevicePath \"\"" 
Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.881778 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7252936-ed87-47f7-b392-7d8fe8388279-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.881787 4878 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7252936-ed87-47f7-b392-7d8fe8388279-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:28 crc kubenswrapper[4878]: I1202 18:38:28.956002 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f5ffe5f-5be4-4103-864e-56d17ac72a2d" path="/var/lib/kubelet/pods/3f5ffe5f-5be4-4103-864e-56d17ac72a2d/volumes" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.069282 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6qqjx" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.069319 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6qqjx" event={"ID":"d7252936-ed87-47f7-b392-7d8fe8388279","Type":"ContainerDied","Data":"9a58340230eea76d23d52a489584d0ccb67c644ac84b3e80ddc20343bf8a6867"} Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.069455 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a58340230eea76d23d52a489584d0ccb67c644ac84b3e80ddc20343bf8a6867" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.071645 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" event={"ID":"9aca3209-3fd0-41cb-8bf6-5fe729d0547a","Type":"ContainerStarted","Data":"ab5619e25538892a2d9db40958a4308c048724727ce02a875232589d935881f5"} Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.074474 4878 generic.go:334] "Generic (PLEG): container finished" podID="09c6c49a-fb99-4649-9a4e-a290070d77e7" 
containerID="272226cef86fad277da895ef1012a6483832788598d7f3469ada7ff9abd42f49" exitCode=143 Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.074555 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69687b58c4-bbvhw" event={"ID":"09c6c49a-fb99-4649-9a4e-a290070d77e7","Type":"ContainerDied","Data":"272226cef86fad277da895ef1012a6483832788598d7f3469ada7ff9abd42f49"} Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.074703 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-59qgr" podUID="ba4a8419-5172-47bf-a836-b88e734e919b" containerName="registry-server" containerID="cri-o://962cbf614f06a7f72cda21e5e20fc8673de1d36c0eba601c35a6237c3b1b8a6a" gracePeriod=2 Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.097062 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" podStartSLOduration=4.097045261 podStartE2EDuration="4.097045261s" podCreationTimestamp="2025-12-02 18:38:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:38:29.092380807 +0000 UTC m=+1418.781999698" watchObservedRunningTime="2025-12-02 18:38:29.097045261 +0000 UTC m=+1418.786664142" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.335293 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 18:38:29 crc kubenswrapper[4878]: E1202 18:38:29.336388 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f5ffe5f-5be4-4103-864e-56d17ac72a2d" containerName="neutron-httpd" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.336412 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f5ffe5f-5be4-4103-864e-56d17ac72a2d" containerName="neutron-httpd" Dec 02 18:38:29 crc kubenswrapper[4878]: E1202 18:38:29.336445 4878 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="3f5ffe5f-5be4-4103-864e-56d17ac72a2d" containerName="neutron-api" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.336457 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f5ffe5f-5be4-4103-864e-56d17ac72a2d" containerName="neutron-api" Dec 02 18:38:29 crc kubenswrapper[4878]: E1202 18:38:29.336503 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7252936-ed87-47f7-b392-7d8fe8388279" containerName="cinder-db-sync" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.336513 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7252936-ed87-47f7-b392-7d8fe8388279" containerName="cinder-db-sync" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.336777 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f5ffe5f-5be4-4103-864e-56d17ac72a2d" containerName="neutron-api" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.336804 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f5ffe5f-5be4-4103-864e-56d17ac72a2d" containerName="neutron-httpd" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.336830 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7252936-ed87-47f7-b392-7d8fe8388279" containerName="cinder-db-sync" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.338533 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.350404 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.350845 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.350978 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.351064 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tdkk4" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.381497 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.397192 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.397334 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.397410 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.397481 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-config-data\") pod \"cinder-scheduler-0\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.397563 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzt86\" (UniqueName: \"kubernetes.io/projected/7b3db6e3-7437-4bea-8d93-53b8586cac40-kube-api-access-nzt86\") pod \"cinder-scheduler-0\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.397607 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b3db6e3-7437-4bea-8d93-53b8586cac40-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.493670 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64667c4f57-zgjd6"] Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.499227 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzt86\" (UniqueName: \"kubernetes.io/projected/7b3db6e3-7437-4bea-8d93-53b8586cac40-kube-api-access-nzt86\") pod \"cinder-scheduler-0\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.499326 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/7b3db6e3-7437-4bea-8d93-53b8586cac40-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.499397 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.499456 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.499513 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-scripts\") pod \"cinder-scheduler-0\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.499579 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-config-data\") pod \"cinder-scheduler-0\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.508008 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " 
pod="openstack/cinder-scheduler-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.508594 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b3db6e3-7437-4bea-8d93-53b8586cac40-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.521362 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-config-data\") pod \"cinder-scheduler-0\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.521797 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.522710 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-scripts\") pod \"cinder-scheduler-0\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.565791 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzt86\" (UniqueName: \"kubernetes.io/projected/7b3db6e3-7437-4bea-8d93-53b8586cac40-kube-api-access-nzt86\") pod \"cinder-scheduler-0\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.646320 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65bc8f75b9-pjhzn"] Dec 
02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.649667 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.706373 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65bc8f75b9-pjhzn"] Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.707939 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-ovsdbserver-nb\") pod \"dnsmasq-dns-65bc8f75b9-pjhzn\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.707979 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-config\") pod \"dnsmasq-dns-65bc8f75b9-pjhzn\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.708035 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-dns-swift-storage-0\") pod \"dnsmasq-dns-65bc8f75b9-pjhzn\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.708055 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-dns-svc\") pod \"dnsmasq-dns-65bc8f75b9-pjhzn\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:29 crc kubenswrapper[4878]: 
I1202 18:38:29.708088 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn82b\" (UniqueName: \"kubernetes.io/projected/f6e18d8b-bd8c-420d-b9af-84762e4c808e-kube-api-access-mn82b\") pod \"dnsmasq-dns-65bc8f75b9-pjhzn\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.708113 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-ovsdbserver-sb\") pod \"dnsmasq-dns-65bc8f75b9-pjhzn\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.730933 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.810871 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-ovsdbserver-nb\") pod \"dnsmasq-dns-65bc8f75b9-pjhzn\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.810937 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-config\") pod \"dnsmasq-dns-65bc8f75b9-pjhzn\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.810981 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-dns-swift-storage-0\") pod 
\"dnsmasq-dns-65bc8f75b9-pjhzn\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.811000 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-dns-svc\") pod \"dnsmasq-dns-65bc8f75b9-pjhzn\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.811028 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn82b\" (UniqueName: \"kubernetes.io/projected/f6e18d8b-bd8c-420d-b9af-84762e4c808e-kube-api-access-mn82b\") pod \"dnsmasq-dns-65bc8f75b9-pjhzn\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.811051 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-ovsdbserver-sb\") pod \"dnsmasq-dns-65bc8f75b9-pjhzn\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.812074 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-ovsdbserver-sb\") pod \"dnsmasq-dns-65bc8f75b9-pjhzn\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.812613 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-ovsdbserver-nb\") pod \"dnsmasq-dns-65bc8f75b9-pjhzn\" (UID: 
\"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.813159 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-config\") pod \"dnsmasq-dns-65bc8f75b9-pjhzn\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.813680 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-dns-swift-storage-0\") pod \"dnsmasq-dns-65bc8f75b9-pjhzn\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.814177 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-dns-svc\") pod \"dnsmasq-dns-65bc8f75b9-pjhzn\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.885505 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.894597 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.907382 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn82b\" (UniqueName: \"kubernetes.io/projected/f6e18d8b-bd8c-420d-b9af-84762e4c808e-kube-api-access-mn82b\") pod \"dnsmasq-dns-65bc8f75b9-pjhzn\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.907767 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 02 18:38:29 crc kubenswrapper[4878]: I1202 18:38:29.947905 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.018468 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.019201 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-config-data-custom\") pod \"cinder-api-0\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.019304 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk75j\" (UniqueName: \"kubernetes.io/projected/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-kube-api-access-zk75j\") pod \"cinder-api-0\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.019378 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-logs\") pod \"cinder-api-0\" 
(UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.019478 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-config-data\") pod \"cinder-api-0\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.019559 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.019632 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-scripts\") pod \"cinder-api-0\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.021272 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.063213 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-59qgr" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.136232 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba4a8419-5172-47bf-a836-b88e734e919b-utilities\") pod \"ba4a8419-5172-47bf-a836-b88e734e919b\" (UID: \"ba4a8419-5172-47bf-a836-b88e734e919b\") " Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.136773 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt4n8\" (UniqueName: \"kubernetes.io/projected/ba4a8419-5172-47bf-a836-b88e734e919b-kube-api-access-kt4n8\") pod \"ba4a8419-5172-47bf-a836-b88e734e919b\" (UID: \"ba4a8419-5172-47bf-a836-b88e734e919b\") " Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.136939 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba4a8419-5172-47bf-a836-b88e734e919b-catalog-content\") pod \"ba4a8419-5172-47bf-a836-b88e734e919b\" (UID: \"ba4a8419-5172-47bf-a836-b88e734e919b\") " Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.137433 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-config-data-custom\") pod \"cinder-api-0\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.137472 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk75j\" (UniqueName: \"kubernetes.io/projected/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-kube-api-access-zk75j\") pod \"cinder-api-0\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.137507 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-logs\") pod \"cinder-api-0\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.137579 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-config-data\") pod \"cinder-api-0\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.137598 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.137629 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-scripts\") pod \"cinder-api-0\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.137756 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.137996 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" 
Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.139512 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-logs\") pod \"cinder-api-0\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.141151 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba4a8419-5172-47bf-a836-b88e734e919b-utilities" (OuterVolumeSpecName: "utilities") pod "ba4a8419-5172-47bf-a836-b88e734e919b" (UID: "ba4a8419-5172-47bf-a836-b88e734e919b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.163216 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-config-data-custom\") pod \"cinder-api-0\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.163560 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-scripts\") pod \"cinder-api-0\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.164103 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.164633 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ba4a8419-5172-47bf-a836-b88e734e919b-kube-api-access-kt4n8" (OuterVolumeSpecName: "kube-api-access-kt4n8") pod "ba4a8419-5172-47bf-a836-b88e734e919b" (UID: "ba4a8419-5172-47bf-a836-b88e734e919b"). InnerVolumeSpecName "kube-api-access-kt4n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.164696 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-config-data\") pod \"cinder-api-0\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.206818 4878 generic.go:334] "Generic (PLEG): container finished" podID="ba4a8419-5172-47bf-a836-b88e734e919b" containerID="962cbf614f06a7f72cda21e5e20fc8673de1d36c0eba601c35a6237c3b1b8a6a" exitCode=0 Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.207252 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-59qgr" event={"ID":"ba4a8419-5172-47bf-a836-b88e734e919b","Type":"ContainerDied","Data":"962cbf614f06a7f72cda21e5e20fc8673de1d36c0eba601c35a6237c3b1b8a6a"} Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.207316 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-59qgr" event={"ID":"ba4a8419-5172-47bf-a836-b88e734e919b","Type":"ContainerDied","Data":"74c717657858c80380ba3fd2c7f3922a0a643269a490c42d7c86d621e3b37915"} Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.207368 4878 scope.go:117] "RemoveContainer" containerID="962cbf614f06a7f72cda21e5e20fc8673de1d36c0eba601c35a6237c3b1b8a6a" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.208859 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-59qgr" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.211125 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk75j\" (UniqueName: \"kubernetes.io/projected/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-kube-api-access-zk75j\") pod \"cinder-api-0\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.211196 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.263034 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba4a8419-5172-47bf-a836-b88e734e919b-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.263363 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt4n8\" (UniqueName: \"kubernetes.io/projected/ba4a8419-5172-47bf-a836-b88e734e919b-kube-api-access-kt4n8\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.267188 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba4a8419-5172-47bf-a836-b88e734e919b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba4a8419-5172-47bf-a836-b88e734e919b" (UID: "ba4a8419-5172-47bf-a836-b88e734e919b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.278530 4878 scope.go:117] "RemoveContainer" containerID="c377fa2779ac3fdf18155ea193548d81162747a390e6d2aee86e829be4adeb60" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.308911 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.353968 4878 scope.go:117] "RemoveContainer" containerID="f6a3d190e80818be1dd5f2dd65ddaf1f87cafa16ce13243f0e33cb2f3d8a68ba" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.370985 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba4a8419-5172-47bf-a836-b88e734e919b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.450422 4878 scope.go:117] "RemoveContainer" containerID="962cbf614f06a7f72cda21e5e20fc8673de1d36c0eba601c35a6237c3b1b8a6a" Dec 02 18:38:30 crc kubenswrapper[4878]: E1202 18:38:30.452078 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"962cbf614f06a7f72cda21e5e20fc8673de1d36c0eba601c35a6237c3b1b8a6a\": container with ID starting with 962cbf614f06a7f72cda21e5e20fc8673de1d36c0eba601c35a6237c3b1b8a6a not found: ID does not exist" containerID="962cbf614f06a7f72cda21e5e20fc8673de1d36c0eba601c35a6237c3b1b8a6a" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.452133 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"962cbf614f06a7f72cda21e5e20fc8673de1d36c0eba601c35a6237c3b1b8a6a"} err="failed to get container status \"962cbf614f06a7f72cda21e5e20fc8673de1d36c0eba601c35a6237c3b1b8a6a\": rpc error: code = NotFound desc = could not find container \"962cbf614f06a7f72cda21e5e20fc8673de1d36c0eba601c35a6237c3b1b8a6a\": container with ID starting with 962cbf614f06a7f72cda21e5e20fc8673de1d36c0eba601c35a6237c3b1b8a6a not found: ID does not exist" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.452164 4878 scope.go:117] "RemoveContainer" containerID="c377fa2779ac3fdf18155ea193548d81162747a390e6d2aee86e829be4adeb60" Dec 02 18:38:30 crc kubenswrapper[4878]: E1202 18:38:30.454716 4878 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c377fa2779ac3fdf18155ea193548d81162747a390e6d2aee86e829be4adeb60\": container with ID starting with c377fa2779ac3fdf18155ea193548d81162747a390e6d2aee86e829be4adeb60 not found: ID does not exist" containerID="c377fa2779ac3fdf18155ea193548d81162747a390e6d2aee86e829be4adeb60" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.454766 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c377fa2779ac3fdf18155ea193548d81162747a390e6d2aee86e829be4adeb60"} err="failed to get container status \"c377fa2779ac3fdf18155ea193548d81162747a390e6d2aee86e829be4adeb60\": rpc error: code = NotFound desc = could not find container \"c377fa2779ac3fdf18155ea193548d81162747a390e6d2aee86e829be4adeb60\": container with ID starting with c377fa2779ac3fdf18155ea193548d81162747a390e6d2aee86e829be4adeb60 not found: ID does not exist" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.454799 4878 scope.go:117] "RemoveContainer" containerID="f6a3d190e80818be1dd5f2dd65ddaf1f87cafa16ce13243f0e33cb2f3d8a68ba" Dec 02 18:38:30 crc kubenswrapper[4878]: E1202 18:38:30.455293 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6a3d190e80818be1dd5f2dd65ddaf1f87cafa16ce13243f0e33cb2f3d8a68ba\": container with ID starting with f6a3d190e80818be1dd5f2dd65ddaf1f87cafa16ce13243f0e33cb2f3d8a68ba not found: ID does not exist" containerID="f6a3d190e80818be1dd5f2dd65ddaf1f87cafa16ce13243f0e33cb2f3d8a68ba" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.455327 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6a3d190e80818be1dd5f2dd65ddaf1f87cafa16ce13243f0e33cb2f3d8a68ba"} err="failed to get container status \"f6a3d190e80818be1dd5f2dd65ddaf1f87cafa16ce13243f0e33cb2f3d8a68ba\": rpc error: code = NotFound desc = could 
not find container \"f6a3d190e80818be1dd5f2dd65ddaf1f87cafa16ce13243f0e33cb2f3d8a68ba\": container with ID starting with f6a3d190e80818be1dd5f2dd65ddaf1f87cafa16ce13243f0e33cb2f3d8a68ba not found: ID does not exist" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.531904 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 18:38:30 crc kubenswrapper[4878]: W1202 18:38:30.533449 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b3db6e3_7437_4bea_8d93_53b8586cac40.slice/crio-5f3bd674799ea7de6404941d34582f022c70c3700ba1a1136c521431666b54f0 WatchSource:0}: Error finding container 5f3bd674799ea7de6404941d34582f022c70c3700ba1a1136c521431666b54f0: Status 404 returned error can't find the container with id 5f3bd674799ea7de6404941d34582f022c70c3700ba1a1136c521431666b54f0 Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.575921 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-59qgr"] Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.591888 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-59qgr"] Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.776211 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65bc8f75b9-pjhzn"] Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.986614 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba4a8419-5172-47bf-a836-b88e734e919b" path="/var/lib/kubelet/pods/ba4a8419-5172-47bf-a836-b88e734e919b/volumes" Dec 02 18:38:30 crc kubenswrapper[4878]: I1202 18:38:30.987691 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 18:38:31 crc kubenswrapper[4878]: I1202 18:38:31.225685 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2","Type":"ContainerStarted","Data":"79a7f02e3615c8aa5d12d13f35e60bd33dc9fe7fc65bec3de73dc4c1aa689470"} Dec 02 18:38:31 crc kubenswrapper[4878]: I1202 18:38:31.231847 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7b3db6e3-7437-4bea-8d93-53b8586cac40","Type":"ContainerStarted","Data":"5f3bd674799ea7de6404941d34582f022c70c3700ba1a1136c521431666b54f0"} Dec 02 18:38:31 crc kubenswrapper[4878]: I1202 18:38:31.234062 4878 generic.go:334] "Generic (PLEG): container finished" podID="f6e18d8b-bd8c-420d-b9af-84762e4c808e" containerID="af3b53b1de0e8858946297effb75fa04788e62d8f5eb022e30a201dda278de24" exitCode=0 Dec 02 18:38:31 crc kubenswrapper[4878]: I1202 18:38:31.234297 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" podUID="9aca3209-3fd0-41cb-8bf6-5fe729d0547a" containerName="dnsmasq-dns" containerID="cri-o://ab5619e25538892a2d9db40958a4308c048724727ce02a875232589d935881f5" gracePeriod=10 Dec 02 18:38:31 crc kubenswrapper[4878]: I1202 18:38:31.234416 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" event={"ID":"f6e18d8b-bd8c-420d-b9af-84762e4c808e","Type":"ContainerDied","Data":"af3b53b1de0e8858946297effb75fa04788e62d8f5eb022e30a201dda278de24"} Dec 02 18:38:31 crc kubenswrapper[4878]: I1202 18:38:31.234446 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" event={"ID":"f6e18d8b-bd8c-420d-b9af-84762e4c808e","Type":"ContainerStarted","Data":"f96c934d53780d406ec0d05f3ee6f37e6ae09640edf7c92acacc4a3e6289c4d3"} Dec 02 18:38:31 crc kubenswrapper[4878]: I1202 18:38:31.493458 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-69687b58c4-bbvhw" podUID="09c6c49a-fb99-4649-9a4e-a290070d77e7" containerName="barbican-api-log" probeResult="failure" output="Get 
\"http://10.217.0.193:9311/healthcheck\": read tcp 10.217.0.2:56374->10.217.0.193:9311: read: connection reset by peer" Dec 02 18:38:31 crc kubenswrapper[4878]: I1202 18:38:31.493470 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-69687b58c4-bbvhw" podUID="09c6c49a-fb99-4649-9a4e-a290070d77e7" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.193:9311/healthcheck\": read tcp 10.217.0.2:56388->10.217.0.193:9311: read: connection reset by peer" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.161526 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.246195 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb46x\" (UniqueName: \"kubernetes.io/projected/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-kube-api-access-cb46x\") pod \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.246285 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-dns-swift-storage-0\") pod \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.246366 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-config\") pod \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.246418 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-ovsdbserver-nb\") pod \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.246455 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-ovsdbserver-sb\") pod \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.246548 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-dns-svc\") pod \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\" (UID: \"9aca3209-3fd0-41cb-8bf6-5fe729d0547a\") " Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.273593 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-kube-api-access-cb46x" (OuterVolumeSpecName: "kube-api-access-cb46x") pod "9aca3209-3fd0-41cb-8bf6-5fe729d0547a" (UID: "9aca3209-3fd0-41cb-8bf6-5fe729d0547a"). InnerVolumeSpecName "kube-api-access-cb46x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.273993 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.320501 4878 generic.go:334] "Generic (PLEG): container finished" podID="09c6c49a-fb99-4649-9a4e-a290070d77e7" containerID="0e9175d8ee94260782e8e953950083dfb524d40c3b7f64936234f48001f85ff9" exitCode=0 Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.320841 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69687b58c4-bbvhw" event={"ID":"09c6c49a-fb99-4649-9a4e-a290070d77e7","Type":"ContainerDied","Data":"0e9175d8ee94260782e8e953950083dfb524d40c3b7f64936234f48001f85ff9"} Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.320872 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69687b58c4-bbvhw" event={"ID":"09c6c49a-fb99-4649-9a4e-a290070d77e7","Type":"ContainerDied","Data":"8f7c0523d134294a19cbace6c37fdbea5577f28bb7167abf78da01a71cc2bf36"} Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.320890 4878 scope.go:117] "RemoveContainer" containerID="0e9175d8ee94260782e8e953950083dfb524d40c3b7f64936234f48001f85ff9" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.321009 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-69687b58c4-bbvhw" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.326032 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9aca3209-3fd0-41cb-8bf6-5fe729d0547a" (UID: "9aca3209-3fd0-41cb-8bf6-5fe729d0547a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.339927 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" event={"ID":"f6e18d8b-bd8c-420d-b9af-84762e4c808e","Type":"ContainerStarted","Data":"b78338e54cfc32f357a9923a1bc152544efe7935d8a007575cadba7c56cc8501"} Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.342288 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.350740 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09c6c49a-fb99-4649-9a4e-a290070d77e7-logs\") pod \"09c6c49a-fb99-4649-9a4e-a290070d77e7\" (UID: \"09c6c49a-fb99-4649-9a4e-a290070d77e7\") " Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.350984 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09c6c49a-fb99-4649-9a4e-a290070d77e7-combined-ca-bundle\") pod \"09c6c49a-fb99-4649-9a4e-a290070d77e7\" (UID: \"09c6c49a-fb99-4649-9a4e-a290070d77e7\") " Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.351221 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09c6c49a-fb99-4649-9a4e-a290070d77e7-config-data\") pod \"09c6c49a-fb99-4649-9a4e-a290070d77e7\" (UID: \"09c6c49a-fb99-4649-9a4e-a290070d77e7\") " Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.351322 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09c6c49a-fb99-4649-9a4e-a290070d77e7-config-data-custom\") pod \"09c6c49a-fb99-4649-9a4e-a290070d77e7\" (UID: \"09c6c49a-fb99-4649-9a4e-a290070d77e7\") " Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 
18:38:32.351394 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t25jh\" (UniqueName: \"kubernetes.io/projected/09c6c49a-fb99-4649-9a4e-a290070d77e7-kube-api-access-t25jh\") pod \"09c6c49a-fb99-4649-9a4e-a290070d77e7\" (UID: \"09c6c49a-fb99-4649-9a4e-a290070d77e7\") " Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.352450 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb46x\" (UniqueName: \"kubernetes.io/projected/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-kube-api-access-cb46x\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.352465 4878 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.357717 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09c6c49a-fb99-4649-9a4e-a290070d77e7-kube-api-access-t25jh" (OuterVolumeSpecName: "kube-api-access-t25jh") pod "09c6c49a-fb99-4649-9a4e-a290070d77e7" (UID: "09c6c49a-fb99-4649-9a4e-a290070d77e7"). InnerVolumeSpecName "kube-api-access-t25jh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.362731 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09c6c49a-fb99-4649-9a4e-a290070d77e7-logs" (OuterVolumeSpecName: "logs") pod "09c6c49a-fb99-4649-9a4e-a290070d77e7" (UID: "09c6c49a-fb99-4649-9a4e-a290070d77e7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.368357 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2","Type":"ContainerStarted","Data":"012af46cb4cf80ac01f3246be1ef951fca29ae8b43af2d1d0d615ba1ce18258b"} Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.381298 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" podStartSLOduration=3.381278038 podStartE2EDuration="3.381278038s" podCreationTimestamp="2025-12-02 18:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:38:32.367588414 +0000 UTC m=+1422.057207295" watchObservedRunningTime="2025-12-02 18:38:32.381278038 +0000 UTC m=+1422.070896919" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.393582 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09c6c49a-fb99-4649-9a4e-a290070d77e7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "09c6c49a-fb99-4649-9a4e-a290070d77e7" (UID: "09c6c49a-fb99-4649-9a4e-a290070d77e7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.396710 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9aca3209-3fd0-41cb-8bf6-5fe729d0547a" (UID: "9aca3209-3fd0-41cb-8bf6-5fe729d0547a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.403097 4878 generic.go:334] "Generic (PLEG): container finished" podID="9aca3209-3fd0-41cb-8bf6-5fe729d0547a" containerID="ab5619e25538892a2d9db40958a4308c048724727ce02a875232589d935881f5" exitCode=0 Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.403166 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" event={"ID":"9aca3209-3fd0-41cb-8bf6-5fe729d0547a","Type":"ContainerDied","Data":"ab5619e25538892a2d9db40958a4308c048724727ce02a875232589d935881f5"} Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.403199 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" event={"ID":"9aca3209-3fd0-41cb-8bf6-5fe729d0547a","Type":"ContainerDied","Data":"65d24f5b84fc2da61e46c3e0e84e4bc9c3eb3ef2d7cde58a4f3fba67febcc93f"} Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.403293 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64667c4f57-zgjd6" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.418360 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9aca3209-3fd0-41cb-8bf6-5fe729d0547a" (UID: "9aca3209-3fd0-41cb-8bf6-5fe729d0547a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.441000 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9aca3209-3fd0-41cb-8bf6-5fe729d0547a" (UID: "9aca3209-3fd0-41cb-8bf6-5fe729d0547a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.446875 4878 scope.go:117] "RemoveContainer" containerID="272226cef86fad277da895ef1012a6483832788598d7f3469ada7ff9abd42f49" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.455455 4878 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09c6c49a-fb99-4649-9a4e-a290070d77e7-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.455480 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t25jh\" (UniqueName: \"kubernetes.io/projected/09c6c49a-fb99-4649-9a4e-a290070d77e7-kube-api-access-t25jh\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.455492 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.455501 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.455510 4878 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09c6c49a-fb99-4649-9a4e-a290070d77e7-logs\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.455541 4878 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.459618 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/09c6c49a-fb99-4649-9a4e-a290070d77e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09c6c49a-fb99-4649-9a4e-a290070d77e7" (UID: "09c6c49a-fb99-4649-9a4e-a290070d77e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.485198 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09c6c49a-fb99-4649-9a4e-a290070d77e7-config-data" (OuterVolumeSpecName: "config-data") pod "09c6c49a-fb99-4649-9a4e-a290070d77e7" (UID: "09c6c49a-fb99-4649-9a4e-a290070d77e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.491002 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-config" (OuterVolumeSpecName: "config") pod "9aca3209-3fd0-41cb-8bf6-5fe729d0547a" (UID: "9aca3209-3fd0-41cb-8bf6-5fe729d0547a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.553696 4878 scope.go:117] "RemoveContainer" containerID="0e9175d8ee94260782e8e953950083dfb524d40c3b7f64936234f48001f85ff9" Dec 02 18:38:32 crc kubenswrapper[4878]: E1202 18:38:32.554230 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e9175d8ee94260782e8e953950083dfb524d40c3b7f64936234f48001f85ff9\": container with ID starting with 0e9175d8ee94260782e8e953950083dfb524d40c3b7f64936234f48001f85ff9 not found: ID does not exist" containerID="0e9175d8ee94260782e8e953950083dfb524d40c3b7f64936234f48001f85ff9" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.554287 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e9175d8ee94260782e8e953950083dfb524d40c3b7f64936234f48001f85ff9"} err="failed to get container status \"0e9175d8ee94260782e8e953950083dfb524d40c3b7f64936234f48001f85ff9\": rpc error: code = NotFound desc = could not find container \"0e9175d8ee94260782e8e953950083dfb524d40c3b7f64936234f48001f85ff9\": container with ID starting with 0e9175d8ee94260782e8e953950083dfb524d40c3b7f64936234f48001f85ff9 not found: ID does not exist" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.555852 4878 scope.go:117] "RemoveContainer" containerID="272226cef86fad277da895ef1012a6483832788598d7f3469ada7ff9abd42f49" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.559268 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09c6c49a-fb99-4649-9a4e-a290070d77e7-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.559314 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aca3209-3fd0-41cb-8bf6-5fe729d0547a-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:32 
crc kubenswrapper[4878]: I1202 18:38:32.559325 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09c6c49a-fb99-4649-9a4e-a290070d77e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:32 crc kubenswrapper[4878]: E1202 18:38:32.559942 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"272226cef86fad277da895ef1012a6483832788598d7f3469ada7ff9abd42f49\": container with ID starting with 272226cef86fad277da895ef1012a6483832788598d7f3469ada7ff9abd42f49 not found: ID does not exist" containerID="272226cef86fad277da895ef1012a6483832788598d7f3469ada7ff9abd42f49" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.560006 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"272226cef86fad277da895ef1012a6483832788598d7f3469ada7ff9abd42f49"} err="failed to get container status \"272226cef86fad277da895ef1012a6483832788598d7f3469ada7ff9abd42f49\": rpc error: code = NotFound desc = could not find container \"272226cef86fad277da895ef1012a6483832788598d7f3469ada7ff9abd42f49\": container with ID starting with 272226cef86fad277da895ef1012a6483832788598d7f3469ada7ff9abd42f49 not found: ID does not exist" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.560045 4878 scope.go:117] "RemoveContainer" containerID="ab5619e25538892a2d9db40958a4308c048724727ce02a875232589d935881f5" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.612678 4878 scope.go:117] "RemoveContainer" containerID="60430abc92b052e2774f1619fe0cef30870571e0b0864eebef93b01fe57ec0c8" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.751533 4878 scope.go:117] "RemoveContainer" containerID="ab5619e25538892a2d9db40958a4308c048724727ce02a875232589d935881f5" Dec 02 18:38:32 crc kubenswrapper[4878]: E1202 18:38:32.760093 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"ab5619e25538892a2d9db40958a4308c048724727ce02a875232589d935881f5\": container with ID starting with ab5619e25538892a2d9db40958a4308c048724727ce02a875232589d935881f5 not found: ID does not exist" containerID="ab5619e25538892a2d9db40958a4308c048724727ce02a875232589d935881f5" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.760152 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab5619e25538892a2d9db40958a4308c048724727ce02a875232589d935881f5"} err="failed to get container status \"ab5619e25538892a2d9db40958a4308c048724727ce02a875232589d935881f5\": rpc error: code = NotFound desc = could not find container \"ab5619e25538892a2d9db40958a4308c048724727ce02a875232589d935881f5\": container with ID starting with ab5619e25538892a2d9db40958a4308c048724727ce02a875232589d935881f5 not found: ID does not exist" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.760217 4878 scope.go:117] "RemoveContainer" containerID="60430abc92b052e2774f1619fe0cef30870571e0b0864eebef93b01fe57ec0c8" Dec 02 18:38:32 crc kubenswrapper[4878]: E1202 18:38:32.761390 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60430abc92b052e2774f1619fe0cef30870571e0b0864eebef93b01fe57ec0c8\": container with ID starting with 60430abc92b052e2774f1619fe0cef30870571e0b0864eebef93b01fe57ec0c8 not found: ID does not exist" containerID="60430abc92b052e2774f1619fe0cef30870571e0b0864eebef93b01fe57ec0c8" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.762373 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60430abc92b052e2774f1619fe0cef30870571e0b0864eebef93b01fe57ec0c8"} err="failed to get container status \"60430abc92b052e2774f1619fe0cef30870571e0b0864eebef93b01fe57ec0c8\": rpc error: code = NotFound desc = could not find container 
\"60430abc92b052e2774f1619fe0cef30870571e0b0864eebef93b01fe57ec0c8\": container with ID starting with 60430abc92b052e2774f1619fe0cef30870571e0b0864eebef93b01fe57ec0c8 not found: ID does not exist" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.810378 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-69687b58c4-bbvhw"] Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.824302 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-69687b58c4-bbvhw"] Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.837075 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64667c4f57-zgjd6"] Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.847011 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64667c4f57-zgjd6"] Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.958706 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09c6c49a-fb99-4649-9a4e-a290070d77e7" path="/var/lib/kubelet/pods/09c6c49a-fb99-4649-9a4e-a290070d77e7/volumes" Dec 02 18:38:32 crc kubenswrapper[4878]: I1202 18:38:32.960812 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aca3209-3fd0-41cb-8bf6-5fe729d0547a" path="/var/lib/kubelet/pods/9aca3209-3fd0-41cb-8bf6-5fe729d0547a/volumes" Dec 02 18:38:33 crc kubenswrapper[4878]: I1202 18:38:33.142914 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 18:38:33 crc kubenswrapper[4878]: I1202 18:38:33.428732 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7b3db6e3-7437-4bea-8d93-53b8586cac40","Type":"ContainerStarted","Data":"990a360ba1eba09e51600934b2eded7ec476f6fa9a887f2e5fc0158a8a29e396"} Dec 02 18:38:33 crc kubenswrapper[4878]: I1202 18:38:33.437559 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2","Type":"ContainerStarted","Data":"5cf7c9b881a9e1a08c29e24eed0a7e6cba03891b47943dc09a8d8e0ff333b1de"} Dec 02 18:38:33 crc kubenswrapper[4878]: I1202 18:38:33.437738 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2" containerName="cinder-api-log" containerID="cri-o://012af46cb4cf80ac01f3246be1ef951fca29ae8b43af2d1d0d615ba1ce18258b" gracePeriod=30 Dec 02 18:38:33 crc kubenswrapper[4878]: I1202 18:38:33.437990 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 02 18:38:33 crc kubenswrapper[4878]: I1202 18:38:33.438362 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2" containerName="cinder-api" containerID="cri-o://5cf7c9b881a9e1a08c29e24eed0a7e6cba03891b47943dc09a8d8e0ff333b1de" gracePeriod=30 Dec 02 18:38:33 crc kubenswrapper[4878]: I1202 18:38:33.473024 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.473005823 podStartE2EDuration="4.473005823s" podCreationTimestamp="2025-12-02 18:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:38:33.472715754 +0000 UTC m=+1423.162334635" watchObservedRunningTime="2025-12-02 18:38:33.473005823 +0000 UTC m=+1423.162624704" Dec 02 18:38:34 crc kubenswrapper[4878]: I1202 18:38:34.484149 4878 generic.go:334] "Generic (PLEG): container finished" podID="e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2" containerID="012af46cb4cf80ac01f3246be1ef951fca29ae8b43af2d1d0d615ba1ce18258b" exitCode=143 Dec 02 18:38:34 crc kubenswrapper[4878]: I1202 18:38:34.484552 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2","Type":"ContainerDied","Data":"012af46cb4cf80ac01f3246be1ef951fca29ae8b43af2d1d0d615ba1ce18258b"} Dec 02 18:38:34 crc kubenswrapper[4878]: I1202 18:38:34.492033 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7b3db6e3-7437-4bea-8d93-53b8586cac40","Type":"ContainerStarted","Data":"88e6ae84eaca3525507e67dc0c2d7a57e8c4745c15067ab748fc6326b5ea0a53"} Dec 02 18:38:34 crc kubenswrapper[4878]: I1202 18:38:34.528662 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.422443524 podStartE2EDuration="5.528639059s" podCreationTimestamp="2025-12-02 18:38:29 +0000 UTC" firstStartedPulling="2025-12-02 18:38:30.536205974 +0000 UTC m=+1420.225824855" lastFinishedPulling="2025-12-02 18:38:31.642401509 +0000 UTC m=+1421.332020390" observedRunningTime="2025-12-02 18:38:34.519085913 +0000 UTC m=+1424.208704804" watchObservedRunningTime="2025-12-02 18:38:34.528639059 +0000 UTC m=+1424.218257940" Dec 02 18:38:34 crc kubenswrapper[4878]: I1202 18:38:34.732036 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 02 18:38:34 crc kubenswrapper[4878]: I1202 18:38:34.928033 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:34 crc kubenswrapper[4878]: I1202 18:38:34.965974 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5bbd45c784-zz4hz" Dec 02 18:38:36 crc kubenswrapper[4878]: I1202 18:38:36.376068 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-67bf9d8f54-s7vnk" Dec 02 18:38:36 crc kubenswrapper[4878]: I1202 18:38:36.703576 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2p669" Dec 02 18:38:36 crc 
kubenswrapper[4878]: I1202 18:38:36.810145 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2p669" Dec 02 18:38:37 crc kubenswrapper[4878]: I1202 18:38:37.005588 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2p669"] Dec 02 18:38:37 crc kubenswrapper[4878]: I1202 18:38:37.138791 4878 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod1ac2d806-c2e2-4d10-9975-e4ae079add40"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod1ac2d806-c2e2-4d10-9975-e4ae079add40] : Timed out while waiting for systemd to remove kubepods-besteffort-pod1ac2d806_c2e2_4d10_9975_e4ae079add40.slice" Dec 02 18:38:38 crc kubenswrapper[4878]: I1202 18:38:38.573540 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2p669" podUID="c17b3e4f-ede8-45e3-86df-c2a7813744de" containerName="registry-server" containerID="cri-o://36b8fea47be3af25f707b736f0d4408fdf797adac7e0043bfc6e153d70cec08e" gracePeriod=2 Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.208822 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2p669" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.383744 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c17b3e4f-ede8-45e3-86df-c2a7813744de-catalog-content\") pod \"c17b3e4f-ede8-45e3-86df-c2a7813744de\" (UID: \"c17b3e4f-ede8-45e3-86df-c2a7813744de\") " Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.383875 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c17b3e4f-ede8-45e3-86df-c2a7813744de-utilities\") pod \"c17b3e4f-ede8-45e3-86df-c2a7813744de\" (UID: \"c17b3e4f-ede8-45e3-86df-c2a7813744de\") " Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.383899 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qxtn\" (UniqueName: \"kubernetes.io/projected/c17b3e4f-ede8-45e3-86df-c2a7813744de-kube-api-access-4qxtn\") pod \"c17b3e4f-ede8-45e3-86df-c2a7813744de\" (UID: \"c17b3e4f-ede8-45e3-86df-c2a7813744de\") " Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.384798 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c17b3e4f-ede8-45e3-86df-c2a7813744de-utilities" (OuterVolumeSpecName: "utilities") pod "c17b3e4f-ede8-45e3-86df-c2a7813744de" (UID: "c17b3e4f-ede8-45e3-86df-c2a7813744de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.408536 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c17b3e4f-ede8-45e3-86df-c2a7813744de-kube-api-access-4qxtn" (OuterVolumeSpecName: "kube-api-access-4qxtn") pod "c17b3e4f-ede8-45e3-86df-c2a7813744de" (UID: "c17b3e4f-ede8-45e3-86df-c2a7813744de"). InnerVolumeSpecName "kube-api-access-4qxtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.487685 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c17b3e4f-ede8-45e3-86df-c2a7813744de-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.487740 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qxtn\" (UniqueName: \"kubernetes.io/projected/c17b3e4f-ede8-45e3-86df-c2a7813744de-kube-api-access-4qxtn\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.490676 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c17b3e4f-ede8-45e3-86df-c2a7813744de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c17b3e4f-ede8-45e3-86df-c2a7813744de" (UID: "c17b3e4f-ede8-45e3-86df-c2a7813744de"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.589947 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c17b3e4f-ede8-45e3-86df-c2a7813744de-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.591972 4878 generic.go:334] "Generic (PLEG): container finished" podID="c17b3e4f-ede8-45e3-86df-c2a7813744de" containerID="36b8fea47be3af25f707b736f0d4408fdf797adac7e0043bfc6e153d70cec08e" exitCode=0 Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.592034 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2p669" event={"ID":"c17b3e4f-ede8-45e3-86df-c2a7813744de","Type":"ContainerDied","Data":"36b8fea47be3af25f707b736f0d4408fdf797adac7e0043bfc6e153d70cec08e"} Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.592084 4878 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-2p669" event={"ID":"c17b3e4f-ede8-45e3-86df-c2a7813744de","Type":"ContainerDied","Data":"606db30b5c1854302fc7b101c602b8ea300784fb76fb5986284da3c9359d0a5f"} Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.592097 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2p669" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.592112 4878 scope.go:117] "RemoveContainer" containerID="36b8fea47be3af25f707b736f0d4408fdf797adac7e0043bfc6e153d70cec08e" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.619228 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 02 18:38:39 crc kubenswrapper[4878]: E1202 18:38:39.620627 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09c6c49a-fb99-4649-9a4e-a290070d77e7" containerName="barbican-api" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.620663 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="09c6c49a-fb99-4649-9a4e-a290070d77e7" containerName="barbican-api" Dec 02 18:38:39 crc kubenswrapper[4878]: E1202 18:38:39.620681 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c17b3e4f-ede8-45e3-86df-c2a7813744de" containerName="registry-server" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.620688 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17b3e4f-ede8-45e3-86df-c2a7813744de" containerName="registry-server" Dec 02 18:38:39 crc kubenswrapper[4878]: E1202 18:38:39.620702 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aca3209-3fd0-41cb-8bf6-5fe729d0547a" containerName="init" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.620708 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aca3209-3fd0-41cb-8bf6-5fe729d0547a" containerName="init" Dec 02 18:38:39 crc kubenswrapper[4878]: E1202 18:38:39.620719 4878 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="ba4a8419-5172-47bf-a836-b88e734e919b" containerName="extract-content" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.620725 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba4a8419-5172-47bf-a836-b88e734e919b" containerName="extract-content" Dec 02 18:38:39 crc kubenswrapper[4878]: E1202 18:38:39.620740 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c17b3e4f-ede8-45e3-86df-c2a7813744de" containerName="extract-content" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.620746 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17b3e4f-ede8-45e3-86df-c2a7813744de" containerName="extract-content" Dec 02 18:38:39 crc kubenswrapper[4878]: E1202 18:38:39.620757 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba4a8419-5172-47bf-a836-b88e734e919b" containerName="extract-utilities" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.620764 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba4a8419-5172-47bf-a836-b88e734e919b" containerName="extract-utilities" Dec 02 18:38:39 crc kubenswrapper[4878]: E1202 18:38:39.620779 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba4a8419-5172-47bf-a836-b88e734e919b" containerName="registry-server" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.620785 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba4a8419-5172-47bf-a836-b88e734e919b" containerName="registry-server" Dec 02 18:38:39 crc kubenswrapper[4878]: E1202 18:38:39.620793 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09c6c49a-fb99-4649-9a4e-a290070d77e7" containerName="barbican-api-log" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.620798 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="09c6c49a-fb99-4649-9a4e-a290070d77e7" containerName="barbican-api-log" Dec 02 18:38:39 crc kubenswrapper[4878]: E1202 18:38:39.620809 4878 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="9aca3209-3fd0-41cb-8bf6-5fe729d0547a" containerName="dnsmasq-dns" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.620816 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aca3209-3fd0-41cb-8bf6-5fe729d0547a" containerName="dnsmasq-dns" Dec 02 18:38:39 crc kubenswrapper[4878]: E1202 18:38:39.620827 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c17b3e4f-ede8-45e3-86df-c2a7813744de" containerName="extract-utilities" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.620833 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17b3e4f-ede8-45e3-86df-c2a7813744de" containerName="extract-utilities" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.621074 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="09c6c49a-fb99-4649-9a4e-a290070d77e7" containerName="barbican-api-log" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.621090 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aca3209-3fd0-41cb-8bf6-5fe729d0547a" containerName="dnsmasq-dns" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.621108 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="09c6c49a-fb99-4649-9a4e-a290070d77e7" containerName="barbican-api" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.621118 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="c17b3e4f-ede8-45e3-86df-c2a7813744de" containerName="registry-server" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.621134 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba4a8419-5172-47bf-a836-b88e734e919b" containerName="registry-server" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.637031 4878 scope.go:117] "RemoveContainer" containerID="22c645c8608a270f554bc3f53bfd56a3e4b406329ae18f60ed66704b9eb801d2" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.672061 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/openstackclient"] Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.680441 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2p669"] Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.672413 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.691825 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.692069 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-2s9j9" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.692207 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.705581 4878 scope.go:117] "RemoveContainer" containerID="4a6649b3d4853299ef9df49a694086e1a715d8d5cfd757ab54ad73ab7ac08cf6" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.712955 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2p669"] Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.762440 4878 scope.go:117] "RemoveContainer" containerID="36b8fea47be3af25f707b736f0d4408fdf797adac7e0043bfc6e153d70cec08e" Dec 02 18:38:39 crc kubenswrapper[4878]: E1202 18:38:39.766526 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36b8fea47be3af25f707b736f0d4408fdf797adac7e0043bfc6e153d70cec08e\": container with ID starting with 36b8fea47be3af25f707b736f0d4408fdf797adac7e0043bfc6e153d70cec08e not found: ID does not exist" containerID="36b8fea47be3af25f707b736f0d4408fdf797adac7e0043bfc6e153d70cec08e" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.773779 4878 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36b8fea47be3af25f707b736f0d4408fdf797adac7e0043bfc6e153d70cec08e"} err="failed to get container status \"36b8fea47be3af25f707b736f0d4408fdf797adac7e0043bfc6e153d70cec08e\": rpc error: code = NotFound desc = could not find container \"36b8fea47be3af25f707b736f0d4408fdf797adac7e0043bfc6e153d70cec08e\": container with ID starting with 36b8fea47be3af25f707b736f0d4408fdf797adac7e0043bfc6e153d70cec08e not found: ID does not exist" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.773914 4878 scope.go:117] "RemoveContainer" containerID="22c645c8608a270f554bc3f53bfd56a3e4b406329ae18f60ed66704b9eb801d2" Dec 02 18:38:39 crc kubenswrapper[4878]: E1202 18:38:39.783965 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22c645c8608a270f554bc3f53bfd56a3e4b406329ae18f60ed66704b9eb801d2\": container with ID starting with 22c645c8608a270f554bc3f53bfd56a3e4b406329ae18f60ed66704b9eb801d2 not found: ID does not exist" containerID="22c645c8608a270f554bc3f53bfd56a3e4b406329ae18f60ed66704b9eb801d2" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.784049 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22c645c8608a270f554bc3f53bfd56a3e4b406329ae18f60ed66704b9eb801d2"} err="failed to get container status \"22c645c8608a270f554bc3f53bfd56a3e4b406329ae18f60ed66704b9eb801d2\": rpc error: code = NotFound desc = could not find container \"22c645c8608a270f554bc3f53bfd56a3e4b406329ae18f60ed66704b9eb801d2\": container with ID starting with 22c645c8608a270f554bc3f53bfd56a3e4b406329ae18f60ed66704b9eb801d2 not found: ID does not exist" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.784107 4878 scope.go:117] "RemoveContainer" containerID="4a6649b3d4853299ef9df49a694086e1a715d8d5cfd757ab54ad73ab7ac08cf6" Dec 02 18:38:39 crc kubenswrapper[4878]: E1202 18:38:39.784650 4878 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a6649b3d4853299ef9df49a694086e1a715d8d5cfd757ab54ad73ab7ac08cf6\": container with ID starting with 4a6649b3d4853299ef9df49a694086e1a715d8d5cfd757ab54ad73ab7ac08cf6 not found: ID does not exist" containerID="4a6649b3d4853299ef9df49a694086e1a715d8d5cfd757ab54ad73ab7ac08cf6" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.784703 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a6649b3d4853299ef9df49a694086e1a715d8d5cfd757ab54ad73ab7ac08cf6"} err="failed to get container status \"4a6649b3d4853299ef9df49a694086e1a715d8d5cfd757ab54ad73ab7ac08cf6\": rpc error: code = NotFound desc = could not find container \"4a6649b3d4853299ef9df49a694086e1a715d8d5cfd757ab54ad73ab7ac08cf6\": container with ID starting with 4a6649b3d4853299ef9df49a694086e1a715d8d5cfd757ab54ad73ab7ac08cf6 not found: ID does not exist" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.806360 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-openstack-config\") pod \"openstackclient\" (UID: \"3eaf4f04-dc97-4e86-bc56-437da55a5cd1\") " pod="openstack/openstackclient" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.806434 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3eaf4f04-dc97-4e86-bc56-437da55a5cd1\") " pod="openstack/openstackclient" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.806506 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8clbt\" (UniqueName: 
\"kubernetes.io/projected/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-kube-api-access-8clbt\") pod \"openstackclient\" (UID: \"3eaf4f04-dc97-4e86-bc56-437da55a5cd1\") " pod="openstack/openstackclient" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.806532 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-openstack-config-secret\") pod \"openstackclient\" (UID: \"3eaf4f04-dc97-4e86-bc56-437da55a5cd1\") " pod="openstack/openstackclient" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.851584 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 02 18:38:39 crc kubenswrapper[4878]: E1202 18:38:39.852878 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-8clbt openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="3eaf4f04-dc97-4e86-bc56-437da55a5cd1" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.866441 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.884260 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.886522 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.895413 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.909777 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-openstack-config\") pod \"openstackclient\" (UID: \"3eaf4f04-dc97-4e86-bc56-437da55a5cd1\") " pod="openstack/openstackclient" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.909880 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3eaf4f04-dc97-4e86-bc56-437da55a5cd1\") " pod="openstack/openstackclient" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.909994 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8clbt\" (UniqueName: \"kubernetes.io/projected/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-kube-api-access-8clbt\") pod \"openstackclient\" (UID: \"3eaf4f04-dc97-4e86-bc56-437da55a5cd1\") " pod="openstack/openstackclient" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.910028 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-openstack-config-secret\") pod \"openstackclient\" (UID: \"3eaf4f04-dc97-4e86-bc56-437da55a5cd1\") " pod="openstack/openstackclient" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.913341 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-openstack-config\") pod \"openstackclient\" (UID: 
\"3eaf4f04-dc97-4e86-bc56-437da55a5cd1\") " pod="openstack/openstackclient" Dec 02 18:38:39 crc kubenswrapper[4878]: E1202 18:38:39.915549 4878 projected.go:194] Error preparing data for projected volume kube-api-access-8clbt for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (3eaf4f04-dc97-4e86-bc56-437da55a5cd1) does not match the UID in record. The object might have been deleted and then recreated Dec 02 18:38:39 crc kubenswrapper[4878]: E1202 18:38:39.915631 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-kube-api-access-8clbt podName:3eaf4f04-dc97-4e86-bc56-437da55a5cd1 nodeName:}" failed. No retries permitted until 2025-12-02 18:38:40.415608767 +0000 UTC m=+1430.105227648 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-8clbt" (UniqueName: "kubernetes.io/projected/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-kube-api-access-8clbt") pod "openstackclient" (UID: "3eaf4f04-dc97-4e86-bc56-437da55a5cd1") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (3eaf4f04-dc97-4e86-bc56-437da55a5cd1) does not match the UID in record. 
The object might have been deleted and then recreated Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.926945 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-openstack-config-secret\") pod \"openstackclient\" (UID: \"3eaf4f04-dc97-4e86-bc56-437da55a5cd1\") " pod="openstack/openstackclient" Dec 02 18:38:39 crc kubenswrapper[4878]: I1202 18:38:39.933429 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3eaf4f04-dc97-4e86-bc56-437da55a5cd1\") " pod="openstack/openstackclient" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.014737 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5caa14c8-5110-4246-a7d4-75ef3c6d5d00-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5caa14c8-5110-4246-a7d4-75ef3c6d5d00\") " pod="openstack/openstackclient" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.014847 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsxpq\" (UniqueName: \"kubernetes.io/projected/5caa14c8-5110-4246-a7d4-75ef3c6d5d00-kube-api-access-wsxpq\") pod \"openstackclient\" (UID: \"5caa14c8-5110-4246-a7d4-75ef3c6d5d00\") " pod="openstack/openstackclient" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.014878 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5caa14c8-5110-4246-a7d4-75ef3c6d5d00-openstack-config\") pod \"openstackclient\" (UID: \"5caa14c8-5110-4246-a7d4-75ef3c6d5d00\") " pod="openstack/openstackclient" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 
18:38:40.014944 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5caa14c8-5110-4246-a7d4-75ef3c6d5d00-openstack-config-secret\") pod \"openstackclient\" (UID: \"5caa14c8-5110-4246-a7d4-75ef3c6d5d00\") " pod="openstack/openstackclient" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.019722 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.020390 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.107686 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-8k5pc"] Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.107993 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" podUID="8efbd07a-9303-4d37-8aa6-f660a919d2fd" containerName="dnsmasq-dns" containerID="cri-o://024e0ef1c974f2d743fd83e5813cd9757bc33b2bd4060d4672fc7b8a9ecfc86a" gracePeriod=10 Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.117974 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5caa14c8-5110-4246-a7d4-75ef3c6d5d00-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5caa14c8-5110-4246-a7d4-75ef3c6d5d00\") " pod="openstack/openstackclient" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.118082 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsxpq\" (UniqueName: \"kubernetes.io/projected/5caa14c8-5110-4246-a7d4-75ef3c6d5d00-kube-api-access-wsxpq\") pod \"openstackclient\" (UID: \"5caa14c8-5110-4246-a7d4-75ef3c6d5d00\") " pod="openstack/openstackclient" Dec 02 18:38:40 crc 
kubenswrapper[4878]: I1202 18:38:40.118123 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5caa14c8-5110-4246-a7d4-75ef3c6d5d00-openstack-config\") pod \"openstackclient\" (UID: \"5caa14c8-5110-4246-a7d4-75ef3c6d5d00\") " pod="openstack/openstackclient" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.118224 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5caa14c8-5110-4246-a7d4-75ef3c6d5d00-openstack-config-secret\") pod \"openstackclient\" (UID: \"5caa14c8-5110-4246-a7d4-75ef3c6d5d00\") " pod="openstack/openstackclient" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.123665 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5caa14c8-5110-4246-a7d4-75ef3c6d5d00-openstack-config\") pod \"openstackclient\" (UID: \"5caa14c8-5110-4246-a7d4-75ef3c6d5d00\") " pod="openstack/openstackclient" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.127561 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5caa14c8-5110-4246-a7d4-75ef3c6d5d00-openstack-config-secret\") pod \"openstackclient\" (UID: \"5caa14c8-5110-4246-a7d4-75ef3c6d5d00\") " pod="openstack/openstackclient" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.137408 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5caa14c8-5110-4246-a7d4-75ef3c6d5d00-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5caa14c8-5110-4246-a7d4-75ef3c6d5d00\") " pod="openstack/openstackclient" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.141455 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 18:38:40 crc 
kubenswrapper[4878]: I1202 18:38:40.146451 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsxpq\" (UniqueName: \"kubernetes.io/projected/5caa14c8-5110-4246-a7d4-75ef3c6d5d00-kube-api-access-wsxpq\") pod \"openstackclient\" (UID: \"5caa14c8-5110-4246-a7d4-75ef3c6d5d00\") " pod="openstack/openstackclient" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.213580 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.426077 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8clbt\" (UniqueName: \"kubernetes.io/projected/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-kube-api-access-8clbt\") pod \"openstackclient\" (UID: \"3eaf4f04-dc97-4e86-bc56-437da55a5cd1\") " pod="openstack/openstackclient" Dec 02 18:38:40 crc kubenswrapper[4878]: E1202 18:38:40.428888 4878 projected.go:194] Error preparing data for projected volume kube-api-access-8clbt for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (3eaf4f04-dc97-4e86-bc56-437da55a5cd1) does not match the UID in record. The object might have been deleted and then recreated Dec 02 18:38:40 crc kubenswrapper[4878]: E1202 18:38:40.428970 4878 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-kube-api-access-8clbt podName:3eaf4f04-dc97-4e86-bc56-437da55a5cd1 nodeName:}" failed. No retries permitted until 2025-12-02 18:38:41.428951938 +0000 UTC m=+1431.118570819 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8clbt" (UniqueName: "kubernetes.io/projected/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-kube-api-access-8clbt") pod "openstackclient" (UID: "3eaf4f04-dc97-4e86-bc56-437da55a5cd1") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (3eaf4f04-dc97-4e86-bc56-437da55a5cd1) does not match the UID in record. The object might have been deleted and then recreated Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.615685 4878 generic.go:334] "Generic (PLEG): container finished" podID="8efbd07a-9303-4d37-8aa6-f660a919d2fd" containerID="024e0ef1c974f2d743fd83e5813cd9757bc33b2bd4060d4672fc7b8a9ecfc86a" exitCode=0 Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.615748 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" event={"ID":"8efbd07a-9303-4d37-8aa6-f660a919d2fd","Type":"ContainerDied","Data":"024e0ef1c974f2d743fd83e5813cd9757bc33b2bd4060d4672fc7b8a9ecfc86a"} Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.615958 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7b3db6e3-7437-4bea-8d93-53b8586cac40" containerName="cinder-scheduler" containerID="cri-o://990a360ba1eba09e51600934b2eded7ec476f6fa9a887f2e5fc0158a8a29e396" gracePeriod=30 Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.615993 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.616060 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7b3db6e3-7437-4bea-8d93-53b8586cac40" containerName="probe" containerID="cri-o://88e6ae84eaca3525507e67dc0c2d7a57e8c4745c15067ab748fc6326b5ea0a53" gracePeriod=30 Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.624480 4878 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="3eaf4f04-dc97-4e86-bc56-437da55a5cd1" podUID="5caa14c8-5110-4246-a7d4-75ef3c6d5d00" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.636922 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.662427 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.735145 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-combined-ca-bundle\") pod \"3eaf4f04-dc97-4e86-bc56-437da55a5cd1\" (UID: \"3eaf4f04-dc97-4e86-bc56-437da55a5cd1\") " Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.735430 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-openstack-config-secret\") pod \"3eaf4f04-dc97-4e86-bc56-437da55a5cd1\" (UID: \"3eaf4f04-dc97-4e86-bc56-437da55a5cd1\") " Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.735846 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-openstack-config\") pod \"3eaf4f04-dc97-4e86-bc56-437da55a5cd1\" (UID: \"3eaf4f04-dc97-4e86-bc56-437da55a5cd1\") " Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.736381 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "3eaf4f04-dc97-4e86-bc56-437da55a5cd1" (UID: "3eaf4f04-dc97-4e86-bc56-437da55a5cd1"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.737319 4878 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.737355 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8clbt\" (UniqueName: \"kubernetes.io/projected/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-kube-api-access-8clbt\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.748422 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3eaf4f04-dc97-4e86-bc56-437da55a5cd1" (UID: "3eaf4f04-dc97-4e86-bc56-437da55a5cd1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.748523 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "3eaf4f04-dc97-4e86-bc56-437da55a5cd1" (UID: "3eaf4f04-dc97-4e86-bc56-437da55a5cd1"). 
InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.838371 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-dns-svc\") pod \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\" (UID: \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\") " Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.838442 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-ovsdbserver-sb\") pod \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\" (UID: \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\") " Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.838753 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-ovsdbserver-nb\") pod \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\" (UID: \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\") " Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.838842 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-config\") pod \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\" (UID: \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\") " Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.839051 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdq6d\" (UniqueName: \"kubernetes.io/projected/8efbd07a-9303-4d37-8aa6-f660a919d2fd-kube-api-access-xdq6d\") pod \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\" (UID: \"8efbd07a-9303-4d37-8aa6-f660a919d2fd\") " Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.840077 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.840114 4878 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3eaf4f04-dc97-4e86-bc56-437da55a5cd1-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.860011 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8efbd07a-9303-4d37-8aa6-f660a919d2fd-kube-api-access-xdq6d" (OuterVolumeSpecName: "kube-api-access-xdq6d") pod "8efbd07a-9303-4d37-8aa6-f660a919d2fd" (UID: "8efbd07a-9303-4d37-8aa6-f660a919d2fd"). InnerVolumeSpecName "kube-api-access-xdq6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.911771 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8efbd07a-9303-4d37-8aa6-f660a919d2fd" (UID: "8efbd07a-9303-4d37-8aa6-f660a919d2fd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:38:40 crc kubenswrapper[4878]: W1202 18:38:40.916687 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5caa14c8_5110_4246_a7d4_75ef3c6d5d00.slice/crio-1687494bf6e347592eee9a425a4c14e50f0d2ac73448f4cad4cf3bbc4d634a6e WatchSource:0}: Error finding container 1687494bf6e347592eee9a425a4c14e50f0d2ac73448f4cad4cf3bbc4d634a6e: Status 404 returned error can't find the container with id 1687494bf6e347592eee9a425a4c14e50f0d2ac73448f4cad4cf3bbc4d634a6e Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.916717 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-config" (OuterVolumeSpecName: "config") pod "8efbd07a-9303-4d37-8aa6-f660a919d2fd" (UID: "8efbd07a-9303-4d37-8aa6-f660a919d2fd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.921528 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8efbd07a-9303-4d37-8aa6-f660a919d2fd" (UID: "8efbd07a-9303-4d37-8aa6-f660a919d2fd"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.931744 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.942790 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.943301 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.943316 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdq6d\" (UniqueName: \"kubernetes.io/projected/8efbd07a-9303-4d37-8aa6-f660a919d2fd-kube-api-access-xdq6d\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.943326 4878 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.960691 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eaf4f04-dc97-4e86-bc56-437da55a5cd1" path="/var/lib/kubelet/pods/3eaf4f04-dc97-4e86-bc56-437da55a5cd1/volumes" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.962301 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c17b3e4f-ede8-45e3-86df-c2a7813744de" path="/var/lib/kubelet/pods/c17b3e4f-ede8-45e3-86df-c2a7813744de/volumes" Dec 02 18:38:40 crc kubenswrapper[4878]: I1202 18:38:40.992638 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-ovsdbserver-sb" 
(OuterVolumeSpecName: "ovsdbserver-sb") pod "8efbd07a-9303-4d37-8aa6-f660a919d2fd" (UID: "8efbd07a-9303-4d37-8aa6-f660a919d2fd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:38:41 crc kubenswrapper[4878]: I1202 18:38:41.045603 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8efbd07a-9303-4d37-8aa6-f660a919d2fd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:41 crc kubenswrapper[4878]: I1202 18:38:41.632499 4878 generic.go:334] "Generic (PLEG): container finished" podID="7b3db6e3-7437-4bea-8d93-53b8586cac40" containerID="88e6ae84eaca3525507e67dc0c2d7a57e8c4745c15067ab748fc6326b5ea0a53" exitCode=0 Dec 02 18:38:41 crc kubenswrapper[4878]: I1202 18:38:41.632549 4878 generic.go:334] "Generic (PLEG): container finished" podID="7b3db6e3-7437-4bea-8d93-53b8586cac40" containerID="990a360ba1eba09e51600934b2eded7ec476f6fa9a887f2e5fc0158a8a29e396" exitCode=0 Dec 02 18:38:41 crc kubenswrapper[4878]: I1202 18:38:41.632648 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7b3db6e3-7437-4bea-8d93-53b8586cac40","Type":"ContainerDied","Data":"88e6ae84eaca3525507e67dc0c2d7a57e8c4745c15067ab748fc6326b5ea0a53"} Dec 02 18:38:41 crc kubenswrapper[4878]: I1202 18:38:41.632688 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7b3db6e3-7437-4bea-8d93-53b8586cac40","Type":"ContainerDied","Data":"990a360ba1eba09e51600934b2eded7ec476f6fa9a887f2e5fc0158a8a29e396"} Dec 02 18:38:41 crc kubenswrapper[4878]: I1202 18:38:41.634677 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5caa14c8-5110-4246-a7d4-75ef3c6d5d00","Type":"ContainerStarted","Data":"1687494bf6e347592eee9a425a4c14e50f0d2ac73448f4cad4cf3bbc4d634a6e"} Dec 02 18:38:41 crc kubenswrapper[4878]: I1202 18:38:41.638538 4878 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 02 18:38:41 crc kubenswrapper[4878]: I1202 18:38:41.640408 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" event={"ID":"8efbd07a-9303-4d37-8aa6-f660a919d2fd","Type":"ContainerDied","Data":"98ee5722fc8ebe45fac8f7bc9e540f68be347f67bbd311241af2f393676579db"} Dec 02 18:38:41 crc kubenswrapper[4878]: I1202 18:38:41.640505 4878 scope.go:117] "RemoveContainer" containerID="024e0ef1c974f2d743fd83e5813cd9757bc33b2bd4060d4672fc7b8a9ecfc86a" Dec 02 18:38:41 crc kubenswrapper[4878]: I1202 18:38:41.640440 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d649d8c65-8k5pc" Dec 02 18:38:41 crc kubenswrapper[4878]: I1202 18:38:41.649209 4878 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="3eaf4f04-dc97-4e86-bc56-437da55a5cd1" podUID="5caa14c8-5110-4246-a7d4-75ef3c6d5d00" Dec 02 18:38:41 crc kubenswrapper[4878]: I1202 18:38:41.688260 4878 scope.go:117] "RemoveContainer" containerID="4d91e62096749d9285edb5e212deb8cd1e8db4b0d47782e31dad560146ba8b07" Dec 02 18:38:41 crc kubenswrapper[4878]: I1202 18:38:41.695696 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-8k5pc"] Dec 02 18:38:41 crc kubenswrapper[4878]: I1202 18:38:41.708077 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d649d8c65-8k5pc"] Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.033206 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.177280 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-combined-ca-bundle\") pod \"7b3db6e3-7437-4bea-8d93-53b8586cac40\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.177372 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-config-data\") pod \"7b3db6e3-7437-4bea-8d93-53b8586cac40\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.177501 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b3db6e3-7437-4bea-8d93-53b8586cac40-etc-machine-id\") pod \"7b3db6e3-7437-4bea-8d93-53b8586cac40\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.177525 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzt86\" (UniqueName: \"kubernetes.io/projected/7b3db6e3-7437-4bea-8d93-53b8586cac40-kube-api-access-nzt86\") pod \"7b3db6e3-7437-4bea-8d93-53b8586cac40\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.177644 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-scripts\") pod \"7b3db6e3-7437-4bea-8d93-53b8586cac40\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.177695 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/7b3db6e3-7437-4bea-8d93-53b8586cac40-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7b3db6e3-7437-4bea-8d93-53b8586cac40" (UID: "7b3db6e3-7437-4bea-8d93-53b8586cac40"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.177756 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-config-data-custom\") pod \"7b3db6e3-7437-4bea-8d93-53b8586cac40\" (UID: \"7b3db6e3-7437-4bea-8d93-53b8586cac40\") " Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.178304 4878 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b3db6e3-7437-4bea-8d93-53b8586cac40-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.186405 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7b3db6e3-7437-4bea-8d93-53b8586cac40" (UID: "7b3db6e3-7437-4bea-8d93-53b8586cac40"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.186546 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b3db6e3-7437-4bea-8d93-53b8586cac40-kube-api-access-nzt86" (OuterVolumeSpecName: "kube-api-access-nzt86") pod "7b3db6e3-7437-4bea-8d93-53b8586cac40" (UID: "7b3db6e3-7437-4bea-8d93-53b8586cac40"). InnerVolumeSpecName "kube-api-access-nzt86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.237426 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-scripts" (OuterVolumeSpecName: "scripts") pod "7b3db6e3-7437-4bea-8d93-53b8586cac40" (UID: "7b3db6e3-7437-4bea-8d93-53b8586cac40"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.280154 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzt86\" (UniqueName: \"kubernetes.io/projected/7b3db6e3-7437-4bea-8d93-53b8586cac40-kube-api-access-nzt86\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.280191 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.280201 4878 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.294516 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b3db6e3-7437-4bea-8d93-53b8586cac40" (UID: "7b3db6e3-7437-4bea-8d93-53b8586cac40"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.354090 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-config-data" (OuterVolumeSpecName: "config-data") pod "7b3db6e3-7437-4bea-8d93-53b8586cac40" (UID: "7b3db6e3-7437-4bea-8d93-53b8586cac40"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.382264 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.382318 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b3db6e3-7437-4bea-8d93-53b8586cac40-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.666481 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7b3db6e3-7437-4bea-8d93-53b8586cac40","Type":"ContainerDied","Data":"5f3bd674799ea7de6404941d34582f022c70c3700ba1a1136c521431666b54f0"} Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.666587 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.666866 4878 scope.go:117] "RemoveContainer" containerID="88e6ae84eaca3525507e67dc0c2d7a57e8c4745c15067ab748fc6326b5ea0a53" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.700086 4878 scope.go:117] "RemoveContainer" containerID="990a360ba1eba09e51600934b2eded7ec476f6fa9a887f2e5fc0158a8a29e396" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.703702 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.718094 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.742381 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 18:38:42 crc kubenswrapper[4878]: E1202 18:38:42.743104 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b3db6e3-7437-4bea-8d93-53b8586cac40" containerName="cinder-scheduler" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.743135 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b3db6e3-7437-4bea-8d93-53b8586cac40" containerName="cinder-scheduler" Dec 02 18:38:42 crc kubenswrapper[4878]: E1202 18:38:42.743172 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b3db6e3-7437-4bea-8d93-53b8586cac40" containerName="probe" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.743181 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b3db6e3-7437-4bea-8d93-53b8586cac40" containerName="probe" Dec 02 18:38:42 crc kubenswrapper[4878]: E1202 18:38:42.743210 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8efbd07a-9303-4d37-8aa6-f660a919d2fd" containerName="dnsmasq-dns" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.743219 4878 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8efbd07a-9303-4d37-8aa6-f660a919d2fd" containerName="dnsmasq-dns" Dec 02 18:38:42 crc kubenswrapper[4878]: E1202 18:38:42.743272 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8efbd07a-9303-4d37-8aa6-f660a919d2fd" containerName="init" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.743282 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8efbd07a-9303-4d37-8aa6-f660a919d2fd" containerName="init" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.743579 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="8efbd07a-9303-4d37-8aa6-f660a919d2fd" containerName="dnsmasq-dns" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.743611 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b3db6e3-7437-4bea-8d93-53b8586cac40" containerName="cinder-scheduler" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.743638 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b3db6e3-7437-4bea-8d93-53b8586cac40" containerName="probe" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.745283 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.749052 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.758073 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.856475 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.893900 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn82v\" (UniqueName: \"kubernetes.io/projected/5cea7d1e-f0d6-4a27-9840-1ce77743b26d-kube-api-access-cn82v\") pod \"cinder-scheduler-0\" (UID: \"5cea7d1e-f0d6-4a27-9840-1ce77743b26d\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.893965 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cea7d1e-f0d6-4a27-9840-1ce77743b26d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5cea7d1e-f0d6-4a27-9840-1ce77743b26d\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.894011 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cea7d1e-f0d6-4a27-9840-1ce77743b26d-config-data\") pod \"cinder-scheduler-0\" (UID: \"5cea7d1e-f0d6-4a27-9840-1ce77743b26d\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.894157 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cea7d1e-f0d6-4a27-9840-1ce77743b26d-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"5cea7d1e-f0d6-4a27-9840-1ce77743b26d\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.894261 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5cea7d1e-f0d6-4a27-9840-1ce77743b26d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5cea7d1e-f0d6-4a27-9840-1ce77743b26d\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.894284 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cea7d1e-f0d6-4a27-9840-1ce77743b26d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5cea7d1e-f0d6-4a27-9840-1ce77743b26d\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.954924 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b3db6e3-7437-4bea-8d93-53b8586cac40" path="/var/lib/kubelet/pods/7b3db6e3-7437-4bea-8d93-53b8586cac40/volumes" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.956373 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8efbd07a-9303-4d37-8aa6-f660a919d2fd" path="/var/lib/kubelet/pods/8efbd07a-9303-4d37-8aa6-f660a919d2fd/volumes" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.996186 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5cea7d1e-f0d6-4a27-9840-1ce77743b26d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5cea7d1e-f0d6-4a27-9840-1ce77743b26d\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.996231 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/5cea7d1e-f0d6-4a27-9840-1ce77743b26d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5cea7d1e-f0d6-4a27-9840-1ce77743b26d\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.996370 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn82v\" (UniqueName: \"kubernetes.io/projected/5cea7d1e-f0d6-4a27-9840-1ce77743b26d-kube-api-access-cn82v\") pod \"cinder-scheduler-0\" (UID: \"5cea7d1e-f0d6-4a27-9840-1ce77743b26d\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.996400 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cea7d1e-f0d6-4a27-9840-1ce77743b26d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5cea7d1e-f0d6-4a27-9840-1ce77743b26d\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.996428 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cea7d1e-f0d6-4a27-9840-1ce77743b26d-config-data\") pod \"cinder-scheduler-0\" (UID: \"5cea7d1e-f0d6-4a27-9840-1ce77743b26d\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.996533 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cea7d1e-f0d6-4a27-9840-1ce77743b26d-scripts\") pod \"cinder-scheduler-0\" (UID: \"5cea7d1e-f0d6-4a27-9840-1ce77743b26d\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:42 crc kubenswrapper[4878]: I1202 18:38:42.997030 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5cea7d1e-f0d6-4a27-9840-1ce77743b26d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5cea7d1e-f0d6-4a27-9840-1ce77743b26d\") " 
pod="openstack/cinder-scheduler-0" Dec 02 18:38:43 crc kubenswrapper[4878]: I1202 18:38:43.009931 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cea7d1e-f0d6-4a27-9840-1ce77743b26d-config-data\") pod \"cinder-scheduler-0\" (UID: \"5cea7d1e-f0d6-4a27-9840-1ce77743b26d\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:43 crc kubenswrapper[4878]: I1202 18:38:43.010146 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cea7d1e-f0d6-4a27-9840-1ce77743b26d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5cea7d1e-f0d6-4a27-9840-1ce77743b26d\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:43 crc kubenswrapper[4878]: I1202 18:38:43.014909 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cea7d1e-f0d6-4a27-9840-1ce77743b26d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5cea7d1e-f0d6-4a27-9840-1ce77743b26d\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:43 crc kubenswrapper[4878]: I1202 18:38:43.031121 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cea7d1e-f0d6-4a27-9840-1ce77743b26d-scripts\") pod \"cinder-scheduler-0\" (UID: \"5cea7d1e-f0d6-4a27-9840-1ce77743b26d\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:43 crc kubenswrapper[4878]: I1202 18:38:43.057393 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn82v\" (UniqueName: \"kubernetes.io/projected/5cea7d1e-f0d6-4a27-9840-1ce77743b26d-kube-api-access-cn82v\") pod \"cinder-scheduler-0\" (UID: \"5cea7d1e-f0d6-4a27-9840-1ce77743b26d\") " pod="openstack/cinder-scheduler-0" Dec 02 18:38:43 crc kubenswrapper[4878]: I1202 18:38:43.126568 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 18:38:43 crc kubenswrapper[4878]: W1202 18:38:43.912328 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cea7d1e_f0d6_4a27_9840_1ce77743b26d.slice/crio-b2a097a53a81318954bc1c885b02af5ebb765d51defb1a6966ccdf2c1171aac9 WatchSource:0}: Error finding container b2a097a53a81318954bc1c885b02af5ebb765d51defb1a6966ccdf2c1171aac9: Status 404 returned error can't find the container with id b2a097a53a81318954bc1c885b02af5ebb765d51defb1a6966ccdf2c1171aac9 Dec 02 18:38:43 crc kubenswrapper[4878]: I1202 18:38:43.914362 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 18:38:44 crc kubenswrapper[4878]: I1202 18:38:44.707166 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5cea7d1e-f0d6-4a27-9840-1ce77743b26d","Type":"ContainerStarted","Data":"3791211dc9e92642c7ae43aafd749752b1a16e434bffc630e1cdc0b51c2f5c09"} Dec 02 18:38:44 crc kubenswrapper[4878]: I1202 18:38:44.707833 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5cea7d1e-f0d6-4a27-9840-1ce77743b26d","Type":"ContainerStarted","Data":"b2a097a53a81318954bc1c885b02af5ebb765d51defb1a6966ccdf2c1171aac9"} Dec 02 18:38:45 crc kubenswrapper[4878]: I1202 18:38:45.723727 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5cea7d1e-f0d6-4a27-9840-1ce77743b26d","Type":"ContainerStarted","Data":"56aa610e307e83657ad50f929269bb91ce6341a3ae819d9d3d804dd3185aebcf"} Dec 02 18:38:45 crc kubenswrapper[4878]: I1202 18:38:45.757877 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.757848417 podStartE2EDuration="3.757848417s" podCreationTimestamp="2025-12-02 18:38:42 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:38:45.748086111 +0000 UTC m=+1435.437704992" watchObservedRunningTime="2025-12-02 18:38:45.757848417 +0000 UTC m=+1435.447467298" Dec 02 18:38:46 crc kubenswrapper[4878]: I1202 18:38:46.114772 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:38:46 crc kubenswrapper[4878]: I1202 18:38:46.115531 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerName="ceilometer-central-agent" containerID="cri-o://cba5c992af90a860edb1e71507dee2592ba3ec291c6acf5af10c0d5d056f5420" gracePeriod=30 Dec 02 18:38:46 crc kubenswrapper[4878]: I1202 18:38:46.116584 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerName="proxy-httpd" containerID="cri-o://2549639c96776d6df03159db76377922bf774a26eef4efaeb13efbf42de94983" gracePeriod=30 Dec 02 18:38:46 crc kubenswrapper[4878]: I1202 18:38:46.116671 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerName="sg-core" containerID="cri-o://5bb8357ad01c4521f791b9314dca2054ded3eb44444c45f02438ff7a1cc618e9" gracePeriod=30 Dec 02 18:38:46 crc kubenswrapper[4878]: I1202 18:38:46.116724 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerName="ceilometer-notification-agent" containerID="cri-o://ffa5d6769d2f8e9ae8403272366b710f29b469605473dc2b96d03758fbf74656" gracePeriod=30 Dec 02 18:38:46 crc kubenswrapper[4878]: I1202 18:38:46.137682 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="381b845a-80e0-4848-a5b6-f125d9d0cc60" 
containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.199:3000/\": EOF" Dec 02 18:38:46 crc kubenswrapper[4878]: I1202 18:38:46.740727 4878 generic.go:334] "Generic (PLEG): container finished" podID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerID="2549639c96776d6df03159db76377922bf774a26eef4efaeb13efbf42de94983" exitCode=0 Dec 02 18:38:46 crc kubenswrapper[4878]: I1202 18:38:46.740759 4878 generic.go:334] "Generic (PLEG): container finished" podID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerID="5bb8357ad01c4521f791b9314dca2054ded3eb44444c45f02438ff7a1cc618e9" exitCode=2 Dec 02 18:38:46 crc kubenswrapper[4878]: I1202 18:38:46.741937 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"381b845a-80e0-4848-a5b6-f125d9d0cc60","Type":"ContainerDied","Data":"2549639c96776d6df03159db76377922bf774a26eef4efaeb13efbf42de94983"} Dec 02 18:38:46 crc kubenswrapper[4878]: I1202 18:38:46.741973 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"381b845a-80e0-4848-a5b6-f125d9d0cc60","Type":"ContainerDied","Data":"5bb8357ad01c4521f791b9314dca2054ded3eb44444c45f02438ff7a1cc618e9"} Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.383745 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-54fd7cfcc9-x4n56"] Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.387139 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.389726 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.389935 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.391537 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.397798 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-54fd7cfcc9-x4n56"] Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.528972 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c033144d-0cad-47bd-87b6-3715278cf5c1-etc-swift\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.530293 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c033144d-0cad-47bd-87b6-3715278cf5c1-public-tls-certs\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.530395 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf46r\" (UniqueName: \"kubernetes.io/projected/c033144d-0cad-47bd-87b6-3715278cf5c1-kube-api-access-hf46r\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc 
kubenswrapper[4878]: I1202 18:38:47.530465 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c033144d-0cad-47bd-87b6-3715278cf5c1-config-data\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.530613 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c033144d-0cad-47bd-87b6-3715278cf5c1-log-httpd\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.530733 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c033144d-0cad-47bd-87b6-3715278cf5c1-internal-tls-certs\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.530814 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c033144d-0cad-47bd-87b6-3715278cf5c1-run-httpd\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.530925 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c033144d-0cad-47bd-87b6-3715278cf5c1-combined-ca-bundle\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " 
pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.633419 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c033144d-0cad-47bd-87b6-3715278cf5c1-log-httpd\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.633492 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c033144d-0cad-47bd-87b6-3715278cf5c1-internal-tls-certs\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.633520 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c033144d-0cad-47bd-87b6-3715278cf5c1-run-httpd\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.633575 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c033144d-0cad-47bd-87b6-3715278cf5c1-combined-ca-bundle\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.633642 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c033144d-0cad-47bd-87b6-3715278cf5c1-etc-swift\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc 
kubenswrapper[4878]: I1202 18:38:47.633718 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c033144d-0cad-47bd-87b6-3715278cf5c1-public-tls-certs\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.633741 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf46r\" (UniqueName: \"kubernetes.io/projected/c033144d-0cad-47bd-87b6-3715278cf5c1-kube-api-access-hf46r\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.633760 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c033144d-0cad-47bd-87b6-3715278cf5c1-config-data\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.634125 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c033144d-0cad-47bd-87b6-3715278cf5c1-log-httpd\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.634713 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c033144d-0cad-47bd-87b6-3715278cf5c1-run-httpd\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.640530 4878 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c033144d-0cad-47bd-87b6-3715278cf5c1-etc-swift\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.646975 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c033144d-0cad-47bd-87b6-3715278cf5c1-internal-tls-certs\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.647020 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c033144d-0cad-47bd-87b6-3715278cf5c1-config-data\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.655990 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c033144d-0cad-47bd-87b6-3715278cf5c1-public-tls-certs\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.658885 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf46r\" (UniqueName: \"kubernetes.io/projected/c033144d-0cad-47bd-87b6-3715278cf5c1-kube-api-access-hf46r\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.667540 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c033144d-0cad-47bd-87b6-3715278cf5c1-combined-ca-bundle\") pod \"swift-proxy-54fd7cfcc9-x4n56\" (UID: \"c033144d-0cad-47bd-87b6-3715278cf5c1\") " pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.717287 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.777948 4878 generic.go:334] "Generic (PLEG): container finished" podID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerID="cba5c992af90a860edb1e71507dee2592ba3ec291c6acf5af10c0d5d056f5420" exitCode=0 Dec 02 18:38:47 crc kubenswrapper[4878]: I1202 18:38:47.777989 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"381b845a-80e0-4848-a5b6-f125d9d0cc60","Type":"ContainerDied","Data":"cba5c992af90a860edb1e71507dee2592ba3ec291c6acf5af10c0d5d056f5420"} Dec 02 18:38:48 crc kubenswrapper[4878]: I1202 18:38:48.126749 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 02 18:38:49 crc kubenswrapper[4878]: I1202 18:38:49.805843 4878 generic.go:334] "Generic (PLEG): container finished" podID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerID="ffa5d6769d2f8e9ae8403272366b710f29b469605473dc2b96d03758fbf74656" exitCode=0 Dec 02 18:38:49 crc kubenswrapper[4878]: I1202 18:38:49.806321 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"381b845a-80e0-4848-a5b6-f125d9d0cc60","Type":"ContainerDied","Data":"ffa5d6769d2f8e9ae8403272366b710f29b469605473dc2b96d03758fbf74656"} Dec 02 18:38:50 crc kubenswrapper[4878]: I1202 18:38:50.020337 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 18:38:50 crc kubenswrapper[4878]: I1202 18:38:50.020823 4878 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="e4cdcc03-9890-4704-a31b-e8f8858140a5" containerName="glance-log" containerID="cri-o://d41dad766c59745a147f4023eefa4a896098b8792ec0eb5d76efa27e7f4db6e3" gracePeriod=30 Dec 02 18:38:50 crc kubenswrapper[4878]: I1202 18:38:50.020881 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e4cdcc03-9890-4704-a31b-e8f8858140a5" containerName="glance-httpd" containerID="cri-o://3719d30fd4b727fd8781ab06800fe1878b068f02080b2ae83116f881d227dc6d" gracePeriod=30 Dec 02 18:38:50 crc kubenswrapper[4878]: I1202 18:38:50.833614 4878 generic.go:334] "Generic (PLEG): container finished" podID="e4cdcc03-9890-4704-a31b-e8f8858140a5" containerID="d41dad766c59745a147f4023eefa4a896098b8792ec0eb5d76efa27e7f4db6e3" exitCode=143 Dec 02 18:38:50 crc kubenswrapper[4878]: I1202 18:38:50.833656 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e4cdcc03-9890-4704-a31b-e8f8858140a5","Type":"ContainerDied","Data":"d41dad766c59745a147f4023eefa4a896098b8792ec0eb5d76efa27e7f4db6e3"} Dec 02 18:38:52 crc kubenswrapper[4878]: I1202 18:38:52.019831 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 18:38:52 crc kubenswrapper[4878]: I1202 18:38:52.020570 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="213894fc-d7f0-4fd6-9c48-83b91a9b7872" containerName="glance-log" containerID="cri-o://3e9fa0c715b7de0418b77e8b9910cebd2ba5173155f015d58b3ffe66bc448d5c" gracePeriod=30 Dec 02 18:38:52 crc kubenswrapper[4878]: I1202 18:38:52.020684 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="213894fc-d7f0-4fd6-9c48-83b91a9b7872" containerName="glance-httpd" 
containerID="cri-o://2cc03287f80e41bc214a8dfa1431602a67e1e5b9e390c57def58e942b9732ed0" gracePeriod=30 Dec 02 18:38:52 crc kubenswrapper[4878]: I1202 18:38:52.076635 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.199:3000/\": dial tcp 10.217.0.199:3000: connect: connection refused" Dec 02 18:38:52 crc kubenswrapper[4878]: I1202 18:38:52.875556 4878 generic.go:334] "Generic (PLEG): container finished" podID="213894fc-d7f0-4fd6-9c48-83b91a9b7872" containerID="3e9fa0c715b7de0418b77e8b9910cebd2ba5173155f015d58b3ffe66bc448d5c" exitCode=143 Dec 02 18:38:52 crc kubenswrapper[4878]: I1202 18:38:52.875617 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"213894fc-d7f0-4fd6-9c48-83b91a9b7872","Type":"ContainerDied","Data":"3e9fa0c715b7de0418b77e8b9910cebd2ba5173155f015d58b3ffe66bc448d5c"} Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.346589 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.518177 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-scripts\") pod \"381b845a-80e0-4848-a5b6-f125d9d0cc60\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.518684 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkfc8\" (UniqueName: \"kubernetes.io/projected/381b845a-80e0-4848-a5b6-f125d9d0cc60-kube-api-access-pkfc8\") pod \"381b845a-80e0-4848-a5b6-f125d9d0cc60\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.518703 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-config-data\") pod \"381b845a-80e0-4848-a5b6-f125d9d0cc60\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.518739 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/381b845a-80e0-4848-a5b6-f125d9d0cc60-log-httpd\") pod \"381b845a-80e0-4848-a5b6-f125d9d0cc60\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.518767 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-sg-core-conf-yaml\") pod \"381b845a-80e0-4848-a5b6-f125d9d0cc60\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.518783 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-combined-ca-bundle\") pod \"381b845a-80e0-4848-a5b6-f125d9d0cc60\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.518843 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/381b845a-80e0-4848-a5b6-f125d9d0cc60-run-httpd\") pod \"381b845a-80e0-4848-a5b6-f125d9d0cc60\" (UID: \"381b845a-80e0-4848-a5b6-f125d9d0cc60\") " Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.520089 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/381b845a-80e0-4848-a5b6-f125d9d0cc60-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "381b845a-80e0-4848-a5b6-f125d9d0cc60" (UID: "381b845a-80e0-4848-a5b6-f125d9d0cc60"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.525154 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/381b845a-80e0-4848-a5b6-f125d9d0cc60-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "381b845a-80e0-4848-a5b6-f125d9d0cc60" (UID: "381b845a-80e0-4848-a5b6-f125d9d0cc60"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.527224 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-scripts" (OuterVolumeSpecName: "scripts") pod "381b845a-80e0-4848-a5b6-f125d9d0cc60" (UID: "381b845a-80e0-4848-a5b6-f125d9d0cc60"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.539620 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/381b845a-80e0-4848-a5b6-f125d9d0cc60-kube-api-access-pkfc8" (OuterVolumeSpecName: "kube-api-access-pkfc8") pod "381b845a-80e0-4848-a5b6-f125d9d0cc60" (UID: "381b845a-80e0-4848-a5b6-f125d9d0cc60"). InnerVolumeSpecName "kube-api-access-pkfc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.602643 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "381b845a-80e0-4848-a5b6-f125d9d0cc60" (UID: "381b845a-80e0-4848-a5b6-f125d9d0cc60"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.614704 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.621905 4878 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/381b845a-80e0-4848-a5b6-f125d9d0cc60-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.621943 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.623393 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkfc8\" (UniqueName: \"kubernetes.io/projected/381b845a-80e0-4848-a5b6-f125d9d0cc60-kube-api-access-pkfc8\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.623411 4878 
reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/381b845a-80e0-4848-a5b6-f125d9d0cc60-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.623430 4878 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.657663 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "381b845a-80e0-4848-a5b6-f125d9d0cc60" (UID: "381b845a-80e0-4848-a5b6-f125d9d0cc60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.706264 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-config-data" (OuterVolumeSpecName: "config-data") pod "381b845a-80e0-4848-a5b6-f125d9d0cc60" (UID: "381b845a-80e0-4848-a5b6-f125d9d0cc60"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.715046 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-54fd7cfcc9-x4n56"] Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.726180 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.726212 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381b845a-80e0-4848-a5b6-f125d9d0cc60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.741981 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.742035 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.919749 4878 generic.go:334] "Generic (PLEG): container finished" podID="e4cdcc03-9890-4704-a31b-e8f8858140a5" containerID="3719d30fd4b727fd8781ab06800fe1878b068f02080b2ae83116f881d227dc6d" exitCode=0 Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.919859 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"e4cdcc03-9890-4704-a31b-e8f8858140a5","Type":"ContainerDied","Data":"3719d30fd4b727fd8781ab06800fe1878b068f02080b2ae83116f881d227dc6d"} Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.920324 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e4cdcc03-9890-4704-a31b-e8f8858140a5","Type":"ContainerDied","Data":"d95c43b1a41fd9c3d82e36574b76c9d86153286d8ab56ef929caec39788920a5"} Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.920359 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d95c43b1a41fd9c3d82e36574b76c9d86153286d8ab56ef929caec39788920a5" Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.922399 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-54fd7cfcc9-x4n56" event={"ID":"c033144d-0cad-47bd-87b6-3715278cf5c1","Type":"ContainerStarted","Data":"c1785a1234b71cb032130e38368d2d1aac9d2126be1ed77177ca50d76bc86307"} Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.923906 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5caa14c8-5110-4246-a7d4-75ef3c6d5d00","Type":"ContainerStarted","Data":"3aea77d77b8ec0f1319badd90ec389ecc2db56eec908dbb7df44ebe00b83d67a"} Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.928275 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.928143 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"381b845a-80e0-4848-a5b6-f125d9d0cc60","Type":"ContainerDied","Data":"6c28dfa458319d251cadfd33c4b01d054ce2ffee1e8f055d419ea725107cee9a"} Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.938494 4878 scope.go:117] "RemoveContainer" containerID="2549639c96776d6df03159db76377922bf774a26eef4efaeb13efbf42de94983" Dec 02 18:38:53 crc kubenswrapper[4878]: I1202 18:38:53.953360 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.892628446 podStartE2EDuration="14.953338677s" podCreationTimestamp="2025-12-02 18:38:39 +0000 UTC" firstStartedPulling="2025-12-02 18:38:40.926120151 +0000 UTC m=+1430.615739032" lastFinishedPulling="2025-12-02 18:38:52.986830382 +0000 UTC m=+1442.676449263" observedRunningTime="2025-12-02 18:38:53.946949906 +0000 UTC m=+1443.636568787" watchObservedRunningTime="2025-12-02 18:38:53.953338677 +0000 UTC m=+1443.642957558" Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.039245 4878 scope.go:117] "RemoveContainer" containerID="5bb8357ad01c4521f791b9314dca2054ded3eb44444c45f02438ff7a1cc618e9" Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.040098 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.072310 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.078670 4878 scope.go:117] "RemoveContainer" containerID="ffa5d6769d2f8e9ae8403272366b710f29b469605473dc2b96d03758fbf74656"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.117428 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.135309 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"e4cdcc03-9890-4704-a31b-e8f8858140a5\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") "
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.135429 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-scripts\") pod \"e4cdcc03-9890-4704-a31b-e8f8858140a5\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") "
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.135523 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4cdcc03-9890-4704-a31b-e8f8858140a5-httpd-run\") pod \"e4cdcc03-9890-4704-a31b-e8f8858140a5\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") "
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.135787 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-public-tls-certs\") pod \"e4cdcc03-9890-4704-a31b-e8f8858140a5\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") "
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.135838 4878 reconciler_common.go:159]
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-combined-ca-bundle\") pod \"e4cdcc03-9890-4704-a31b-e8f8858140a5\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") "
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.135909 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4cdcc03-9890-4704-a31b-e8f8858140a5-logs\") pod \"e4cdcc03-9890-4704-a31b-e8f8858140a5\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") "
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.135943 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7db5f\" (UniqueName: \"kubernetes.io/projected/e4cdcc03-9890-4704-a31b-e8f8858140a5-kube-api-access-7db5f\") pod \"e4cdcc03-9890-4704-a31b-e8f8858140a5\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") "
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.135992 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-config-data\") pod \"e4cdcc03-9890-4704-a31b-e8f8858140a5\" (UID: \"e4cdcc03-9890-4704-a31b-e8f8858140a5\") "
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.136413 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4cdcc03-9890-4704-a31b-e8f8858140a5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e4cdcc03-9890-4704-a31b-e8f8858140a5" (UID: "e4cdcc03-9890-4704-a31b-e8f8858140a5"). InnerVolumeSpecName "httpd-run".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.136985 4878 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4cdcc03-9890-4704-a31b-e8f8858140a5-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.142119 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4cdcc03-9890-4704-a31b-e8f8858140a5-logs" (OuterVolumeSpecName: "logs") pod "e4cdcc03-9890-4704-a31b-e8f8858140a5" (UID: "e4cdcc03-9890-4704-a31b-e8f8858140a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.152056 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "e4cdcc03-9890-4704-a31b-e8f8858140a5" (UID: "e4cdcc03-9890-4704-a31b-e8f8858140a5"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.164607 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-scripts" (OuterVolumeSpecName: "scripts") pod "e4cdcc03-9890-4704-a31b-e8f8858140a5" (UID: "e4cdcc03-9890-4704-a31b-e8f8858140a5"). InnerVolumeSpecName "scripts".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.166463 4878 scope.go:117] "RemoveContainer" containerID="cba5c992af90a860edb1e71507dee2592ba3ec291c6acf5af10c0d5d056f5420"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.166635 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 02 18:38:54 crc kubenswrapper[4878]: E1202 18:38:54.167189 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerName="ceilometer-notification-agent"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.167245 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerName="ceilometer-notification-agent"
Dec 02 18:38:54 crc kubenswrapper[4878]: E1202 18:38:54.167264 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerName="proxy-httpd"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.167270 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerName="proxy-httpd"
Dec 02 18:38:54 crc kubenswrapper[4878]: E1202 18:38:54.167290 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4cdcc03-9890-4704-a31b-e8f8858140a5" containerName="glance-log"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.167295 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4cdcc03-9890-4704-a31b-e8f8858140a5" containerName="glance-log"
Dec 02 18:38:54 crc kubenswrapper[4878]: E1202 18:38:54.167322 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4cdcc03-9890-4704-a31b-e8f8858140a5" containerName="glance-httpd"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.167329 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4cdcc03-9890-4704-a31b-e8f8858140a5" containerName="glance-httpd"
Dec 02 18:38:54 crc
kubenswrapper[4878]: E1202 18:38:54.167353 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerName="sg-core"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.167359 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerName="sg-core"
Dec 02 18:38:54 crc kubenswrapper[4878]: E1202 18:38:54.167389 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerName="ceilometer-central-agent"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.167395 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerName="ceilometer-central-agent"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.167602 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4cdcc03-9890-4704-a31b-e8f8858140a5" containerName="glance-httpd"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.167621 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerName="sg-core"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.167636 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerName="proxy-httpd"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.167648 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4cdcc03-9890-4704-a31b-e8f8858140a5" containerName="glance-log"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.167661 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerName="ceilometer-notification-agent"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.167672 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="381b845a-80e0-4848-a5b6-f125d9d0cc60" containerName="ceilometer-central-agent"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.183864 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.194417 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4cdcc03-9890-4704-a31b-e8f8858140a5-kube-api-access-7db5f" (OuterVolumeSpecName: "kube-api-access-7db5f") pod "e4cdcc03-9890-4704-a31b-e8f8858140a5" (UID: "e4cdcc03-9890-4704-a31b-e8f8858140a5"). InnerVolumeSpecName "kube-api-access-7db5f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.205634 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.205905 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.239011 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.239399 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ea05342-9ff1-4890-91c2-8ca68502816d-log-httpd\") pod \"ceilometer-0\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.239437 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-config-data\") pod \"ceilometer-0\" (UID:
\"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.239493 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-scripts\") pod \"ceilometer-0\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.239550 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ea05342-9ff1-4890-91c2-8ca68502816d-run-httpd\") pod \"ceilometer-0\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.239580 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.239600 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k86xg\" (UniqueName: \"kubernetes.io/projected/6ea05342-9ff1-4890-91c2-8ca68502816d-kube-api-access-k86xg\") pod \"ceilometer-0\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.239661 4878 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4cdcc03-9890-4704-a31b-e8f8858140a5-logs\") on node \"crc\" DevicePath \"\""
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.239671 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7db5f\" (UniqueName:
\"kubernetes.io/projected/e4cdcc03-9890-4704-a31b-e8f8858140a5-kube-api-access-7db5f\") on node \"crc\" DevicePath \"\""
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.239693 4878 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" "
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.239702 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.276569 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4cdcc03-9890-4704-a31b-e8f8858140a5" (UID: "e4cdcc03-9890-4704-a31b-e8f8858140a5"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.302328 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.341579 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ea05342-9ff1-4890-91c2-8ca68502816d-run-httpd\") pod \"ceilometer-0\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.341667 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.341697 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k86xg\" (UniqueName: \"kubernetes.io/projected/6ea05342-9ff1-4890-91c2-8ca68502816d-kube-api-access-k86xg\") pod \"ceilometer-0\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.341902 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.341982 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ea05342-9ff1-4890-91c2-8ca68502816d-log-httpd\") pod \"ceilometer-0\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02
18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.342023 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-config-data\") pod \"ceilometer-0\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.342143 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-scripts\") pod \"ceilometer-0\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.342271 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.342403 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ea05342-9ff1-4890-91c2-8ca68502816d-run-httpd\") pod \"ceilometer-0\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.343539 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ea05342-9ff1-4890-91c2-8ca68502816d-log-httpd\") pod \"ceilometer-0\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.369693 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02
18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.370389 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.370833 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-scripts\") pod \"ceilometer-0\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.387095 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e4cdcc03-9890-4704-a31b-e8f8858140a5" (UID: "e4cdcc03-9890-4704-a31b-e8f8858140a5"). InnerVolumeSpecName "public-tls-certs".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.388881 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k86xg\" (UniqueName: \"kubernetes.io/projected/6ea05342-9ff1-4890-91c2-8ca68502816d-kube-api-access-k86xg\") pod \"ceilometer-0\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.390092 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-config-data\") pod \"ceilometer-0\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.413265 4878 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.445637 4878 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\""
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.445673 4878 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.518394 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-config-data" (OuterVolumeSpecName: "config-data") pod "e4cdcc03-9890-4704-a31b-e8f8858140a5" (UID: "e4cdcc03-9890-4704-a31b-e8f8858140a5"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.554435 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4cdcc03-9890-4704-a31b-e8f8858140a5-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.581019 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.582159 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.948822 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.956209 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="381b845a-80e0-4848-a5b6-f125d9d0cc60" path="/var/lib/kubelet/pods/381b845a-80e0-4848-a5b6-f125d9d0cc60/volumes"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.958323 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-54fd7cfcc9-x4n56"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.958349 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-54fd7cfcc9-x4n56"
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.958363 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-54fd7cfcc9-x4n56" event={"ID":"c033144d-0cad-47bd-87b6-3715278cf5c1","Type":"ContainerStarted","Data":"03835b7f90626e28e6f732089eb738e22178688bbe07b9fb548fa7f4fef28393"}
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.958379 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-54fd7cfcc9-x4n56"
event={"ID":"c033144d-0cad-47bd-87b6-3715278cf5c1","Type":"ContainerStarted","Data":"f3086f9e30f701e3542b512f3c73badb26d7b318013a42dd64b9c63834e434fa"}
Dec 02 18:38:54 crc kubenswrapper[4878]: I1202 18:38:54.992131 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-54fd7cfcc9-x4n56" podStartSLOduration=7.992109477 podStartE2EDuration="7.992109477s" podCreationTimestamp="2025-12-02 18:38:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:38:54.980662428 +0000 UTC m=+1444.670281329" watchObservedRunningTime="2025-12-02 18:38:54.992109477 +0000 UTC m=+1444.681728358"
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.027217 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.057288 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.078189 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.080899 4878 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.084446 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.084482 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.096849 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.176460 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.277591 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2l98\" (UniqueName: \"kubernetes.io/projected/dc42b137-3b4d-4673-8f42-e1fd55534c16-kube-api-access-g2l98\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0"
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.277751 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc42b137-3b4d-4673-8f42-e1fd55534c16-scripts\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0"
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.277795 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc42b137-3b4d-4673-8f42-e1fd55534c16-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0"
Dec 02 18:38:55 crc
kubenswrapper[4878]: I1202 18:38:55.277889 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc42b137-3b4d-4673-8f42-e1fd55534c16-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0"
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.278004 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc42b137-3b4d-4673-8f42-e1fd55534c16-config-data\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0"
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.278404 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0"
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.278450 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc42b137-3b4d-4673-8f42-e1fd55534c16-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0"
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.278549 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc42b137-3b4d-4673-8f42-e1fd55534c16-logs\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0"
Dec 02 18:38:55 crc
kubenswrapper[4878]: I1202 18:38:55.380928 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2l98\" (UniqueName: \"kubernetes.io/projected/dc42b137-3b4d-4673-8f42-e1fd55534c16-kube-api-access-g2l98\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0"
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.381014 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc42b137-3b4d-4673-8f42-e1fd55534c16-scripts\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0"
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.381040 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc42b137-3b4d-4673-8f42-e1fd55534c16-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0"
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.381098 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc42b137-3b4d-4673-8f42-e1fd55534c16-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0"
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.381169 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc42b137-3b4d-4673-8f42-e1fd55534c16-config-data\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0"
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.382346 4878
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc42b137-3b4d-4673-8f42-e1fd55534c16-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0"
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.382592 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0"
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.382638 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc42b137-3b4d-4673-8f42-e1fd55534c16-logs\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0"
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.382899 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc42b137-3b4d-4673-8f42-e1fd55534c16-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0"
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.383290 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0"
Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.402914 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\"
(UniqueName: \"kubernetes.io/secret/dc42b137-3b4d-4673-8f42-e1fd55534c16-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.402977 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc42b137-3b4d-4673-8f42-e1fd55534c16-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.403198 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc42b137-3b4d-4673-8f42-e1fd55534c16-logs\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.403656 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc42b137-3b4d-4673-8f42-e1fd55534c16-config-data\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.414290 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc42b137-3b4d-4673-8f42-e1fd55534c16-scripts\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.429776 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2l98\" (UniqueName: \"kubernetes.io/projected/dc42b137-3b4d-4673-8f42-e1fd55534c16-kube-api-access-g2l98\") pod 
\"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.523025 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"dc42b137-3b4d-4673-8f42-e1fd55534c16\") " pod="openstack/glance-default-external-api-0" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.685691 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.717626 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.726604 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-internal-tls-certs\") pod \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.726698 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.726731 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-config-data\") pod \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.726786 4878 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/213894fc-d7f0-4fd6-9c48-83b91a9b7872-httpd-run\") pod \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.726854 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-combined-ca-bundle\") pod \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.726950 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/213894fc-d7f0-4fd6-9c48-83b91a9b7872-logs\") pod \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.726989 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgkv2\" (UniqueName: \"kubernetes.io/projected/213894fc-d7f0-4fd6-9c48-83b91a9b7872-kube-api-access-xgkv2\") pod \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.727017 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-scripts\") pod \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.730661 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/213894fc-d7f0-4fd6-9c48-83b91a9b7872-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "213894fc-d7f0-4fd6-9c48-83b91a9b7872" (UID: "213894fc-d7f0-4fd6-9c48-83b91a9b7872"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.731720 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/213894fc-d7f0-4fd6-9c48-83b91a9b7872-logs" (OuterVolumeSpecName: "logs") pod "213894fc-d7f0-4fd6-9c48-83b91a9b7872" (UID: "213894fc-d7f0-4fd6-9c48-83b91a9b7872"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.738071 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "213894fc-d7f0-4fd6-9c48-83b91a9b7872" (UID: "213894fc-d7f0-4fd6-9c48-83b91a9b7872"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.739534 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-scripts" (OuterVolumeSpecName: "scripts") pod "213894fc-d7f0-4fd6-9c48-83b91a9b7872" (UID: "213894fc-d7f0-4fd6-9c48-83b91a9b7872"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.745653 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/213894fc-d7f0-4fd6-9c48-83b91a9b7872-kube-api-access-xgkv2" (OuterVolumeSpecName: "kube-api-access-xgkv2") pod "213894fc-d7f0-4fd6-9c48-83b91a9b7872" (UID: "213894fc-d7f0-4fd6-9c48-83b91a9b7872"). InnerVolumeSpecName "kube-api-access-xgkv2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.828601 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-config-data" (OuterVolumeSpecName: "config-data") pod "213894fc-d7f0-4fd6-9c48-83b91a9b7872" (UID: "213894fc-d7f0-4fd6-9c48-83b91a9b7872"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.828962 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "213894fc-d7f0-4fd6-9c48-83b91a9b7872" (UID: "213894fc-d7f0-4fd6-9c48-83b91a9b7872"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.828999 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "213894fc-d7f0-4fd6-9c48-83b91a9b7872" (UID: "213894fc-d7f0-4fd6-9c48-83b91a9b7872"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.850111 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-combined-ca-bundle\") pod \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.850450 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-internal-tls-certs\") pod \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\" (UID: \"213894fc-d7f0-4fd6-9c48-83b91a9b7872\") " Dec 02 18:38:55 crc kubenswrapper[4878]: W1202 18:38:55.851381 4878 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/213894fc-d7f0-4fd6-9c48-83b91a9b7872/volumes/kubernetes.io~secret/combined-ca-bundle Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.851416 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "213894fc-d7f0-4fd6-9c48-83b91a9b7872" (UID: "213894fc-d7f0-4fd6-9c48-83b91a9b7872"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:55 crc kubenswrapper[4878]: W1202 18:38:55.851533 4878 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/213894fc-d7f0-4fd6-9c48-83b91a9b7872/volumes/kubernetes.io~secret/internal-tls-certs Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.851558 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "213894fc-d7f0-4fd6-9c48-83b91a9b7872" (UID: "213894fc-d7f0-4fd6-9c48-83b91a9b7872"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.851873 4878 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.851910 4878 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.851924 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.851939 4878 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/213894fc-d7f0-4fd6-9c48-83b91a9b7872-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.851951 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.851963 4878 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/213894fc-d7f0-4fd6-9c48-83b91a9b7872-logs\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.851977 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgkv2\" (UniqueName: \"kubernetes.io/projected/213894fc-d7f0-4fd6-9c48-83b91a9b7872-kube-api-access-xgkv2\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.851990 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/213894fc-d7f0-4fd6-9c48-83b91a9b7872-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.913143 4878 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.955026 4878 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.980538 4878 generic.go:334] "Generic (PLEG): container finished" podID="213894fc-d7f0-4fd6-9c48-83b91a9b7872" containerID="2cc03287f80e41bc214a8dfa1431602a67e1e5b9e390c57def58e942b9732ed0" exitCode=0 Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.980633 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"213894fc-d7f0-4fd6-9c48-83b91a9b7872","Type":"ContainerDied","Data":"2cc03287f80e41bc214a8dfa1431602a67e1e5b9e390c57def58e942b9732ed0"} Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 
18:38:55.980673 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"213894fc-d7f0-4fd6-9c48-83b91a9b7872","Type":"ContainerDied","Data":"d672f4df5eb8415ce5abde765082b8f64ae7302ce3cdfbe036d952df66a342ae"} Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.980679 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.980699 4878 scope.go:117] "RemoveContainer" containerID="2cc03287f80e41bc214a8dfa1431602a67e1e5b9e390c57def58e942b9732ed0" Dec 02 18:38:55 crc kubenswrapper[4878]: I1202 18:38:55.988276 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ea05342-9ff1-4890-91c2-8ca68502816d","Type":"ContainerStarted","Data":"ca3b6f0f509a25979f66c5dd3200a425488baecc8bf4095a3d269ff9af605df9"} Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.016506 4878 scope.go:117] "RemoveContainer" containerID="3e9fa0c715b7de0418b77e8b9910cebd2ba5173155f015d58b3ffe66bc448d5c" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.054297 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.092632 4878 scope.go:117] "RemoveContainer" containerID="2cc03287f80e41bc214a8dfa1431602a67e1e5b9e390c57def58e942b9732ed0" Dec 02 18:38:56 crc kubenswrapper[4878]: E1202 18:38:56.093799 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cc03287f80e41bc214a8dfa1431602a67e1e5b9e390c57def58e942b9732ed0\": container with ID starting with 2cc03287f80e41bc214a8dfa1431602a67e1e5b9e390c57def58e942b9732ed0 not found: ID does not exist" containerID="2cc03287f80e41bc214a8dfa1431602a67e1e5b9e390c57def58e942b9732ed0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.093921 4878 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cc03287f80e41bc214a8dfa1431602a67e1e5b9e390c57def58e942b9732ed0"} err="failed to get container status \"2cc03287f80e41bc214a8dfa1431602a67e1e5b9e390c57def58e942b9732ed0\": rpc error: code = NotFound desc = could not find container \"2cc03287f80e41bc214a8dfa1431602a67e1e5b9e390c57def58e942b9732ed0\": container with ID starting with 2cc03287f80e41bc214a8dfa1431602a67e1e5b9e390c57def58e942b9732ed0 not found: ID does not exist" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.093965 4878 scope.go:117] "RemoveContainer" containerID="3e9fa0c715b7de0418b77e8b9910cebd2ba5173155f015d58b3ffe66bc448d5c" Dec 02 18:38:56 crc kubenswrapper[4878]: E1202 18:38:56.094511 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e9fa0c715b7de0418b77e8b9910cebd2ba5173155f015d58b3ffe66bc448d5c\": container with ID starting with 3e9fa0c715b7de0418b77e8b9910cebd2ba5173155f015d58b3ffe66bc448d5c not found: ID does not exist" containerID="3e9fa0c715b7de0418b77e8b9910cebd2ba5173155f015d58b3ffe66bc448d5c" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.094533 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e9fa0c715b7de0418b77e8b9910cebd2ba5173155f015d58b3ffe66bc448d5c"} err="failed to get container status \"3e9fa0c715b7de0418b77e8b9910cebd2ba5173155f015d58b3ffe66bc448d5c\": rpc error: code = NotFound desc = could not find container \"3e9fa0c715b7de0418b77e8b9910cebd2ba5173155f015d58b3ffe66bc448d5c\": container with ID starting with 3e9fa0c715b7de0418b77e8b9910cebd2ba5173155f015d58b3ffe66bc448d5c not found: ID does not exist" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.100618 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.142621 4878 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 18:38:56 crc kubenswrapper[4878]: E1202 18:38:56.143694 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="213894fc-d7f0-4fd6-9c48-83b91a9b7872" containerName="glance-httpd" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.143723 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="213894fc-d7f0-4fd6-9c48-83b91a9b7872" containerName="glance-httpd" Dec 02 18:38:56 crc kubenswrapper[4878]: E1202 18:38:56.143795 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="213894fc-d7f0-4fd6-9c48-83b91a9b7872" containerName="glance-log" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.143805 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="213894fc-d7f0-4fd6-9c48-83b91a9b7872" containerName="glance-log" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.144795 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="213894fc-d7f0-4fd6-9c48-83b91a9b7872" containerName="glance-httpd" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.144821 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="213894fc-d7f0-4fd6-9c48-83b91a9b7872" containerName="glance-log" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.149093 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.154803 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.161420 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.161552 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.270260 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46prq\" (UniqueName: \"kubernetes.io/projected/59f6212e-501f-4a58-8d24-8f79f95dc992-kube-api-access-46prq\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.270387 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f6212e-501f-4a58-8d24-8f79f95dc992-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.270426 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.270488 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/59f6212e-501f-4a58-8d24-8f79f95dc992-scripts\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.270556 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f6212e-501f-4a58-8d24-8f79f95dc992-logs\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.270657 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f6212e-501f-4a58-8d24-8f79f95dc992-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.270711 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59f6212e-501f-4a58-8d24-8f79f95dc992-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.270887 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f6212e-501f-4a58-8d24-8f79f95dc992-config-data\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.373932 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/59f6212e-501f-4a58-8d24-8f79f95dc992-config-data\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.374449 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46prq\" (UniqueName: \"kubernetes.io/projected/59f6212e-501f-4a58-8d24-8f79f95dc992-kube-api-access-46prq\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.374587 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f6212e-501f-4a58-8d24-8f79f95dc992-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.374672 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.375067 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.387689 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/59f6212e-501f-4a58-8d24-8f79f95dc992-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.404948 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59f6212e-501f-4a58-8d24-8f79f95dc992-scripts\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.405379 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f6212e-501f-4a58-8d24-8f79f95dc992-logs\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.405649 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f6212e-501f-4a58-8d24-8f79f95dc992-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.405808 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59f6212e-501f-4a58-8d24-8f79f95dc992-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.406458 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59f6212e-501f-4a58-8d24-8f79f95dc992-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.407737 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f6212e-501f-4a58-8d24-8f79f95dc992-config-data\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.408100 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f6212e-501f-4a58-8d24-8f79f95dc992-logs\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.416949 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59f6212e-501f-4a58-8d24-8f79f95dc992-scripts\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.421475 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46prq\" (UniqueName: \"kubernetes.io/projected/59f6212e-501f-4a58-8d24-8f79f95dc992-kube-api-access-46prq\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.428157 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f6212e-501f-4a58-8d24-8f79f95dc992-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " 
pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.454647 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"59f6212e-501f-4a58-8d24-8f79f95dc992\") " pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.482065 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.783757 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.969006 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="213894fc-d7f0-4fd6-9c48-83b91a9b7872" path="/var/lib/kubelet/pods/213894fc-d7f0-4fd6-9c48-83b91a9b7872/volumes" Dec 02 18:38:56 crc kubenswrapper[4878]: I1202 18:38:56.970339 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4cdcc03-9890-4704-a31b-e8f8858140a5" path="/var/lib/kubelet/pods/e4cdcc03-9890-4704-a31b-e8f8858140a5/volumes" Dec 02 18:38:57 crc kubenswrapper[4878]: I1202 18:38:57.024182 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dc42b137-3b4d-4673-8f42-e1fd55534c16","Type":"ContainerStarted","Data":"8f10bbd17f5f7bc6cba375b842886f4c279e998efef27c74a3f297386ebaba7f"} Dec 02 18:38:57 crc kubenswrapper[4878]: I1202 18:38:57.035873 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ea05342-9ff1-4890-91c2-8ca68502816d","Type":"ContainerStarted","Data":"1a5c679d7c4b99ed2be9d4c7b8d5ce4065411b807e1154c73d224619da6c4f36"} Dec 02 18:38:57 crc kubenswrapper[4878]: I1202 18:38:57.479611 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Dec 02 18:38:58 crc kubenswrapper[4878]: I1202 18:38:58.080527 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dc42b137-3b4d-4673-8f42-e1fd55534c16","Type":"ContainerStarted","Data":"b23b10247f5b240af2816ffa60934ecde9566ba245d7e9c3274181e4b6059f3e"} Dec 02 18:38:58 crc kubenswrapper[4878]: I1202 18:38:58.085326 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"59f6212e-501f-4a58-8d24-8f79f95dc992","Type":"ContainerStarted","Data":"84ff2da350c274a0c0f2afb436078e6f41ca278fcdddf789a4cd6fe7730feb76"} Dec 02 18:38:58 crc kubenswrapper[4878]: I1202 18:38:58.091845 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ea05342-9ff1-4890-91c2-8ca68502816d","Type":"ContainerStarted","Data":"28035db2e638b03df0839122250e1679c0fb85eb591e3c7426c594ac0ba68a3a"} Dec 02 18:38:59 crc kubenswrapper[4878]: I1202 18:38:59.109662 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dc42b137-3b4d-4673-8f42-e1fd55534c16","Type":"ContainerStarted","Data":"d9d15134ea68fd8951de278d5743821b138d0a1452f69602770f9b00b086b3d5"} Dec 02 18:38:59 crc kubenswrapper[4878]: I1202 18:38:59.112300 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"59f6212e-501f-4a58-8d24-8f79f95dc992","Type":"ContainerStarted","Data":"617917436f50ec1e54d2b96f9c7a05918bb6fe3e2b32532e6973216260bc325e"} Dec 02 18:38:59 crc kubenswrapper[4878]: I1202 18:38:59.112366 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"59f6212e-501f-4a58-8d24-8f79f95dc992","Type":"ContainerStarted","Data":"78e572e73673374d0fe97e509b2fe4f9f85f44f9e03922f0ba8c77e4989b9fc7"} Dec 02 18:38:59 crc kubenswrapper[4878]: I1202 18:38:59.118040 4878 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ea05342-9ff1-4890-91c2-8ca68502816d","Type":"ContainerStarted","Data":"9fa544bb0c693b2383d85f2095276439c45d6bd225440b4e7400cd6de265e2dc"} Dec 02 18:38:59 crc kubenswrapper[4878]: I1202 18:38:59.135946 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.135923507 podStartE2EDuration="4.135923507s" podCreationTimestamp="2025-12-02 18:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:38:59.1328184 +0000 UTC m=+1448.822437281" watchObservedRunningTime="2025-12-02 18:38:59.135923507 +0000 UTC m=+1448.825542388" Dec 02 18:38:59 crc kubenswrapper[4878]: I1202 18:38:59.160848 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.160821148 podStartE2EDuration="3.160821148s" podCreationTimestamp="2025-12-02 18:38:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:38:59.156903945 +0000 UTC m=+1448.846522836" watchObservedRunningTime="2025-12-02 18:38:59.160821148 +0000 UTC m=+1448.850440039" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.025646 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-64fc868fd9-8kp5j"] Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.028220 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-64fc868fd9-8kp5j" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.030072 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.030380 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-v925n" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.030622 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.063529 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-64fc868fd9-8kp5j"] Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.132484 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1991d0-7abb-495c-acb9-682829e20961-combined-ca-bundle\") pod \"heat-engine-64fc868fd9-8kp5j\" (UID: \"4e1991d0-7abb-495c-acb9-682829e20961\") " pod="openstack/heat-engine-64fc868fd9-8kp5j" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.132537 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e1991d0-7abb-495c-acb9-682829e20961-config-data-custom\") pod \"heat-engine-64fc868fd9-8kp5j\" (UID: \"4e1991d0-7abb-495c-acb9-682829e20961\") " pod="openstack/heat-engine-64fc868fd9-8kp5j" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.132587 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn9rh\" (UniqueName: \"kubernetes.io/projected/4e1991d0-7abb-495c-acb9-682829e20961-kube-api-access-fn9rh\") pod \"heat-engine-64fc868fd9-8kp5j\" (UID: \"4e1991d0-7abb-495c-acb9-682829e20961\") " pod="openstack/heat-engine-64fc868fd9-8kp5j" Dec 02 18:39:00 
crc kubenswrapper[4878]: I1202 18:39:00.132618 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e1991d0-7abb-495c-acb9-682829e20961-config-data\") pod \"heat-engine-64fc868fd9-8kp5j\" (UID: \"4e1991d0-7abb-495c-acb9-682829e20961\") " pod="openstack/heat-engine-64fc868fd9-8kp5j" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.199883 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5db8f85467-fpfmd"] Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.202349 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.213378 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5db8f85467-fpfmd"] Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.235515 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1991d0-7abb-495c-acb9-682829e20961-combined-ca-bundle\") pod \"heat-engine-64fc868fd9-8kp5j\" (UID: \"4e1991d0-7abb-495c-acb9-682829e20961\") " pod="openstack/heat-engine-64fc868fd9-8kp5j" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.235582 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e1991d0-7abb-495c-acb9-682829e20961-config-data-custom\") pod \"heat-engine-64fc868fd9-8kp5j\" (UID: \"4e1991d0-7abb-495c-acb9-682829e20961\") " pod="openstack/heat-engine-64fc868fd9-8kp5j" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.235640 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn9rh\" (UniqueName: \"kubernetes.io/projected/4e1991d0-7abb-495c-acb9-682829e20961-kube-api-access-fn9rh\") pod \"heat-engine-64fc868fd9-8kp5j\" (UID: 
\"4e1991d0-7abb-495c-acb9-682829e20961\") " pod="openstack/heat-engine-64fc868fd9-8kp5j" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.235671 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e1991d0-7abb-495c-acb9-682829e20961-config-data\") pod \"heat-engine-64fc868fd9-8kp5j\" (UID: \"4e1991d0-7abb-495c-acb9-682829e20961\") " pod="openstack/heat-engine-64fc868fd9-8kp5j" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.246593 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-68f754568d-brrc4"] Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.256918 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e1991d0-7abb-495c-acb9-682829e20961-config-data-custom\") pod \"heat-engine-64fc868fd9-8kp5j\" (UID: \"4e1991d0-7abb-495c-acb9-682829e20961\") " pod="openstack/heat-engine-64fc868fd9-8kp5j" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.257514 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-68f754568d-brrc4" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.265857 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1991d0-7abb-495c-acb9-682829e20961-combined-ca-bundle\") pod \"heat-engine-64fc868fd9-8kp5j\" (UID: \"4e1991d0-7abb-495c-acb9-682829e20961\") " pod="openstack/heat-engine-64fc868fd9-8kp5j" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.272701 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.275334 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e1991d0-7abb-495c-acb9-682829e20961-config-data\") pod \"heat-engine-64fc868fd9-8kp5j\" (UID: \"4e1991d0-7abb-495c-acb9-682829e20961\") " pod="openstack/heat-engine-64fc868fd9-8kp5j" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.280479 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn9rh\" (UniqueName: \"kubernetes.io/projected/4e1991d0-7abb-495c-acb9-682829e20961-kube-api-access-fn9rh\") pod \"heat-engine-64fc868fd9-8kp5j\" (UID: \"4e1991d0-7abb-495c-acb9-682829e20961\") " pod="openstack/heat-engine-64fc868fd9-8kp5j" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.281058 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-68f754568d-brrc4"] Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.419532 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-64fc868fd9-8kp5j" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.419889 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw258\" (UniqueName: \"kubernetes.io/projected/843928c8-a35c-43b5-a1e6-88199ac743cc-kube-api-access-qw258\") pod \"dnsmasq-dns-5db8f85467-fpfmd\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.419956 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-dns-swift-storage-0\") pod \"dnsmasq-dns-5db8f85467-fpfmd\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.420021 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-dns-svc\") pod \"dnsmasq-dns-5db8f85467-fpfmd\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.420093 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-ovsdbserver-sb\") pod \"dnsmasq-dns-5db8f85467-fpfmd\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.420118 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5db8f85467-fpfmd\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.420180 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/144b91a5-fb67-4975-948b-33fc4d95a6ff-combined-ca-bundle\") pod \"heat-cfnapi-68f754568d-brrc4\" (UID: \"144b91a5-fb67-4975-948b-33fc4d95a6ff\") " pod="openstack/heat-cfnapi-68f754568d-brrc4" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.420219 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc5pt\" (UniqueName: \"kubernetes.io/projected/144b91a5-fb67-4975-948b-33fc4d95a6ff-kube-api-access-hc5pt\") pod \"heat-cfnapi-68f754568d-brrc4\" (UID: \"144b91a5-fb67-4975-948b-33fc4d95a6ff\") " pod="openstack/heat-cfnapi-68f754568d-brrc4" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.420273 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/144b91a5-fb67-4975-948b-33fc4d95a6ff-config-data\") pod \"heat-cfnapi-68f754568d-brrc4\" (UID: \"144b91a5-fb67-4975-948b-33fc4d95a6ff\") " pod="openstack/heat-cfnapi-68f754568d-brrc4" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.420303 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-config\") pod \"dnsmasq-dns-5db8f85467-fpfmd\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.420394 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/144b91a5-fb67-4975-948b-33fc4d95a6ff-config-data-custom\") pod \"heat-cfnapi-68f754568d-brrc4\" (UID: \"144b91a5-fb67-4975-948b-33fc4d95a6ff\") " pod="openstack/heat-cfnapi-68f754568d-brrc4" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.449213 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-676cccd477-rncv9"] Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.454025 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-676cccd477-rncv9" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.457572 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.463066 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-676cccd477-rncv9"] Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.521333 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-config\") pod \"dnsmasq-dns-5db8f85467-fpfmd\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.521666 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/725117cc-7ed9-47ca-aabe-1238f836ee93-config-data-custom\") pod \"heat-api-676cccd477-rncv9\" (UID: \"725117cc-7ed9-47ca-aabe-1238f836ee93\") " pod="openstack/heat-api-676cccd477-rncv9" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.521714 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/144b91a5-fb67-4975-948b-33fc4d95a6ff-config-data-custom\") pod \"heat-cfnapi-68f754568d-brrc4\" (UID: 
\"144b91a5-fb67-4975-948b-33fc4d95a6ff\") " pod="openstack/heat-cfnapi-68f754568d-brrc4" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.521755 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw258\" (UniqueName: \"kubernetes.io/projected/843928c8-a35c-43b5-a1e6-88199ac743cc-kube-api-access-qw258\") pod \"dnsmasq-dns-5db8f85467-fpfmd\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.521787 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-dns-swift-storage-0\") pod \"dnsmasq-dns-5db8f85467-fpfmd\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.521826 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/725117cc-7ed9-47ca-aabe-1238f836ee93-combined-ca-bundle\") pod \"heat-api-676cccd477-rncv9\" (UID: \"725117cc-7ed9-47ca-aabe-1238f836ee93\") " pod="openstack/heat-api-676cccd477-rncv9" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.521853 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-dns-svc\") pod \"dnsmasq-dns-5db8f85467-fpfmd\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.521908 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-ovsdbserver-sb\") pod \"dnsmasq-dns-5db8f85467-fpfmd\" (UID: 
\"843928c8-a35c-43b5-a1e6-88199ac743cc\") " pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.521926 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-ovsdbserver-nb\") pod \"dnsmasq-dns-5db8f85467-fpfmd\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.521943 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh9bq\" (UniqueName: \"kubernetes.io/projected/725117cc-7ed9-47ca-aabe-1238f836ee93-kube-api-access-gh9bq\") pod \"heat-api-676cccd477-rncv9\" (UID: \"725117cc-7ed9-47ca-aabe-1238f836ee93\") " pod="openstack/heat-api-676cccd477-rncv9" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.521991 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/144b91a5-fb67-4975-948b-33fc4d95a6ff-combined-ca-bundle\") pod \"heat-cfnapi-68f754568d-brrc4\" (UID: \"144b91a5-fb67-4975-948b-33fc4d95a6ff\") " pod="openstack/heat-cfnapi-68f754568d-brrc4" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.522013 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/725117cc-7ed9-47ca-aabe-1238f836ee93-config-data\") pod \"heat-api-676cccd477-rncv9\" (UID: \"725117cc-7ed9-47ca-aabe-1238f836ee93\") " pod="openstack/heat-api-676cccd477-rncv9" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.522039 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc5pt\" (UniqueName: \"kubernetes.io/projected/144b91a5-fb67-4975-948b-33fc4d95a6ff-kube-api-access-hc5pt\") pod \"heat-cfnapi-68f754568d-brrc4\" (UID: 
\"144b91a5-fb67-4975-948b-33fc4d95a6ff\") " pod="openstack/heat-cfnapi-68f754568d-brrc4" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.522067 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/144b91a5-fb67-4975-948b-33fc4d95a6ff-config-data\") pod \"heat-cfnapi-68f754568d-brrc4\" (UID: \"144b91a5-fb67-4975-948b-33fc4d95a6ff\") " pod="openstack/heat-cfnapi-68f754568d-brrc4" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.522643 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-config\") pod \"dnsmasq-dns-5db8f85467-fpfmd\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.523644 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-dns-svc\") pod \"dnsmasq-dns-5db8f85467-fpfmd\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.523692 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-dns-swift-storage-0\") pod \"dnsmasq-dns-5db8f85467-fpfmd\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.523817 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-ovsdbserver-sb\") pod \"dnsmasq-dns-5db8f85467-fpfmd\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:00 crc 
kubenswrapper[4878]: I1202 18:39:00.523852 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-ovsdbserver-nb\") pod \"dnsmasq-dns-5db8f85467-fpfmd\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.528204 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/144b91a5-fb67-4975-948b-33fc4d95a6ff-combined-ca-bundle\") pod \"heat-cfnapi-68f754568d-brrc4\" (UID: \"144b91a5-fb67-4975-948b-33fc4d95a6ff\") " pod="openstack/heat-cfnapi-68f754568d-brrc4" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.529096 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/144b91a5-fb67-4975-948b-33fc4d95a6ff-config-data-custom\") pod \"heat-cfnapi-68f754568d-brrc4\" (UID: \"144b91a5-fb67-4975-948b-33fc4d95a6ff\") " pod="openstack/heat-cfnapi-68f754568d-brrc4" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.533318 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/144b91a5-fb67-4975-948b-33fc4d95a6ff-config-data\") pod \"heat-cfnapi-68f754568d-brrc4\" (UID: \"144b91a5-fb67-4975-948b-33fc4d95a6ff\") " pod="openstack/heat-cfnapi-68f754568d-brrc4" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.547484 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw258\" (UniqueName: \"kubernetes.io/projected/843928c8-a35c-43b5-a1e6-88199ac743cc-kube-api-access-qw258\") pod \"dnsmasq-dns-5db8f85467-fpfmd\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.552214 4878 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hc5pt\" (UniqueName: \"kubernetes.io/projected/144b91a5-fb67-4975-948b-33fc4d95a6ff-kube-api-access-hc5pt\") pod \"heat-cfnapi-68f754568d-brrc4\" (UID: \"144b91a5-fb67-4975-948b-33fc4d95a6ff\") " pod="openstack/heat-cfnapi-68f754568d-brrc4" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.629041 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/725117cc-7ed9-47ca-aabe-1238f836ee93-combined-ca-bundle\") pod \"heat-api-676cccd477-rncv9\" (UID: \"725117cc-7ed9-47ca-aabe-1238f836ee93\") " pod="openstack/heat-api-676cccd477-rncv9" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.629212 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh9bq\" (UniqueName: \"kubernetes.io/projected/725117cc-7ed9-47ca-aabe-1238f836ee93-kube-api-access-gh9bq\") pod \"heat-api-676cccd477-rncv9\" (UID: \"725117cc-7ed9-47ca-aabe-1238f836ee93\") " pod="openstack/heat-api-676cccd477-rncv9" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.629299 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/725117cc-7ed9-47ca-aabe-1238f836ee93-config-data\") pod \"heat-api-676cccd477-rncv9\" (UID: \"725117cc-7ed9-47ca-aabe-1238f836ee93\") " pod="openstack/heat-api-676cccd477-rncv9" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.629440 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/725117cc-7ed9-47ca-aabe-1238f836ee93-config-data-custom\") pod \"heat-api-676cccd477-rncv9\" (UID: \"725117cc-7ed9-47ca-aabe-1238f836ee93\") " pod="openstack/heat-api-676cccd477-rncv9" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.637099 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/725117cc-7ed9-47ca-aabe-1238f836ee93-config-data-custom\") pod \"heat-api-676cccd477-rncv9\" (UID: \"725117cc-7ed9-47ca-aabe-1238f836ee93\") " pod="openstack/heat-api-676cccd477-rncv9" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.639653 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/725117cc-7ed9-47ca-aabe-1238f836ee93-config-data\") pod \"heat-api-676cccd477-rncv9\" (UID: \"725117cc-7ed9-47ca-aabe-1238f836ee93\") " pod="openstack/heat-api-676cccd477-rncv9" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.641920 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/725117cc-7ed9-47ca-aabe-1238f836ee93-combined-ca-bundle\") pod \"heat-api-676cccd477-rncv9\" (UID: \"725117cc-7ed9-47ca-aabe-1238f836ee93\") " pod="openstack/heat-api-676cccd477-rncv9" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.659413 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh9bq\" (UniqueName: \"kubernetes.io/projected/725117cc-7ed9-47ca-aabe-1238f836ee93-kube-api-access-gh9bq\") pod \"heat-api-676cccd477-rncv9\" (UID: \"725117cc-7ed9-47ca-aabe-1238f836ee93\") " pod="openstack/heat-api-676cccd477-rncv9" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.814111 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-68f754568d-brrc4" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.835496 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-676cccd477-rncv9" Dec 02 18:39:00 crc kubenswrapper[4878]: I1202 18:39:00.847041 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:01 crc kubenswrapper[4878]: I1202 18:39:01.082129 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-64fc868fd9-8kp5j"] Dec 02 18:39:01 crc kubenswrapper[4878]: I1202 18:39:01.196404 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ea05342-9ff1-4890-91c2-8ca68502816d","Type":"ContainerStarted","Data":"a05ef375f6e7ba1695de0b850ff826d5df295ad131a855ac7caeb2ef972fb5b1"} Dec 02 18:39:01 crc kubenswrapper[4878]: I1202 18:39:01.196626 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ea05342-9ff1-4890-91c2-8ca68502816d" containerName="ceilometer-central-agent" containerID="cri-o://1a5c679d7c4b99ed2be9d4c7b8d5ce4065411b807e1154c73d224619da6c4f36" gracePeriod=30 Dec 02 18:39:01 crc kubenswrapper[4878]: I1202 18:39:01.196977 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 18:39:01 crc kubenswrapper[4878]: I1202 18:39:01.197364 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ea05342-9ff1-4890-91c2-8ca68502816d" containerName="proxy-httpd" containerID="cri-o://a05ef375f6e7ba1695de0b850ff826d5df295ad131a855ac7caeb2ef972fb5b1" gracePeriod=30 Dec 02 18:39:01 crc kubenswrapper[4878]: I1202 18:39:01.197413 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ea05342-9ff1-4890-91c2-8ca68502816d" containerName="sg-core" containerID="cri-o://9fa544bb0c693b2383d85f2095276439c45d6bd225440b4e7400cd6de265e2dc" gracePeriod=30 Dec 02 18:39:01 crc kubenswrapper[4878]: I1202 18:39:01.197450 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ea05342-9ff1-4890-91c2-8ca68502816d" containerName="ceilometer-notification-agent" 
containerID="cri-o://28035db2e638b03df0839122250e1679c0fb85eb591e3c7426c594ac0ba68a3a" gracePeriod=30 Dec 02 18:39:01 crc kubenswrapper[4878]: I1202 18:39:01.253300 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.616120762 podStartE2EDuration="7.253274016s" podCreationTimestamp="2025-12-02 18:38:54 +0000 UTC" firstStartedPulling="2025-12-02 18:38:55.184655926 +0000 UTC m=+1444.874274807" lastFinishedPulling="2025-12-02 18:38:59.82180915 +0000 UTC m=+1449.511428061" observedRunningTime="2025-12-02 18:39:01.242730676 +0000 UTC m=+1450.932349547" watchObservedRunningTime="2025-12-02 18:39:01.253274016 +0000 UTC m=+1450.942892897" Dec 02 18:39:02 crc kubenswrapper[4878]: E1202 18:39:02.002321 4878 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ea05342_9ff1_4890_91c2_8ca68502816d.slice/crio-conmon-9fa544bb0c693b2383d85f2095276439c45d6bd225440b4e7400cd6de265e2dc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ea05342_9ff1_4890_91c2_8ca68502816d.slice/crio-28035db2e638b03df0839122250e1679c0fb85eb591e3c7426c594ac0ba68a3a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ea05342_9ff1_4890_91c2_8ca68502816d.slice/crio-conmon-a05ef375f6e7ba1695de0b850ff826d5df295ad131a855ac7caeb2ef972fb5b1.scope\": RecentStats: unable to find data in memory cache]" Dec 02 18:39:02 crc kubenswrapper[4878]: I1202 18:39:02.286368 4878 generic.go:334] "Generic (PLEG): container finished" podID="6ea05342-9ff1-4890-91c2-8ca68502816d" containerID="a05ef375f6e7ba1695de0b850ff826d5df295ad131a855ac7caeb2ef972fb5b1" exitCode=0 Dec 02 18:39:02 crc kubenswrapper[4878]: I1202 18:39:02.286709 4878 generic.go:334] "Generic (PLEG): container finished" 
podID="6ea05342-9ff1-4890-91c2-8ca68502816d" containerID="9fa544bb0c693b2383d85f2095276439c45d6bd225440b4e7400cd6de265e2dc" exitCode=2 Dec 02 18:39:02 crc kubenswrapper[4878]: I1202 18:39:02.286723 4878 generic.go:334] "Generic (PLEG): container finished" podID="6ea05342-9ff1-4890-91c2-8ca68502816d" containerID="28035db2e638b03df0839122250e1679c0fb85eb591e3c7426c594ac0ba68a3a" exitCode=0 Dec 02 18:39:02 crc kubenswrapper[4878]: I1202 18:39:02.286785 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ea05342-9ff1-4890-91c2-8ca68502816d","Type":"ContainerDied","Data":"a05ef375f6e7ba1695de0b850ff826d5df295ad131a855ac7caeb2ef972fb5b1"} Dec 02 18:39:02 crc kubenswrapper[4878]: I1202 18:39:02.286816 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ea05342-9ff1-4890-91c2-8ca68502816d","Type":"ContainerDied","Data":"9fa544bb0c693b2383d85f2095276439c45d6bd225440b4e7400cd6de265e2dc"} Dec 02 18:39:02 crc kubenswrapper[4878]: I1202 18:39:02.286827 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ea05342-9ff1-4890-91c2-8ca68502816d","Type":"ContainerDied","Data":"28035db2e638b03df0839122250e1679c0fb85eb591e3c7426c594ac0ba68a3a"} Dec 02 18:39:02 crc kubenswrapper[4878]: I1202 18:39:02.303625 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-64fc868fd9-8kp5j" event={"ID":"4e1991d0-7abb-495c-acb9-682829e20961","Type":"ContainerStarted","Data":"64a981433cdd0e5af987ac096bb28a8a33f54c20087b11294364f4d1e74cd125"} Dec 02 18:39:02 crc kubenswrapper[4878]: I1202 18:39:02.732358 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:39:02 crc kubenswrapper[4878]: I1202 18:39:02.745445 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-54fd7cfcc9-x4n56" Dec 02 18:39:03 crc kubenswrapper[4878]: 
I1202 18:39:03.058611 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5db8f85467-fpfmd"] Dec 02 18:39:03 crc kubenswrapper[4878]: I1202 18:39:03.177596 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-676cccd477-rncv9"] Dec 02 18:39:03 crc kubenswrapper[4878]: I1202 18:39:03.267893 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-68f754568d-brrc4"] Dec 02 18:39:03 crc kubenswrapper[4878]: I1202 18:39:03.357980 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-64fc868fd9-8kp5j" event={"ID":"4e1991d0-7abb-495c-acb9-682829e20961","Type":"ContainerStarted","Data":"d7cb3603bfaa2c396a9b5f8aa10e3b622c6faca9f6b9937c4704c4d334830cab"} Dec 02 18:39:03 crc kubenswrapper[4878]: I1202 18:39:03.359676 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-64fc868fd9-8kp5j" Dec 02 18:39:03 crc kubenswrapper[4878]: I1202 18:39:03.368403 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" event={"ID":"843928c8-a35c-43b5-a1e6-88199ac743cc","Type":"ContainerStarted","Data":"7eda2f73d2784113e40295ab0f4cf641e0f8ab3a28da8ee57b34eeca9a5b053e"} Dec 02 18:39:03 crc kubenswrapper[4878]: I1202 18:39:03.373764 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-68f754568d-brrc4" event={"ID":"144b91a5-fb67-4975-948b-33fc4d95a6ff","Type":"ContainerStarted","Data":"cc2cb7744bb249051e7a61fb858efbb758d97c89dfb22b9a309282eebe180ac5"} Dec 02 18:39:03 crc kubenswrapper[4878]: I1202 18:39:03.384669 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-676cccd477-rncv9" event={"ID":"725117cc-7ed9-47ca-aabe-1238f836ee93","Type":"ContainerStarted","Data":"37a13647711776e9f3a85654c7e75ecd3e6d946c826719d66f90a0673a305d32"} Dec 02 18:39:03 crc kubenswrapper[4878]: I1202 18:39:03.405268 4878 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/heat-engine-64fc868fd9-8kp5j" podStartSLOduration=4.405230363 podStartE2EDuration="4.405230363s" podCreationTimestamp="2025-12-02 18:38:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:39:03.389018824 +0000 UTC m=+1453.078637705" watchObservedRunningTime="2025-12-02 18:39:03.405230363 +0000 UTC m=+1453.094849244" Dec 02 18:39:03 crc kubenswrapper[4878]: E1202 18:39:03.705058 4878 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2d2b656_e51c_4de2_9dbc_1d8b6d316fa2.slice/crio-5cf7c9b881a9e1a08c29e24eed0a7e6cba03891b47943dc09a8d8e0ff333b1de.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ea05342_9ff1_4890_91c2_8ca68502816d.slice/crio-conmon-1a5c679d7c4b99ed2be9d4c7b8d5ce4065411b807e1154c73d224619da6c4f36.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ea05342_9ff1_4890_91c2_8ca68502816d.slice/crio-1a5c679d7c4b99ed2be9d4c7b8d5ce4065411b807e1154c73d224619da6c4f36.scope\": RecentStats: unable to find data in memory cache]" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.502507 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.508569 4878 generic.go:334] "Generic (PLEG): container finished" podID="6ea05342-9ff1-4890-91c2-8ca68502816d" containerID="1a5c679d7c4b99ed2be9d4c7b8d5ce4065411b807e1154c73d224619da6c4f36" exitCode=0 Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.508629 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ea05342-9ff1-4890-91c2-8ca68502816d","Type":"ContainerDied","Data":"1a5c679d7c4b99ed2be9d4c7b8d5ce4065411b807e1154c73d224619da6c4f36"} Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.508657 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ea05342-9ff1-4890-91c2-8ca68502816d","Type":"ContainerDied","Data":"ca3b6f0f509a25979f66c5dd3200a425488baecc8bf4095a3d269ff9af605df9"} Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.508669 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca3b6f0f509a25979f66c5dd3200a425488baecc8bf4095a3d269ff9af605df9" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.514525 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.515076 4878 generic.go:334] "Generic (PLEG): container finished" podID="e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2" containerID="5cf7c9b881a9e1a08c29e24eed0a7e6cba03891b47943dc09a8d8e0ff333b1de" exitCode=137 Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.515130 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2","Type":"ContainerDied","Data":"5cf7c9b881a9e1a08c29e24eed0a7e6cba03891b47943dc09a8d8e0ff333b1de"} Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.515152 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2","Type":"ContainerDied","Data":"79a7f02e3615c8aa5d12d13f35e60bd33dc9fe7fc65bec3de73dc4c1aa689470"} Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.515169 4878 scope.go:117] "RemoveContainer" containerID="5cf7c9b881a9e1a08c29e24eed0a7e6cba03891b47943dc09a8d8e0ff333b1de" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.515320 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.538131 4878 generic.go:334] "Generic (PLEG): container finished" podID="843928c8-a35c-43b5-a1e6-88199ac743cc" containerID="7a28009b49985ef5e625fae9c33c7c1cb36fc9bee9a84bd5558e87d300441382" exitCode=0 Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.538417 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" event={"ID":"843928c8-a35c-43b5-a1e6-88199ac743cc","Type":"ContainerDied","Data":"7a28009b49985ef5e625fae9c33c7c1cb36fc9bee9a84bd5558e87d300441382"} Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.642159 4878 scope.go:117] "RemoveContainer" containerID="012af46cb4cf80ac01f3246be1ef951fca29ae8b43af2d1d0d615ba1ce18258b" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.681040 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-config-data-custom\") pod \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.681109 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ea05342-9ff1-4890-91c2-8ca68502816d-run-httpd\") pod \"6ea05342-9ff1-4890-91c2-8ca68502816d\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.681151 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-scripts\") pod \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.681213 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-config-data\") pod \"6ea05342-9ff1-4890-91c2-8ca68502816d\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.681329 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k86xg\" (UniqueName: \"kubernetes.io/projected/6ea05342-9ff1-4890-91c2-8ca68502816d-kube-api-access-k86xg\") pod \"6ea05342-9ff1-4890-91c2-8ca68502816d\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.681388 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-config-data\") pod \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.681403 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-combined-ca-bundle\") pod \"6ea05342-9ff1-4890-91c2-8ca68502816d\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.681434 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-sg-core-conf-yaml\") pod \"6ea05342-9ff1-4890-91c2-8ca68502816d\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.681654 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-scripts\") pod \"6ea05342-9ff1-4890-91c2-8ca68502816d\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " Dec 02 18:39:04 crc 
kubenswrapper[4878]: I1202 18:39:04.681683 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-logs\") pod \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.681718 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk75j\" (UniqueName: \"kubernetes.io/projected/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-kube-api-access-zk75j\") pod \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.681748 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-combined-ca-bundle\") pod \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.681770 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-etc-machine-id\") pod \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\" (UID: \"e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2\") " Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.681801 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ea05342-9ff1-4890-91c2-8ca68502816d-log-httpd\") pod \"6ea05342-9ff1-4890-91c2-8ca68502816d\" (UID: \"6ea05342-9ff1-4890-91c2-8ca68502816d\") " Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.687596 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") 
pod "e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2" (UID: "e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.688585 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ea05342-9ff1-4890-91c2-8ca68502816d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6ea05342-9ff1-4890-91c2-8ca68502816d" (UID: "6ea05342-9ff1-4890-91c2-8ca68502816d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.689088 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2" (UID: "e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.697223 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-logs" (OuterVolumeSpecName: "logs") pod "e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2" (UID: "e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.703790 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ea05342-9ff1-4890-91c2-8ca68502816d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6ea05342-9ff1-4890-91c2-8ca68502816d" (UID: "6ea05342-9ff1-4890-91c2-8ca68502816d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.716003 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-scripts" (OuterVolumeSpecName: "scripts") pod "6ea05342-9ff1-4890-91c2-8ca68502816d" (UID: "6ea05342-9ff1-4890-91c2-8ca68502816d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.721521 4878 scope.go:117] "RemoveContainer" containerID="5cf7c9b881a9e1a08c29e24eed0a7e6cba03891b47943dc09a8d8e0ff333b1de" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.722053 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea05342-9ff1-4890-91c2-8ca68502816d-kube-api-access-k86xg" (OuterVolumeSpecName: "kube-api-access-k86xg") pod "6ea05342-9ff1-4890-91c2-8ca68502816d" (UID: "6ea05342-9ff1-4890-91c2-8ca68502816d"). InnerVolumeSpecName "kube-api-access-k86xg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:39:04 crc kubenswrapper[4878]: E1202 18:39:04.722821 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cf7c9b881a9e1a08c29e24eed0a7e6cba03891b47943dc09a8d8e0ff333b1de\": container with ID starting with 5cf7c9b881a9e1a08c29e24eed0a7e6cba03891b47943dc09a8d8e0ff333b1de not found: ID does not exist" containerID="5cf7c9b881a9e1a08c29e24eed0a7e6cba03891b47943dc09a8d8e0ff333b1de" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.722862 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cf7c9b881a9e1a08c29e24eed0a7e6cba03891b47943dc09a8d8e0ff333b1de"} err="failed to get container status \"5cf7c9b881a9e1a08c29e24eed0a7e6cba03891b47943dc09a8d8e0ff333b1de\": rpc error: code = NotFound desc = could not find container \"5cf7c9b881a9e1a08c29e24eed0a7e6cba03891b47943dc09a8d8e0ff333b1de\": container with ID starting with 5cf7c9b881a9e1a08c29e24eed0a7e6cba03891b47943dc09a8d8e0ff333b1de not found: ID does not exist" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.722888 4878 scope.go:117] "RemoveContainer" containerID="012af46cb4cf80ac01f3246be1ef951fca29ae8b43af2d1d0d615ba1ce18258b" Dec 02 18:39:04 crc kubenswrapper[4878]: E1202 18:39:04.730605 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"012af46cb4cf80ac01f3246be1ef951fca29ae8b43af2d1d0d615ba1ce18258b\": container with ID starting with 012af46cb4cf80ac01f3246be1ef951fca29ae8b43af2d1d0d615ba1ce18258b not found: ID does not exist" containerID="012af46cb4cf80ac01f3246be1ef951fca29ae8b43af2d1d0d615ba1ce18258b" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.730654 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"012af46cb4cf80ac01f3246be1ef951fca29ae8b43af2d1d0d615ba1ce18258b"} 
err="failed to get container status \"012af46cb4cf80ac01f3246be1ef951fca29ae8b43af2d1d0d615ba1ce18258b\": rpc error: code = NotFound desc = could not find container \"012af46cb4cf80ac01f3246be1ef951fca29ae8b43af2d1d0d615ba1ce18258b\": container with ID starting with 012af46cb4cf80ac01f3246be1ef951fca29ae8b43af2d1d0d615ba1ce18258b not found: ID does not exist" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.752475 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2" (UID: "e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.777367 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-kube-api-access-zk75j" (OuterVolumeSpecName: "kube-api-access-zk75j") pod "e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2" (UID: "e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2"). InnerVolumeSpecName "kube-api-access-zk75j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.788872 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.788930 4878 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-logs\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.788944 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk75j\" (UniqueName: \"kubernetes.io/projected/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-kube-api-access-zk75j\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.788962 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.790775 4878 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.790804 4878 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ea05342-9ff1-4890-91c2-8ca68502816d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.790819 4878 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.790834 4878 
reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ea05342-9ff1-4890-91c2-8ca68502816d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.790846 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k86xg\" (UniqueName: \"kubernetes.io/projected/6ea05342-9ff1-4890-91c2-8ca68502816d-kube-api-access-k86xg\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.793355 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-scripts" (OuterVolumeSpecName: "scripts") pod "e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2" (UID: "e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.801008 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-config-data" (OuterVolumeSpecName: "config-data") pod "e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2" (UID: "e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.840333 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6ea05342-9ff1-4890-91c2-8ca68502816d" (UID: "6ea05342-9ff1-4890-91c2-8ca68502816d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.893580 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.893622 4878 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.893637 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.898630 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.927358 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.931465 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ea05342-9ff1-4890-91c2-8ca68502816d" (UID: "6ea05342-9ff1-4890-91c2-8ca68502816d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.957969 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2" path="/var/lib/kubelet/pods/e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2/volumes" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.962741 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 02 18:39:04 crc kubenswrapper[4878]: E1202 18:39:04.963327 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea05342-9ff1-4890-91c2-8ca68502816d" containerName="sg-core" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.963348 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea05342-9ff1-4890-91c2-8ca68502816d" containerName="sg-core" Dec 02 18:39:04 crc kubenswrapper[4878]: E1202 18:39:04.963368 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2" containerName="cinder-api" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.963375 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2" containerName="cinder-api" Dec 02 18:39:04 crc kubenswrapper[4878]: E1202 18:39:04.963423 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea05342-9ff1-4890-91c2-8ca68502816d" containerName="proxy-httpd" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.963430 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea05342-9ff1-4890-91c2-8ca68502816d" containerName="proxy-httpd" Dec 02 18:39:04 crc kubenswrapper[4878]: E1202 18:39:04.963464 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea05342-9ff1-4890-91c2-8ca68502816d" containerName="ceilometer-central-agent" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.963508 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea05342-9ff1-4890-91c2-8ca68502816d" 
containerName="ceilometer-central-agent" Dec 02 18:39:04 crc kubenswrapper[4878]: E1202 18:39:04.963526 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2" containerName="cinder-api-log" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.963532 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2" containerName="cinder-api-log" Dec 02 18:39:04 crc kubenswrapper[4878]: E1202 18:39:04.963575 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea05342-9ff1-4890-91c2-8ca68502816d" containerName="ceilometer-notification-agent" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.963583 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea05342-9ff1-4890-91c2-8ca68502816d" containerName="ceilometer-notification-agent" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.963908 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ea05342-9ff1-4890-91c2-8ca68502816d" containerName="proxy-httpd" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.963937 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ea05342-9ff1-4890-91c2-8ca68502816d" containerName="sg-core" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.963947 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ea05342-9ff1-4890-91c2-8ca68502816d" containerName="ceilometer-notification-agent" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.963978 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2" containerName="cinder-api" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.963988 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2d2b656-e51c-4de2-9dbc-1d8b6d316fa2" containerName="cinder-api-log" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.964000 4878 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6ea05342-9ff1-4890-91c2-8ca68502816d" containerName="ceilometer-central-agent" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.966010 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.977933 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.978292 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.978092 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 02 18:39:04 crc kubenswrapper[4878]: I1202 18:39:04.999497 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.011163 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.055166 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-config-data" (OuterVolumeSpecName: "config-data") pod "6ea05342-9ff1-4890-91c2-8ca68502816d" (UID: "6ea05342-9ff1-4890-91c2-8ca68502816d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.106211 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60d7f991-50e6-47fa-8b4b-137022c03671-config-data\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.107893 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60d7f991-50e6-47fa-8b4b-137022c03671-public-tls-certs\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.108089 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60d7f991-50e6-47fa-8b4b-137022c03671-etc-machine-id\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.108147 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60d7f991-50e6-47fa-8b4b-137022c03671-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.108417 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d7f991-50e6-47fa-8b4b-137022c03671-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.108615 
4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60d7f991-50e6-47fa-8b4b-137022c03671-config-data-custom\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.108677 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60d7f991-50e6-47fa-8b4b-137022c03671-logs\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.108741 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60d7f991-50e6-47fa-8b4b-137022c03671-scripts\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.108778 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddkq5\" (UniqueName: \"kubernetes.io/projected/60d7f991-50e6-47fa-8b4b-137022c03671-kube-api-access-ddkq5\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.108986 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea05342-9ff1-4890-91c2-8ca68502816d-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.216284 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60d7f991-50e6-47fa-8b4b-137022c03671-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.216339 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60d7f991-50e6-47fa-8b4b-137022c03671-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.216473 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d7f991-50e6-47fa-8b4b-137022c03671-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.216551 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60d7f991-50e6-47fa-8b4b-137022c03671-config-data-custom\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.216602 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60d7f991-50e6-47fa-8b4b-137022c03671-logs\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.216637 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60d7f991-50e6-47fa-8b4b-137022c03671-scripts\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.216660 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ddkq5\" (UniqueName: \"kubernetes.io/projected/60d7f991-50e6-47fa-8b4b-137022c03671-kube-api-access-ddkq5\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.216835 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60d7f991-50e6-47fa-8b4b-137022c03671-config-data\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.216861 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60d7f991-50e6-47fa-8b4b-137022c03671-public-tls-certs\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.218779 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60d7f991-50e6-47fa-8b4b-137022c03671-etc-machine-id\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.218858 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60d7f991-50e6-47fa-8b4b-137022c03671-logs\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.225039 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60d7f991-50e6-47fa-8b4b-137022c03671-scripts\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 
18:39:05.226642 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60d7f991-50e6-47fa-8b4b-137022c03671-config-data-custom\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.237279 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60d7f991-50e6-47fa-8b4b-137022c03671-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.241608 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d7f991-50e6-47fa-8b4b-137022c03671-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.251454 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60d7f991-50e6-47fa-8b4b-137022c03671-config-data\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.258754 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddkq5\" (UniqueName: \"kubernetes.io/projected/60d7f991-50e6-47fa-8b4b-137022c03671-kube-api-access-ddkq5\") pod \"cinder-api-0\" (UID: \"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.275166 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60d7f991-50e6-47fa-8b4b-137022c03671-public-tls-certs\") pod \"cinder-api-0\" (UID: 
\"60d7f991-50e6-47fa-8b4b-137022c03671\") " pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.376040 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.581353 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" event={"ID":"843928c8-a35c-43b5-a1e6-88199ac743cc","Type":"ContainerStarted","Data":"81287d7ab892f600a3d3636669f2e13032d25a2a9f29f73b230c6f9dce77378a"} Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.581401 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.628190 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" podStartSLOduration=5.628170975 podStartE2EDuration="5.628170975s" podCreationTimestamp="2025-12-02 18:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:39:05.621448884 +0000 UTC m=+1455.311067765" watchObservedRunningTime="2025-12-02 18:39:05.628170975 +0000 UTC m=+1455.317789856" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.667429 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.684535 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.695709 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.699100 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.701639 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.706658 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.707916 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.719398 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.719441 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.785208 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.804995 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.842556 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-log-httpd\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.842610 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " 
pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.842639 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf499\" (UniqueName: \"kubernetes.io/projected/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-kube-api-access-tf499\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.842757 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.842791 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-run-httpd\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.842866 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-scripts\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.842884 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-config-data\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.850029 4878 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.947039 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-scripts\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.947092 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-config-data\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.947180 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-log-httpd\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.947207 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.947228 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf499\" (UniqueName: \"kubernetes.io/projected/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-kube-api-access-tf499\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.947325 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.947358 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-run-httpd\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.948060 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-run-httpd\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.948704 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-log-httpd\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.953797 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.954279 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-scripts\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.961509 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.965449 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf499\" (UniqueName: \"kubernetes.io/projected/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-kube-api-access-tf499\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " pod="openstack/ceilometer-0" Dec 02 18:39:05 crc kubenswrapper[4878]: I1202 18:39:05.974895 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-config-data\") pod \"ceilometer-0\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " pod="openstack/ceilometer-0" Dec 02 18:39:06 crc kubenswrapper[4878]: I1202 18:39:06.022577 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:39:06 crc kubenswrapper[4878]: I1202 18:39:06.602774 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 18:39:06 crc kubenswrapper[4878]: I1202 18:39:06.603191 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 18:39:06 crc kubenswrapper[4878]: I1202 18:39:06.785005 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 18:39:06 crc kubenswrapper[4878]: I1202 18:39:06.785073 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 18:39:06 crc kubenswrapper[4878]: I1202 18:39:06.826755 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 18:39:06 crc kubenswrapper[4878]: I1202 18:39:06.842441 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 18:39:06 crc kubenswrapper[4878]: I1202 18:39:06.977298 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea05342-9ff1-4890-91c2-8ca68502816d" path="/var/lib/kubelet/pods/6ea05342-9ff1-4890-91c2-8ca68502816d/volumes" Dec 02 18:39:07 crc kubenswrapper[4878]: I1202 18:39:07.018099 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-489zf"] Dec 02 18:39:07 crc kubenswrapper[4878]: I1202 18:39:07.021337 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-489zf" Dec 02 18:39:07 crc kubenswrapper[4878]: I1202 18:39:07.028195 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-489zf"] Dec 02 18:39:07 crc kubenswrapper[4878]: I1202 18:39:07.145199 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158fd016-79a4-4bf1-90b3-20a4eae3d1d6-catalog-content\") pod \"certified-operators-489zf\" (UID: \"158fd016-79a4-4bf1-90b3-20a4eae3d1d6\") " pod="openshift-marketplace/certified-operators-489zf" Dec 02 18:39:07 crc kubenswrapper[4878]: I1202 18:39:07.145426 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158fd016-79a4-4bf1-90b3-20a4eae3d1d6-utilities\") pod \"certified-operators-489zf\" (UID: \"158fd016-79a4-4bf1-90b3-20a4eae3d1d6\") " pod="openshift-marketplace/certified-operators-489zf" Dec 02 18:39:07 crc kubenswrapper[4878]: I1202 18:39:07.145566 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8n8x\" (UniqueName: \"kubernetes.io/projected/158fd016-79a4-4bf1-90b3-20a4eae3d1d6-kube-api-access-j8n8x\") pod \"certified-operators-489zf\" (UID: \"158fd016-79a4-4bf1-90b3-20a4eae3d1d6\") " pod="openshift-marketplace/certified-operators-489zf" Dec 02 18:39:07 crc kubenswrapper[4878]: I1202 18:39:07.247828 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8n8x\" (UniqueName: \"kubernetes.io/projected/158fd016-79a4-4bf1-90b3-20a4eae3d1d6-kube-api-access-j8n8x\") pod \"certified-operators-489zf\" (UID: \"158fd016-79a4-4bf1-90b3-20a4eae3d1d6\") " pod="openshift-marketplace/certified-operators-489zf" Dec 02 18:39:07 crc kubenswrapper[4878]: I1202 18:39:07.247957 4878 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158fd016-79a4-4bf1-90b3-20a4eae3d1d6-catalog-content\") pod \"certified-operators-489zf\" (UID: \"158fd016-79a4-4bf1-90b3-20a4eae3d1d6\") " pod="openshift-marketplace/certified-operators-489zf" Dec 02 18:39:07 crc kubenswrapper[4878]: I1202 18:39:07.248007 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158fd016-79a4-4bf1-90b3-20a4eae3d1d6-utilities\") pod \"certified-operators-489zf\" (UID: \"158fd016-79a4-4bf1-90b3-20a4eae3d1d6\") " pod="openshift-marketplace/certified-operators-489zf" Dec 02 18:39:07 crc kubenswrapper[4878]: I1202 18:39:07.248575 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158fd016-79a4-4bf1-90b3-20a4eae3d1d6-utilities\") pod \"certified-operators-489zf\" (UID: \"158fd016-79a4-4bf1-90b3-20a4eae3d1d6\") " pod="openshift-marketplace/certified-operators-489zf" Dec 02 18:39:07 crc kubenswrapper[4878]: I1202 18:39:07.248590 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158fd016-79a4-4bf1-90b3-20a4eae3d1d6-catalog-content\") pod \"certified-operators-489zf\" (UID: \"158fd016-79a4-4bf1-90b3-20a4eae3d1d6\") " pod="openshift-marketplace/certified-operators-489zf" Dec 02 18:39:07 crc kubenswrapper[4878]: I1202 18:39:07.288291 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8n8x\" (UniqueName: \"kubernetes.io/projected/158fd016-79a4-4bf1-90b3-20a4eae3d1d6-kube-api-access-j8n8x\") pod \"certified-operators-489zf\" (UID: \"158fd016-79a4-4bf1-90b3-20a4eae3d1d6\") " pod="openshift-marketplace/certified-operators-489zf" Dec 02 18:39:07 crc kubenswrapper[4878]: I1202 18:39:07.360446 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-489zf" Dec 02 18:39:07 crc kubenswrapper[4878]: I1202 18:39:07.640033 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 18:39:07 crc kubenswrapper[4878]: I1202 18:39:07.640658 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 18:39:07 crc kubenswrapper[4878]: I1202 18:39:07.940104 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.187081 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.307204 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-489zf"] Dec 02 18:39:08 crc kubenswrapper[4878]: W1202 18:39:08.313401 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod158fd016_79a4_4bf1_90b3_20a4eae3d1d6.slice/crio-ad42f95b714ca48e726cfa42f0e07d46275d091e5422358987885c87db39e592 WatchSource:0}: Error finding container ad42f95b714ca48e726cfa42f0e07d46275d091e5422358987885c87db39e592: Status 404 returned error can't find the container with id ad42f95b714ca48e726cfa42f0e07d46275d091e5422358987885c87db39e592 Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.741465 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-78bfbd4977-mcqqx"] Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.745387 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"60d7f991-50e6-47fa-8b4b-137022c03671","Type":"ContainerStarted","Data":"f77caba76abedb1fd4db2c9d9961177b8a5d9f73e266be097e3c994c9876cd73"} Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.783311 4878 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890","Type":"ContainerStarted","Data":"74bda4fcf8ab45107deb35271521d969a3c55bd581b091655ec7b59253f2e9b6"} Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.787636 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-68f754568d-brrc4" Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.787765 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-68f754568d-brrc4" event={"ID":"144b91a5-fb67-4975-948b-33fc4d95a6ff","Type":"ContainerStarted","Data":"87564f5f2ba793d45b9c524b872fb7c019fc886dbbd22f96c6dec24ff18f4034"} Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.787895 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-676cccd477-rncv9" Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.787988 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-676cccd477-rncv9" event={"ID":"725117cc-7ed9-47ca-aabe-1238f836ee93","Type":"ContainerStarted","Data":"95f9ba2944618c98f4a21d9109bac5421362a34e46faba45ff601f659c859aac"} Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.788087 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-78bfbd4977-mcqqx"] Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.746421 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-78bfbd4977-mcqqx" Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.795584 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-489zf" event={"ID":"158fd016-79a4-4bf1-90b3-20a4eae3d1d6","Type":"ContainerStarted","Data":"23898fd2dd467b06e258aafe090f7a3be7fa0bfaae403643d4c3094a82c98544"} Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.795630 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-489zf" event={"ID":"158fd016-79a4-4bf1-90b3-20a4eae3d1d6","Type":"ContainerStarted","Data":"ad42f95b714ca48e726cfa42f0e07d46275d091e5422358987885c87db39e592"} Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.818910 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxfc4\" (UniqueName: \"kubernetes.io/projected/242dcc1d-40de-4f68-8d3f-3dbe4b813506-kube-api-access-qxfc4\") pod \"heat-engine-78bfbd4977-mcqqx\" (UID: \"242dcc1d-40de-4f68-8d3f-3dbe4b813506\") " pod="openstack/heat-engine-78bfbd4977-mcqqx" Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.819153 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/242dcc1d-40de-4f68-8d3f-3dbe4b813506-combined-ca-bundle\") pod \"heat-engine-78bfbd4977-mcqqx\" (UID: \"242dcc1d-40de-4f68-8d3f-3dbe4b813506\") " pod="openstack/heat-engine-78bfbd4977-mcqqx" Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.819186 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/242dcc1d-40de-4f68-8d3f-3dbe4b813506-config-data\") pod \"heat-engine-78bfbd4977-mcqqx\" (UID: \"242dcc1d-40de-4f68-8d3f-3dbe4b813506\") " pod="openstack/heat-engine-78bfbd4977-mcqqx" Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 
18:39:08.819221 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/242dcc1d-40de-4f68-8d3f-3dbe4b813506-config-data-custom\") pod \"heat-engine-78bfbd4977-mcqqx\" (UID: \"242dcc1d-40de-4f68-8d3f-3dbe4b813506\") " pod="openstack/heat-engine-78bfbd4977-mcqqx" Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.848595 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5b86f6dd56-s4crw"] Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.853398 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.903028 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7c97596dc4-sl4jb"] Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.905474 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7c97596dc4-sl4jb" Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.921160 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/242dcc1d-40de-4f68-8d3f-3dbe4b813506-combined-ca-bundle\") pod \"heat-engine-78bfbd4977-mcqqx\" (UID: \"242dcc1d-40de-4f68-8d3f-3dbe4b813506\") " pod="openstack/heat-engine-78bfbd4977-mcqqx" Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.921253 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/242dcc1d-40de-4f68-8d3f-3dbe4b813506-config-data\") pod \"heat-engine-78bfbd4977-mcqqx\" (UID: \"242dcc1d-40de-4f68-8d3f-3dbe4b813506\") " pod="openstack/heat-engine-78bfbd4977-mcqqx" Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.921285 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/242dcc1d-40de-4f68-8d3f-3dbe4b813506-config-data-custom\") pod \"heat-engine-78bfbd4977-mcqqx\" (UID: \"242dcc1d-40de-4f68-8d3f-3dbe4b813506\") " pod="openstack/heat-engine-78bfbd4977-mcqqx" Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.921390 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxfc4\" (UniqueName: \"kubernetes.io/projected/242dcc1d-40de-4f68-8d3f-3dbe4b813506-kube-api-access-qxfc4\") pod \"heat-engine-78bfbd4977-mcqqx\" (UID: \"242dcc1d-40de-4f68-8d3f-3dbe4b813506\") " pod="openstack/heat-engine-78bfbd4977-mcqqx" Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.992365 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/242dcc1d-40de-4f68-8d3f-3dbe4b813506-config-data-custom\") pod \"heat-engine-78bfbd4977-mcqqx\" (UID: \"242dcc1d-40de-4f68-8d3f-3dbe4b813506\") " pod="openstack/heat-engine-78bfbd4977-mcqqx" Dec 02 18:39:08 crc kubenswrapper[4878]: I1202 18:39:08.993075 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxfc4\" (UniqueName: \"kubernetes.io/projected/242dcc1d-40de-4f68-8d3f-3dbe4b813506-kube-api-access-qxfc4\") pod \"heat-engine-78bfbd4977-mcqqx\" (UID: \"242dcc1d-40de-4f68-8d3f-3dbe4b813506\") " pod="openstack/heat-engine-78bfbd4977-mcqqx" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.002471 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/242dcc1d-40de-4f68-8d3f-3dbe4b813506-config-data\") pod \"heat-engine-78bfbd4977-mcqqx\" (UID: \"242dcc1d-40de-4f68-8d3f-3dbe4b813506\") " pod="openstack/heat-engine-78bfbd4977-mcqqx" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.003584 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/242dcc1d-40de-4f68-8d3f-3dbe4b813506-combined-ca-bundle\") pod \"heat-engine-78bfbd4977-mcqqx\" (UID: \"242dcc1d-40de-4f68-8d3f-3dbe4b813506\") " pod="openstack/heat-engine-78bfbd4977-mcqqx" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.024993 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s5rz\" (UniqueName: \"kubernetes.io/projected/c2b1330f-94c6-4ea3-869c-02d6e52250ce-kube-api-access-5s5rz\") pod \"heat-cfnapi-5b86f6dd56-s4crw\" (UID: \"c2b1330f-94c6-4ea3-869c-02d6e52250ce\") " pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.025090 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2b1330f-94c6-4ea3-869c-02d6e52250ce-config-data\") pod \"heat-cfnapi-5b86f6dd56-s4crw\" (UID: \"c2b1330f-94c6-4ea3-869c-02d6e52250ce\") " pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.025128 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-combined-ca-bundle\") pod \"heat-api-7c97596dc4-sl4jb\" (UID: \"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c\") " pod="openstack/heat-api-7c97596dc4-sl4jb" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.025160 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-config-data-custom\") pod \"heat-api-7c97596dc4-sl4jb\" (UID: \"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c\") " pod="openstack/heat-api-7c97596dc4-sl4jb" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.025184 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2b1330f-94c6-4ea3-869c-02d6e52250ce-config-data-custom\") pod \"heat-cfnapi-5b86f6dd56-s4crw\" (UID: \"c2b1330f-94c6-4ea3-869c-02d6e52250ce\") " pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.025392 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpg84\" (UniqueName: \"kubernetes.io/projected/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-kube-api-access-rpg84\") pod \"heat-api-7c97596dc4-sl4jb\" (UID: \"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c\") " pod="openstack/heat-api-7c97596dc4-sl4jb" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.025435 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-config-data\") pod \"heat-api-7c97596dc4-sl4jb\" (UID: \"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c\") " pod="openstack/heat-api-7c97596dc4-sl4jb" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.025488 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2b1330f-94c6-4ea3-869c-02d6e52250ce-combined-ca-bundle\") pod \"heat-cfnapi-5b86f6dd56-s4crw\" (UID: \"c2b1330f-94c6-4ea3-869c-02d6e52250ce\") " pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.030644 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-68f754568d-brrc4" podStartSLOduration=4.899072127 podStartE2EDuration="9.020779593s" podCreationTimestamp="2025-12-02 18:39:00 +0000 UTC" firstStartedPulling="2025-12-02 18:39:03.191721836 +0000 UTC m=+1452.881340717" lastFinishedPulling="2025-12-02 18:39:07.313429302 +0000 UTC m=+1457.003048183" observedRunningTime="2025-12-02 
18:39:08.790724077 +0000 UTC m=+1458.480342948" watchObservedRunningTime="2025-12-02 18:39:09.020779593 +0000 UTC m=+1458.710398474" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.121110 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7c97596dc4-sl4jb"] Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.128226 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2b1330f-94c6-4ea3-869c-02d6e52250ce-combined-ca-bundle\") pod \"heat-cfnapi-5b86f6dd56-s4crw\" (UID: \"c2b1330f-94c6-4ea3-869c-02d6e52250ce\") " pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.128380 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s5rz\" (UniqueName: \"kubernetes.io/projected/c2b1330f-94c6-4ea3-869c-02d6e52250ce-kube-api-access-5s5rz\") pod \"heat-cfnapi-5b86f6dd56-s4crw\" (UID: \"c2b1330f-94c6-4ea3-869c-02d6e52250ce\") " pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.128438 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2b1330f-94c6-4ea3-869c-02d6e52250ce-config-data\") pod \"heat-cfnapi-5b86f6dd56-s4crw\" (UID: \"c2b1330f-94c6-4ea3-869c-02d6e52250ce\") " pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.128486 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-combined-ca-bundle\") pod \"heat-api-7c97596dc4-sl4jb\" (UID: \"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c\") " pod="openstack/heat-api-7c97596dc4-sl4jb" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.128508 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2b1330f-94c6-4ea3-869c-02d6e52250ce-config-data-custom\") pod \"heat-cfnapi-5b86f6dd56-s4crw\" (UID: \"c2b1330f-94c6-4ea3-869c-02d6e52250ce\") " pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.128526 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-config-data-custom\") pod \"heat-api-7c97596dc4-sl4jb\" (UID: \"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c\") " pod="openstack/heat-api-7c97596dc4-sl4jb" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.128572 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpg84\" (UniqueName: \"kubernetes.io/projected/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-kube-api-access-rpg84\") pod \"heat-api-7c97596dc4-sl4jb\" (UID: \"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c\") " pod="openstack/heat-api-7c97596dc4-sl4jb" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.128612 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-config-data\") pod \"heat-api-7c97596dc4-sl4jb\" (UID: \"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c\") " pod="openstack/heat-api-7c97596dc4-sl4jb" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.133850 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-config-data-custom\") pod \"heat-api-7c97596dc4-sl4jb\" (UID: \"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c\") " pod="openstack/heat-api-7c97596dc4-sl4jb" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.133937 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5b86f6dd56-s4crw"] Dec 02 18:39:09 crc 
kubenswrapper[4878]: I1202 18:39:09.140781 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-config-data\") pod \"heat-api-7c97596dc4-sl4jb\" (UID: \"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c\") " pod="openstack/heat-api-7c97596dc4-sl4jb" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.144878 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-combined-ca-bundle\") pod \"heat-api-7c97596dc4-sl4jb\" (UID: \"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c\") " pod="openstack/heat-api-7c97596dc4-sl4jb" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.144879 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2b1330f-94c6-4ea3-869c-02d6e52250ce-config-data-custom\") pod \"heat-cfnapi-5b86f6dd56-s4crw\" (UID: \"c2b1330f-94c6-4ea3-869c-02d6e52250ce\") " pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.150215 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2b1330f-94c6-4ea3-869c-02d6e52250ce-combined-ca-bundle\") pod \"heat-cfnapi-5b86f6dd56-s4crw\" (UID: \"c2b1330f-94c6-4ea3-869c-02d6e52250ce\") " pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.147206 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2b1330f-94c6-4ea3-869c-02d6e52250ce-config-data\") pod \"heat-cfnapi-5b86f6dd56-s4crw\" (UID: \"c2b1330f-94c6-4ea3-869c-02d6e52250ce\") " pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.156877 4878 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/heat-api-676cccd477-rncv9" podStartSLOduration=4.913595451 podStartE2EDuration="9.15685166s" podCreationTimestamp="2025-12-02 18:39:00 +0000 UTC" firstStartedPulling="2025-12-02 18:39:03.056718661 +0000 UTC m=+1452.746337542" lastFinishedPulling="2025-12-02 18:39:07.29997487 +0000 UTC m=+1456.989593751" observedRunningTime="2025-12-02 18:39:08.861861478 +0000 UTC m=+1458.551480359" watchObservedRunningTime="2025-12-02 18:39:09.15685166 +0000 UTC m=+1458.846470541" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.157767 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s5rz\" (UniqueName: \"kubernetes.io/projected/c2b1330f-94c6-4ea3-869c-02d6e52250ce-kube-api-access-5s5rz\") pod \"heat-cfnapi-5b86f6dd56-s4crw\" (UID: \"c2b1330f-94c6-4ea3-869c-02d6e52250ce\") " pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.159188 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-78bfbd4977-mcqqx" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.159189 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpg84\" (UniqueName: \"kubernetes.io/projected/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-kube-api-access-rpg84\") pod \"heat-api-7c97596dc4-sl4jb\" (UID: \"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c\") " pod="openstack/heat-api-7c97596dc4-sl4jb" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.307098 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.321144 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7c97596dc4-sl4jb" Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.864390 4878 generic.go:334] "Generic (PLEG): container finished" podID="158fd016-79a4-4bf1-90b3-20a4eae3d1d6" containerID="23898fd2dd467b06e258aafe090f7a3be7fa0bfaae403643d4c3094a82c98544" exitCode=0 Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.864819 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-489zf" event={"ID":"158fd016-79a4-4bf1-90b3-20a4eae3d1d6","Type":"ContainerDied","Data":"23898fd2dd467b06e258aafe090f7a3be7fa0bfaae403643d4c3094a82c98544"} Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.871273 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"60d7f991-50e6-47fa-8b4b-137022c03671","Type":"ContainerStarted","Data":"4fd4c42f7cc3f29674044c73fc9001b1b8602fd450d5e2c6e0240aed8e6f242a"} Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.880257 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890","Type":"ContainerStarted","Data":"6288a227393f2577627b2d72f066aa0bc52342d1b27f20b6b58470222d4147e0"} Dec 02 18:39:09 crc kubenswrapper[4878]: I1202 18:39:09.959763 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-78bfbd4977-mcqqx"] Dec 02 18:39:10 crc kubenswrapper[4878]: I1202 18:39:10.201475 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5b86f6dd56-s4crw"] Dec 02 18:39:10 crc kubenswrapper[4878]: I1202 18:39:10.659296 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7c97596dc4-sl4jb"] Dec 02 18:39:10 crc kubenswrapper[4878]: I1202 18:39:10.734343 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:39:10 crc kubenswrapper[4878]: I1202 18:39:10.851439 4878 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:39:10 crc kubenswrapper[4878]: I1202 18:39:10.860124 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 18:39:10 crc kubenswrapper[4878]: I1202 18:39:10.860259 4878 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 18:39:10 crc kubenswrapper[4878]: I1202 18:39:10.863660 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 18:39:10 crc kubenswrapper[4878]: I1202 18:39:10.934021 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890","Type":"ContainerStarted","Data":"aeaaf1994ceb641cbca5daefd15c47728246bc54334246266707223f32c8a905"} Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.095617 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65bc8f75b9-pjhzn"] Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.096014 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-78bfbd4977-mcqqx" Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.096034 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7c97596dc4-sl4jb" event={"ID":"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c","Type":"ContainerStarted","Data":"a679d362fd3db591e83031b244e5789630aa80c5c03997d2f612f57bdd283585"} Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.096056 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-78bfbd4977-mcqqx" event={"ID":"242dcc1d-40de-4f68-8d3f-3dbe4b813506","Type":"ContainerStarted","Data":"4679fe50d42693310dc07fb5b64f2f0e635bbcc0d252859bd49012429ebd6319"} Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.096069 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-engine-78bfbd4977-mcqqx" event={"ID":"242dcc1d-40de-4f68-8d3f-3dbe4b813506","Type":"ContainerStarted","Data":"045c4f2230e46d222127c554ff566b5386a7f0451399a0fb5520dbc2517c6497"} Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.096090 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" event={"ID":"c2b1330f-94c6-4ea3-869c-02d6e52250ce","Type":"ContainerStarted","Data":"898162f3640f482d8836006e0163ffa5a9ae1e6fd0c55d3c1b14ccc5ba437bfb"} Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.096101 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-489zf" event={"ID":"158fd016-79a4-4bf1-90b3-20a4eae3d1d6","Type":"ContainerStarted","Data":"1f9a6bf938441ded066058f943340eace971a2f338075d35a97684edc348aa36"} Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.096432 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" podUID="f6e18d8b-bd8c-420d-b9af-84762e4c808e" containerName="dnsmasq-dns" containerID="cri-o://b78338e54cfc32f357a9923a1bc152544efe7935d8a007575cadba7c56cc8501" gracePeriod=10 Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.293176 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-78bfbd4977-mcqqx" podStartSLOduration=3.293151985 podStartE2EDuration="3.293151985s" podCreationTimestamp="2025-12-02 18:39:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:39:11.204592778 +0000 UTC m=+1460.894211659" watchObservedRunningTime="2025-12-02 18:39:11.293151985 +0000 UTC m=+1460.982770866" Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.328926 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 
18:39:11.329352 4878 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.339332 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.802213 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-676cccd477-rncv9"] Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.802729 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-676cccd477-rncv9" podUID="725117cc-7ed9-47ca-aabe-1238f836ee93" containerName="heat-api" containerID="cri-o://95f9ba2944618c98f4a21d9109bac5421362a34e46faba45ff601f659c859aac" gracePeriod=60 Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.862347 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-68f754568d-brrc4"] Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.863934 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-68f754568d-brrc4" podUID="144b91a5-fb67-4975-948b-33fc4d95a6ff" containerName="heat-cfnapi" containerID="cri-o://87564f5f2ba793d45b9c524b872fb7c019fc886dbbd22f96c6dec24ff18f4034" gracePeriod=60 Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.884883 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6464b89f4f-f5xbb"] Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.888214 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.899575 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.899831 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.909997 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6464b89f4f-f5xbb"] Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.960229 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6b48666bb6-kckkf"] Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.966399 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.973145 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.973373 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.978013 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-config-data-custom\") pod \"heat-api-6464b89f4f-f5xbb\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.978128 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcwhl\" (UniqueName: \"kubernetes.io/projected/a072f732-97bd-4297-923d-beea0ac36e2a-kube-api-access-kcwhl\") pod \"heat-api-6464b89f4f-f5xbb\" (UID: 
\"a072f732-97bd-4297-923d-beea0ac36e2a\") " pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.978348 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-internal-tls-certs\") pod \"heat-api-6464b89f4f-f5xbb\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.978410 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-combined-ca-bundle\") pod \"heat-api-6464b89f4f-f5xbb\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.978432 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-public-tls-certs\") pod \"heat-api-6464b89f4f-f5xbb\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:11 crc kubenswrapper[4878]: I1202 18:39:11.978475 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-config-data\") pod \"heat-api-6464b89f4f-f5xbb\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.010150 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6b48666bb6-kckkf"] Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.088677 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-config-data\") pod \"heat-api-6464b89f4f-f5xbb\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.088825 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7btz\" (UniqueName: \"kubernetes.io/projected/2469eb4b-9de3-45f0-bc72-0a8add16fa57-kube-api-access-r7btz\") pod \"heat-cfnapi-6b48666bb6-kckkf\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.088854 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-combined-ca-bundle\") pod \"heat-cfnapi-6b48666bb6-kckkf\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.089011 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-config-data-custom\") pod \"heat-api-6464b89f4f-f5xbb\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.089103 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-internal-tls-certs\") pod \"heat-cfnapi-6b48666bb6-kckkf\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.089146 4878 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-kcwhl\" (UniqueName: \"kubernetes.io/projected/a072f732-97bd-4297-923d-beea0ac36e2a-kube-api-access-kcwhl\") pod \"heat-api-6464b89f4f-f5xbb\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.089168 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-config-data-custom\") pod \"heat-cfnapi-6b48666bb6-kckkf\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.090127 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-public-tls-certs\") pod \"heat-cfnapi-6b48666bb6-kckkf\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.092420 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-internal-tls-certs\") pod \"heat-api-6464b89f4f-f5xbb\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.092550 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-combined-ca-bundle\") pod \"heat-api-6464b89f4f-f5xbb\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.092591 4878 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-public-tls-certs\") pod \"heat-api-6464b89f4f-f5xbb\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.092679 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-config-data\") pod \"heat-cfnapi-6b48666bb6-kckkf\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.099643 4878 generic.go:334] "Generic (PLEG): container finished" podID="158fd016-79a4-4bf1-90b3-20a4eae3d1d6" containerID="1f9a6bf938441ded066058f943340eace971a2f338075d35a97684edc348aa36" exitCode=0 Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.099719 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-489zf" event={"ID":"158fd016-79a4-4bf1-90b3-20a4eae3d1d6","Type":"ContainerDied","Data":"1f9a6bf938441ded066058f943340eace971a2f338075d35a97684edc348aa36"} Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.122009 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"60d7f991-50e6-47fa-8b4b-137022c03671","Type":"ContainerStarted","Data":"036c89cc7b6915e540b5e3b058a19b6580ba06755bd8f5ca0b2e0141668f3c13"} Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.123875 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.137801 4878 generic.go:334] "Generic (PLEG): container finished" podID="f6e18d8b-bd8c-420d-b9af-84762e4c808e" containerID="b78338e54cfc32f357a9923a1bc152544efe7935d8a007575cadba7c56cc8501" exitCode=0 Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 
18:39:12.137869 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" event={"ID":"f6e18d8b-bd8c-420d-b9af-84762e4c808e","Type":"ContainerDied","Data":"b78338e54cfc32f357a9923a1bc152544efe7935d8a007575cadba7c56cc8501"} Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.151545 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7c97596dc4-sl4jb" event={"ID":"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c","Type":"ContainerStarted","Data":"0e6b89b1354d050268b16c1f1e4435202a3559d97ceab21a68f8bb462e3f05a3"} Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.152920 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7c97596dc4-sl4jb" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.155976 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=8.155954617 podStartE2EDuration="8.155954617s" podCreationTimestamp="2025-12-02 18:39:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:39:12.147859553 +0000 UTC m=+1461.837478444" watchObservedRunningTime="2025-12-02 18:39:12.155954617 +0000 UTC m=+1461.845573498" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.164054 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" event={"ID":"c2b1330f-94c6-4ea3-869c-02d6e52250ce","Type":"ContainerStarted","Data":"93d2d112436ea49243578af86a9c7147ebf56d25be5d75827fb763768c950e87"} Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.164108 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.189813 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7c97596dc4-sl4jb" 
podStartSLOduration=4.189762007 podStartE2EDuration="4.189762007s" podCreationTimestamp="2025-12-02 18:39:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:39:12.175640064 +0000 UTC m=+1461.865258945" watchObservedRunningTime="2025-12-02 18:39:12.189762007 +0000 UTC m=+1461.879381258" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.195317 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-internal-tls-certs\") pod \"heat-cfnapi-6b48666bb6-kckkf\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.195376 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-config-data-custom\") pod \"heat-cfnapi-6b48666bb6-kckkf\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.195471 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-public-tls-certs\") pod \"heat-cfnapi-6b48666bb6-kckkf\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.195551 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-config-data\") pod \"heat-cfnapi-6b48666bb6-kckkf\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 
18:39:12.195598 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7btz\" (UniqueName: \"kubernetes.io/projected/2469eb4b-9de3-45f0-bc72-0a8add16fa57-kube-api-access-r7btz\") pod \"heat-cfnapi-6b48666bb6-kckkf\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.195617 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-combined-ca-bundle\") pod \"heat-cfnapi-6b48666bb6-kckkf\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.246369 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" podStartSLOduration=4.246349622 podStartE2EDuration="4.246349622s" podCreationTimestamp="2025-12-02 18:39:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:39:12.20100936 +0000 UTC m=+1461.890628241" watchObservedRunningTime="2025-12-02 18:39:12.246349622 +0000 UTC m=+1461.935968503" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.249407 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-internal-tls-certs\") pod \"heat-cfnapi-6b48666bb6-kckkf\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.267441 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-config-data\") pod \"heat-api-6464b89f4f-f5xbb\" (UID: 
\"a072f732-97bd-4297-923d-beea0ac36e2a\") " pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.268106 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-config-data-custom\") pod \"heat-cfnapi-6b48666bb6-kckkf\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.268968 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-config-data\") pod \"heat-cfnapi-6b48666bb6-kckkf\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.272648 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-internal-tls-certs\") pod \"heat-api-6464b89f4f-f5xbb\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.273221 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-public-tls-certs\") pod \"heat-api-6464b89f4f-f5xbb\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.274108 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-combined-ca-bundle\") pod \"heat-api-6464b89f4f-f5xbb\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:12 crc 
kubenswrapper[4878]: I1202 18:39:12.278215 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcwhl\" (UniqueName: \"kubernetes.io/projected/a072f732-97bd-4297-923d-beea0ac36e2a-kube-api-access-kcwhl\") pod \"heat-api-6464b89f4f-f5xbb\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.281024 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-combined-ca-bundle\") pod \"heat-cfnapi-6b48666bb6-kckkf\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.289461 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7btz\" (UniqueName: \"kubernetes.io/projected/2469eb4b-9de3-45f0-bc72-0a8add16fa57-kube-api-access-r7btz\") pod \"heat-cfnapi-6b48666bb6-kckkf\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.296197 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-config-data-custom\") pod \"heat-api-6464b89f4f-f5xbb\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.312287 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-public-tls-certs\") pod \"heat-cfnapi-6b48666bb6-kckkf\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.509974 4878 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.564212 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:12 crc kubenswrapper[4878]: I1202 18:39:12.884166 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.034211 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn82b\" (UniqueName: \"kubernetes.io/projected/f6e18d8b-bd8c-420d-b9af-84762e4c808e-kube-api-access-mn82b\") pod \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.034299 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-ovsdbserver-sb\") pod \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.034465 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-config\") pod \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.034557 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-dns-swift-storage-0\") pod \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.034582 4878 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-dns-svc\") pod \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.034762 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-ovsdbserver-nb\") pod \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\" (UID: \"f6e18d8b-bd8c-420d-b9af-84762e4c808e\") " Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.041057 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-68f754568d-brrc4" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.072827 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6e18d8b-bd8c-420d-b9af-84762e4c808e-kube-api-access-mn82b" (OuterVolumeSpecName: "kube-api-access-mn82b") pod "f6e18d8b-bd8c-420d-b9af-84762e4c808e" (UID: "f6e18d8b-bd8c-420d-b9af-84762e4c808e"). InnerVolumeSpecName "kube-api-access-mn82b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.140319 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/144b91a5-fb67-4975-948b-33fc4d95a6ff-config-data\") pod \"144b91a5-fb67-4975-948b-33fc4d95a6ff\" (UID: \"144b91a5-fb67-4975-948b-33fc4d95a6ff\") " Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.142512 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/144b91a5-fb67-4975-948b-33fc4d95a6ff-combined-ca-bundle\") pod \"144b91a5-fb67-4975-948b-33fc4d95a6ff\" (UID: \"144b91a5-fb67-4975-948b-33fc4d95a6ff\") " Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.142724 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/144b91a5-fb67-4975-948b-33fc4d95a6ff-config-data-custom\") pod \"144b91a5-fb67-4975-948b-33fc4d95a6ff\" (UID: \"144b91a5-fb67-4975-948b-33fc4d95a6ff\") " Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.142804 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc5pt\" (UniqueName: \"kubernetes.io/projected/144b91a5-fb67-4975-948b-33fc4d95a6ff-kube-api-access-hc5pt\") pod \"144b91a5-fb67-4975-948b-33fc4d95a6ff\" (UID: \"144b91a5-fb67-4975-948b-33fc4d95a6ff\") " Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.143970 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn82b\" (UniqueName: \"kubernetes.io/projected/f6e18d8b-bd8c-420d-b9af-84762e4c808e-kube-api-access-mn82b\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.175296 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/144b91a5-fb67-4975-948b-33fc4d95a6ff-kube-api-access-hc5pt" 
(OuterVolumeSpecName: "kube-api-access-hc5pt") pod "144b91a5-fb67-4975-948b-33fc4d95a6ff" (UID: "144b91a5-fb67-4975-948b-33fc4d95a6ff"). InnerVolumeSpecName "kube-api-access-hc5pt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.183021 4878 generic.go:334] "Generic (PLEG): container finished" podID="c2b1330f-94c6-4ea3-869c-02d6e52250ce" containerID="93d2d112436ea49243578af86a9c7147ebf56d25be5d75827fb763768c950e87" exitCode=1 Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.183144 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" event={"ID":"c2b1330f-94c6-4ea3-869c-02d6e52250ce","Type":"ContainerDied","Data":"93d2d112436ea49243578af86a9c7147ebf56d25be5d75827fb763768c950e87"} Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.184105 4878 scope.go:117] "RemoveContainer" containerID="93d2d112436ea49243578af86a9c7147ebf56d25be5d75827fb763768c950e87" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.199600 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/144b91a5-fb67-4975-948b-33fc4d95a6ff-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "144b91a5-fb67-4975-948b-33fc4d95a6ff" (UID: "144b91a5-fb67-4975-948b-33fc4d95a6ff"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.203446 4878 generic.go:334] "Generic (PLEG): container finished" podID="725117cc-7ed9-47ca-aabe-1238f836ee93" containerID="95f9ba2944618c98f4a21d9109bac5421362a34e46faba45ff601f659c859aac" exitCode=0 Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.203530 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-676cccd477-rncv9" event={"ID":"725117cc-7ed9-47ca-aabe-1238f836ee93","Type":"ContainerDied","Data":"95f9ba2944618c98f4a21d9109bac5421362a34e46faba45ff601f659c859aac"} Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.219625 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890","Type":"ContainerStarted","Data":"4d1bf8190f77a22f94f20700972387a57357ec1c0d42f45717ac77bed4cf3b6c"} Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.221618 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.222437 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65bc8f75b9-pjhzn" event={"ID":"f6e18d8b-bd8c-420d-b9af-84762e4c808e","Type":"ContainerDied","Data":"f96c934d53780d406ec0d05f3ee6f37e6ae09640edf7c92acacc4a3e6289c4d3"} Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.222594 4878 scope.go:117] "RemoveContainer" containerID="b78338e54cfc32f357a9923a1bc152544efe7935d8a007575cadba7c56cc8501" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.235470 4878 generic.go:334] "Generic (PLEG): container finished" podID="144b91a5-fb67-4975-948b-33fc4d95a6ff" containerID="87564f5f2ba793d45b9c524b872fb7c019fc886dbbd22f96c6dec24ff18f4034" exitCode=0 Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.235561 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-68f754568d-brrc4" event={"ID":"144b91a5-fb67-4975-948b-33fc4d95a6ff","Type":"ContainerDied","Data":"87564f5f2ba793d45b9c524b872fb7c019fc886dbbd22f96c6dec24ff18f4034"} Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.235600 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-68f754568d-brrc4" event={"ID":"144b91a5-fb67-4975-948b-33fc4d95a6ff","Type":"ContainerDied","Data":"cc2cb7744bb249051e7a61fb858efbb758d97c89dfb22b9a309282eebe180ac5"} Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.235691 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-68f754568d-brrc4" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.241841 4878 generic.go:334] "Generic (PLEG): container finished" podID="7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c" containerID="0e6b89b1354d050268b16c1f1e4435202a3559d97ceab21a68f8bb462e3f05a3" exitCode=1 Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.242424 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7c97596dc4-sl4jb" event={"ID":"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c","Type":"ContainerDied","Data":"0e6b89b1354d050268b16c1f1e4435202a3559d97ceab21a68f8bb462e3f05a3"} Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.242999 4878 scope.go:117] "RemoveContainer" containerID="0e6b89b1354d050268b16c1f1e4435202a3559d97ceab21a68f8bb462e3f05a3" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.252412 4878 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/144b91a5-fb67-4975-948b-33fc4d95a6ff-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.252439 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc5pt\" (UniqueName: \"kubernetes.io/projected/144b91a5-fb67-4975-948b-33fc4d95a6ff-kube-api-access-hc5pt\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.272491 4878 scope.go:117] "RemoveContainer" containerID="af3b53b1de0e8858946297effb75fa04788e62d8f5eb022e30a201dda278de24" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.318319 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-676cccd477-rncv9" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.321202 4878 scope.go:117] "RemoveContainer" containerID="87564f5f2ba793d45b9c524b872fb7c019fc886dbbd22f96c6dec24ff18f4034" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.464299 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/725117cc-7ed9-47ca-aabe-1238f836ee93-combined-ca-bundle\") pod \"725117cc-7ed9-47ca-aabe-1238f836ee93\" (UID: \"725117cc-7ed9-47ca-aabe-1238f836ee93\") " Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.464617 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh9bq\" (UniqueName: \"kubernetes.io/projected/725117cc-7ed9-47ca-aabe-1238f836ee93-kube-api-access-gh9bq\") pod \"725117cc-7ed9-47ca-aabe-1238f836ee93\" (UID: \"725117cc-7ed9-47ca-aabe-1238f836ee93\") " Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.467470 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/725117cc-7ed9-47ca-aabe-1238f836ee93-config-data-custom\") pod \"725117cc-7ed9-47ca-aabe-1238f836ee93\" (UID: \"725117cc-7ed9-47ca-aabe-1238f836ee93\") " Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.467612 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/725117cc-7ed9-47ca-aabe-1238f836ee93-config-data\") pod \"725117cc-7ed9-47ca-aabe-1238f836ee93\" (UID: \"725117cc-7ed9-47ca-aabe-1238f836ee93\") " Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.474214 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/144b91a5-fb67-4975-948b-33fc4d95a6ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "144b91a5-fb67-4975-948b-33fc4d95a6ff" (UID: 
"144b91a5-fb67-4975-948b-33fc4d95a6ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.474348 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6464b89f4f-f5xbb"] Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.479212 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/144b91a5-fb67-4975-948b-33fc4d95a6ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.482387 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6b48666bb6-kckkf"] Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.505413 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/725117cc-7ed9-47ca-aabe-1238f836ee93-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "725117cc-7ed9-47ca-aabe-1238f836ee93" (UID: "725117cc-7ed9-47ca-aabe-1238f836ee93"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.509643 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/725117cc-7ed9-47ca-aabe-1238f836ee93-kube-api-access-gh9bq" (OuterVolumeSpecName: "kube-api-access-gh9bq") pod "725117cc-7ed9-47ca-aabe-1238f836ee93" (UID: "725117cc-7ed9-47ca-aabe-1238f836ee93"). InnerVolumeSpecName "kube-api-access-gh9bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.565504 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f6e18d8b-bd8c-420d-b9af-84762e4c808e" (UID: "f6e18d8b-bd8c-420d-b9af-84762e4c808e"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.570000 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-config" (OuterVolumeSpecName: "config") pod "f6e18d8b-bd8c-420d-b9af-84762e4c808e" (UID: "f6e18d8b-bd8c-420d-b9af-84762e4c808e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.571760 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f6e18d8b-bd8c-420d-b9af-84762e4c808e" (UID: "f6e18d8b-bd8c-420d-b9af-84762e4c808e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.585346 4878 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/725117cc-7ed9-47ca-aabe-1238f836ee93-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.585680 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.585771 4878 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.587267 4878 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-dns-svc\") on node 
\"crc\" DevicePath \"\"" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.587380 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh9bq\" (UniqueName: \"kubernetes.io/projected/725117cc-7ed9-47ca-aabe-1238f836ee93-kube-api-access-gh9bq\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.651031 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f6e18d8b-bd8c-420d-b9af-84762e4c808e" (UID: "f6e18d8b-bd8c-420d-b9af-84762e4c808e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.654512 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/144b91a5-fb67-4975-948b-33fc4d95a6ff-config-data" (OuterVolumeSpecName: "config-data") pod "144b91a5-fb67-4975-948b-33fc4d95a6ff" (UID: "144b91a5-fb67-4975-948b-33fc4d95a6ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.667511 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/725117cc-7ed9-47ca-aabe-1238f836ee93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "725117cc-7ed9-47ca-aabe-1238f836ee93" (UID: "725117cc-7ed9-47ca-aabe-1238f836ee93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.672933 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f6e18d8b-bd8c-420d-b9af-84762e4c808e" (UID: "f6e18d8b-bd8c-420d-b9af-84762e4c808e"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.680380 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/725117cc-7ed9-47ca-aabe-1238f836ee93-config-data" (OuterVolumeSpecName: "config-data") pod "725117cc-7ed9-47ca-aabe-1238f836ee93" (UID: "725117cc-7ed9-47ca-aabe-1238f836ee93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.689882 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/144b91a5-fb67-4975-948b-33fc4d95a6ff-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.689915 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.689929 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6e18d8b-bd8c-420d-b9af-84762e4c808e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.689941 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/725117cc-7ed9-47ca-aabe-1238f836ee93-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.689950 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/725117cc-7ed9-47ca-aabe-1238f836ee93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.897846 4878 scope.go:117] "RemoveContainer" 
containerID="87564f5f2ba793d45b9c524b872fb7c019fc886dbbd22f96c6dec24ff18f4034" Dec 02 18:39:13 crc kubenswrapper[4878]: E1202 18:39:13.904432 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87564f5f2ba793d45b9c524b872fb7c019fc886dbbd22f96c6dec24ff18f4034\": container with ID starting with 87564f5f2ba793d45b9c524b872fb7c019fc886dbbd22f96c6dec24ff18f4034 not found: ID does not exist" containerID="87564f5f2ba793d45b9c524b872fb7c019fc886dbbd22f96c6dec24ff18f4034" Dec 02 18:39:13 crc kubenswrapper[4878]: I1202 18:39:13.904481 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87564f5f2ba793d45b9c524b872fb7c019fc886dbbd22f96c6dec24ff18f4034"} err="failed to get container status \"87564f5f2ba793d45b9c524b872fb7c019fc886dbbd22f96c6dec24ff18f4034\": rpc error: code = NotFound desc = could not find container \"87564f5f2ba793d45b9c524b872fb7c019fc886dbbd22f96c6dec24ff18f4034\": container with ID starting with 87564f5f2ba793d45b9c524b872fb7c019fc886dbbd22f96c6dec24ff18f4034 not found: ID does not exist" Dec 02 18:39:14 crc kubenswrapper[4878]: I1202 18:39:14.168446 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65bc8f75b9-pjhzn"] Dec 02 18:39:14 crc kubenswrapper[4878]: I1202 18:39:14.195137 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65bc8f75b9-pjhzn"] Dec 02 18:39:14 crc kubenswrapper[4878]: I1202 18:39:14.217918 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-68f754568d-brrc4"] Dec 02 18:39:14 crc kubenswrapper[4878]: I1202 18:39:14.246697 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-68f754568d-brrc4"] Dec 02 18:39:14 crc kubenswrapper[4878]: I1202 18:39:14.309471 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" Dec 02 18:39:14 crc 
kubenswrapper[4878]: I1202 18:39:14.313905 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-676cccd477-rncv9" event={"ID":"725117cc-7ed9-47ca-aabe-1238f836ee93","Type":"ContainerDied","Data":"37a13647711776e9f3a85654c7e75ecd3e6d946c826719d66f90a0673a305d32"} Dec 02 18:39:14 crc kubenswrapper[4878]: I1202 18:39:14.313981 4878 scope.go:117] "RemoveContainer" containerID="95f9ba2944618c98f4a21d9109bac5421362a34e46faba45ff601f659c859aac" Dec 02 18:39:14 crc kubenswrapper[4878]: I1202 18:39:14.314176 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-676cccd477-rncv9" Dec 02 18:39:14 crc kubenswrapper[4878]: I1202 18:39:14.321865 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-7c97596dc4-sl4jb" Dec 02 18:39:14 crc kubenswrapper[4878]: I1202 18:39:14.322517 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-489zf" event={"ID":"158fd016-79a4-4bf1-90b3-20a4eae3d1d6","Type":"ContainerStarted","Data":"f43d73f53136a7c0092d78ca668204f2f1daa710c3c36ecf85fcbf97d6d9553e"} Dec 02 18:39:14 crc kubenswrapper[4878]: I1202 18:39:14.349447 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6464b89f4f-f5xbb" event={"ID":"a072f732-97bd-4297-923d-beea0ac36e2a","Type":"ContainerStarted","Data":"26090dca63a9373d660ad6f688970c5f3d96be6aebb0212ec252dbea773255a3"} Dec 02 18:39:14 crc kubenswrapper[4878]: I1202 18:39:14.373941 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6b48666bb6-kckkf" event={"ID":"2469eb4b-9de3-45f0-bc72-0a8add16fa57","Type":"ContainerStarted","Data":"fc8a6b7d1febfe59b30b6c9d7260c522e3fec44976a20645f0cf62fe5bd839ad"} Dec 02 18:39:14 crc kubenswrapper[4878]: I1202 18:39:14.410607 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-489zf" 
podStartSLOduration=4.517682622 podStartE2EDuration="8.410584022s" podCreationTimestamp="2025-12-02 18:39:06 +0000 UTC" firstStartedPulling="2025-12-02 18:39:08.831854057 +0000 UTC m=+1458.521472968" lastFinishedPulling="2025-12-02 18:39:12.724755487 +0000 UTC m=+1462.414374368" observedRunningTime="2025-12-02 18:39:14.369183283 +0000 UTC m=+1464.058802174" watchObservedRunningTime="2025-12-02 18:39:14.410584022 +0000 UTC m=+1464.100202903" Dec 02 18:39:14 crc kubenswrapper[4878]: I1202 18:39:14.582850 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-676cccd477-rncv9"] Dec 02 18:39:14 crc kubenswrapper[4878]: I1202 18:39:14.594642 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-676cccd477-rncv9"] Dec 02 18:39:14 crc kubenswrapper[4878]: E1202 18:39:14.691920 4878 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6e18d8b_bd8c_420d_b9af_84762e4c808e.slice/crio-f96c934d53780d406ec0d05f3ee6f37e6ae09640edf7c92acacc4a3e6289c4d3\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod144b91a5_fb67_4975_948b_33fc4d95a6ff.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod725117cc_7ed9_47ca_aabe_1238f836ee93.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod144b91a5_fb67_4975_948b_33fc4d95a6ff.slice/crio-cc2cb7744bb249051e7a61fb858efbb758d97c89dfb22b9a309282eebe180ac5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6e18d8b_bd8c_420d_b9af_84762e4c808e.slice\": RecentStats: unable to find data in memory cache]" Dec 02 18:39:14 crc kubenswrapper[4878]: I1202 18:39:14.955975 4878 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="144b91a5-fb67-4975-948b-33fc4d95a6ff" path="/var/lib/kubelet/pods/144b91a5-fb67-4975-948b-33fc4d95a6ff/volumes" Dec 02 18:39:14 crc kubenswrapper[4878]: I1202 18:39:14.956804 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="725117cc-7ed9-47ca-aabe-1238f836ee93" path="/var/lib/kubelet/pods/725117cc-7ed9-47ca-aabe-1238f836ee93/volumes" Dec 02 18:39:14 crc kubenswrapper[4878]: I1202 18:39:14.957381 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6e18d8b-bd8c-420d-b9af-84762e4c808e" path="/var/lib/kubelet/pods/f6e18d8b-bd8c-420d-b9af-84762e4c808e/volumes" Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.422440 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890","Type":"ContainerStarted","Data":"a3bd43e44fe64e459c5dc3590d96e256e95663d73bbca44c891253665d13899c"} Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.422905 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" containerName="ceilometer-central-agent" containerID="cri-o://6288a227393f2577627b2d72f066aa0bc52342d1b27f20b6b58470222d4147e0" gracePeriod=30 Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.423210 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.423594 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" containerName="proxy-httpd" containerID="cri-o://a3bd43e44fe64e459c5dc3590d96e256e95663d73bbca44c891253665d13899c" gracePeriod=30 Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.423640 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" containerName="sg-core" containerID="cri-o://4d1bf8190f77a22f94f20700972387a57357ec1c0d42f45717ac77bed4cf3b6c" gracePeriod=30 Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.423675 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" containerName="ceilometer-notification-agent" containerID="cri-o://aeaaf1994ceb641cbca5daefd15c47728246bc54334246266707223f32c8a905" gracePeriod=30 Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.434043 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6b48666bb6-kckkf" event={"ID":"2469eb4b-9de3-45f0-bc72-0a8add16fa57","Type":"ContainerStarted","Data":"6ad7801a711888d5ee15012723c0a2cce3a65c90471945103d35dcd3131347bf"} Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.435977 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.443948 4878 generic.go:334] "Generic (PLEG): container finished" podID="7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c" containerID="455a408a46b74f146eb30a79daacd3d8f1b0cde121ab2d2608f920d4fa99b91b" exitCode=1 Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.444072 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7c97596dc4-sl4jb" event={"ID":"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c","Type":"ContainerDied","Data":"455a408a46b74f146eb30a79daacd3d8f1b0cde121ab2d2608f920d4fa99b91b"} Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.444150 4878 scope.go:117] "RemoveContainer" containerID="0e6b89b1354d050268b16c1f1e4435202a3559d97ceab21a68f8bb462e3f05a3" Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.445441 4878 scope.go:117] "RemoveContainer" containerID="455a408a46b74f146eb30a79daacd3d8f1b0cde121ab2d2608f920d4fa99b91b" Dec 02 18:39:15 crc kubenswrapper[4878]: E1202 
18:39:15.445945 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7c97596dc4-sl4jb_openstack(7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c)\"" pod="openstack/heat-api-7c97596dc4-sl4jb" podUID="7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c" Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.453143 4878 generic.go:334] "Generic (PLEG): container finished" podID="c2b1330f-94c6-4ea3-869c-02d6e52250ce" containerID="c718648b6e6d53ca2bd5fe7efc82dcc5a22fe7a68a44e9784d6978fd72286397" exitCode=1 Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.453253 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" event={"ID":"c2b1330f-94c6-4ea3-869c-02d6e52250ce","Type":"ContainerDied","Data":"c718648b6e6d53ca2bd5fe7efc82dcc5a22fe7a68a44e9784d6978fd72286397"} Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.454019 4878 scope.go:117] "RemoveContainer" containerID="c718648b6e6d53ca2bd5fe7efc82dcc5a22fe7a68a44e9784d6978fd72286397" Dec 02 18:39:15 crc kubenswrapper[4878]: E1202 18:39:15.454358 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5b86f6dd56-s4crw_openstack(c2b1330f-94c6-4ea3-869c-02d6e52250ce)\"" pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" podUID="c2b1330f-94c6-4ea3-869c-02d6e52250ce" Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.483323 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6464b89f4f-f5xbb" event={"ID":"a072f732-97bd-4297-923d-beea0ac36e2a","Type":"ContainerStarted","Data":"2e323d7b7c7f0e67c7e05473a00d49fa9b9b8415711d196da7bb1cf69bf9642e"} Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.483507 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.494935 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.863414661 podStartE2EDuration="10.494909302s" podCreationTimestamp="2025-12-02 18:39:05 +0000 UTC" firstStartedPulling="2025-12-02 18:39:07.963668236 +0000 UTC m=+1457.653287117" lastFinishedPulling="2025-12-02 18:39:13.595162877 +0000 UTC m=+1463.284781758" observedRunningTime="2025-12-02 18:39:15.477513986 +0000 UTC m=+1465.167132867" watchObservedRunningTime="2025-12-02 18:39:15.494909302 +0000 UTC m=+1465.184528183" Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.567631 4878 scope.go:117] "RemoveContainer" containerID="93d2d112436ea49243578af86a9c7147ebf56d25be5d75827fb763768c950e87" Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.585391 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6b48666bb6-kckkf" podStartSLOduration=4.585300767 podStartE2EDuration="4.585300767s" podCreationTimestamp="2025-12-02 18:39:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:39:15.56150279 +0000 UTC m=+1465.251121661" watchObservedRunningTime="2025-12-02 18:39:15.585300767 +0000 UTC m=+1465.274919658" Dec 02 18:39:15 crc kubenswrapper[4878]: I1202 18:39:15.609494 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6464b89f4f-f5xbb" podStartSLOduration=4.609475205 podStartE2EDuration="4.609475205s" podCreationTimestamp="2025-12-02 18:39:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:39:15.607199074 +0000 UTC m=+1465.296817955" watchObservedRunningTime="2025-12-02 18:39:15.609475205 +0000 UTC m=+1465.299094086" Dec 02 18:39:16 crc 
kubenswrapper[4878]: I1202 18:39:16.510464 4878 generic.go:334] "Generic (PLEG): container finished" podID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" containerID="a3bd43e44fe64e459c5dc3590d96e256e95663d73bbca44c891253665d13899c" exitCode=0 Dec 02 18:39:16 crc kubenswrapper[4878]: I1202 18:39:16.510508 4878 generic.go:334] "Generic (PLEG): container finished" podID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" containerID="4d1bf8190f77a22f94f20700972387a57357ec1c0d42f45717ac77bed4cf3b6c" exitCode=2 Dec 02 18:39:16 crc kubenswrapper[4878]: I1202 18:39:16.510517 4878 generic.go:334] "Generic (PLEG): container finished" podID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" containerID="aeaaf1994ceb641cbca5daefd15c47728246bc54334246266707223f32c8a905" exitCode=0 Dec 02 18:39:16 crc kubenswrapper[4878]: I1202 18:39:16.510566 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890","Type":"ContainerDied","Data":"a3bd43e44fe64e459c5dc3590d96e256e95663d73bbca44c891253665d13899c"} Dec 02 18:39:16 crc kubenswrapper[4878]: I1202 18:39:16.510601 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890","Type":"ContainerDied","Data":"4d1bf8190f77a22f94f20700972387a57357ec1c0d42f45717ac77bed4cf3b6c"} Dec 02 18:39:16 crc kubenswrapper[4878]: I1202 18:39:16.510615 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890","Type":"ContainerDied","Data":"aeaaf1994ceb641cbca5daefd15c47728246bc54334246266707223f32c8a905"} Dec 02 18:39:16 crc kubenswrapper[4878]: I1202 18:39:16.514042 4878 scope.go:117] "RemoveContainer" containerID="455a408a46b74f146eb30a79daacd3d8f1b0cde121ab2d2608f920d4fa99b91b" Dec 02 18:39:16 crc kubenswrapper[4878]: E1202 18:39:16.514526 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with 
CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7c97596dc4-sl4jb_openstack(7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c)\"" pod="openstack/heat-api-7c97596dc4-sl4jb" podUID="7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c" Dec 02 18:39:16 crc kubenswrapper[4878]: I1202 18:39:16.516276 4878 scope.go:117] "RemoveContainer" containerID="c718648b6e6d53ca2bd5fe7efc82dcc5a22fe7a68a44e9784d6978fd72286397" Dec 02 18:39:16 crc kubenswrapper[4878]: E1202 18:39:16.516592 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5b86f6dd56-s4crw_openstack(c2b1330f-94c6-4ea3-869c-02d6e52250ce)\"" pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" podUID="c2b1330f-94c6-4ea3-869c-02d6e52250ce" Dec 02 18:39:17 crc kubenswrapper[4878]: I1202 18:39:17.361103 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-489zf" Dec 02 18:39:17 crc kubenswrapper[4878]: I1202 18:39:17.365878 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-489zf" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.044162 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.213634 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-sg-core-conf-yaml\") pod \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.214002 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-run-httpd\") pod \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.214049 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-log-httpd\") pod \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.214118 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-combined-ca-bundle\") pod \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.214338 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-config-data\") pod \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.214388 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-scripts\") pod \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.214501 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf499\" (UniqueName: \"kubernetes.io/projected/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-kube-api-access-tf499\") pod \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\" (UID: \"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890\") " Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.214563 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" (UID: "50bea8af-8ebe-4e0b-9e34-45f8f4dcc890"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.214825 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" (UID: "50bea8af-8ebe-4e0b-9e34-45f8f4dcc890"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.215676 4878 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.215699 4878 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.224911 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-scripts" (OuterVolumeSpecName: "scripts") pod "50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" (UID: "50bea8af-8ebe-4e0b-9e34-45f8f4dcc890"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.236047 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-kube-api-access-tf499" (OuterVolumeSpecName: "kube-api-access-tf499") pod "50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" (UID: "50bea8af-8ebe-4e0b-9e34-45f8f4dcc890"). InnerVolumeSpecName "kube-api-access-tf499". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.266327 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" (UID: "50bea8af-8ebe-4e0b-9e34-45f8f4dcc890"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.318881 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.318917 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tf499\" (UniqueName: \"kubernetes.io/projected/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-kube-api-access-tf499\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.318931 4878 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.329091 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" (UID: "50bea8af-8ebe-4e0b-9e34-45f8f4dcc890"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.385902 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-config-data" (OuterVolumeSpecName: "config-data") pod "50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" (UID: "50bea8af-8ebe-4e0b-9e34-45f8f4dcc890"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.421652 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.421691 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.433961 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-489zf" podUID="158fd016-79a4-4bf1-90b3-20a4eae3d1d6" containerName="registry-server" probeResult="failure" output=< Dec 02 18:39:18 crc kubenswrapper[4878]: timeout: failed to connect service ":50051" within 1s Dec 02 18:39:18 crc kubenswrapper[4878]: > Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.544692 4878 generic.go:334] "Generic (PLEG): container finished" podID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" containerID="6288a227393f2577627b2d72f066aa0bc52342d1b27f20b6b58470222d4147e0" exitCode=0 Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.544767 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890","Type":"ContainerDied","Data":"6288a227393f2577627b2d72f066aa0bc52342d1b27f20b6b58470222d4147e0"} Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.544797 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.545205 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50bea8af-8ebe-4e0b-9e34-45f8f4dcc890","Type":"ContainerDied","Data":"74bda4fcf8ab45107deb35271521d969a3c55bd581b091655ec7b59253f2e9b6"} Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.545258 4878 scope.go:117] "RemoveContainer" containerID="a3bd43e44fe64e459c5dc3590d96e256e95663d73bbca44c891253665d13899c" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.574410 4878 scope.go:117] "RemoveContainer" containerID="4d1bf8190f77a22f94f20700972387a57357ec1c0d42f45717ac77bed4cf3b6c" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.638296 4878 scope.go:117] "RemoveContainer" containerID="aeaaf1994ceb641cbca5daefd15c47728246bc54334246266707223f32c8a905" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.647251 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.658926 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.660696 4878 scope.go:117] "RemoveContainer" containerID="6288a227393f2577627b2d72f066aa0bc52342d1b27f20b6b58470222d4147e0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.681137 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:39:18 crc kubenswrapper[4878]: E1202 18:39:18.682260 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6e18d8b-bd8c-420d-b9af-84762e4c808e" containerName="dnsmasq-dns" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.682281 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6e18d8b-bd8c-420d-b9af-84762e4c808e" containerName="dnsmasq-dns" Dec 02 18:39:18 crc kubenswrapper[4878]: E1202 18:39:18.682298 4878 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f6e18d8b-bd8c-420d-b9af-84762e4c808e" containerName="init" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.682344 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6e18d8b-bd8c-420d-b9af-84762e4c808e" containerName="init" Dec 02 18:39:18 crc kubenswrapper[4878]: E1202 18:39:18.682367 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" containerName="proxy-httpd" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.682374 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" containerName="proxy-httpd" Dec 02 18:39:18 crc kubenswrapper[4878]: E1202 18:39:18.682386 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" containerName="ceilometer-central-agent" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.682392 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" containerName="ceilometer-central-agent" Dec 02 18:39:18 crc kubenswrapper[4878]: E1202 18:39:18.682425 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" containerName="sg-core" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.682431 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" containerName="sg-core" Dec 02 18:39:18 crc kubenswrapper[4878]: E1202 18:39:18.682444 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" containerName="ceilometer-notification-agent" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.682450 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" containerName="ceilometer-notification-agent" Dec 02 18:39:18 crc kubenswrapper[4878]: E1202 18:39:18.682457 4878 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="144b91a5-fb67-4975-948b-33fc4d95a6ff" containerName="heat-cfnapi" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.682463 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="144b91a5-fb67-4975-948b-33fc4d95a6ff" containerName="heat-cfnapi" Dec 02 18:39:18 crc kubenswrapper[4878]: E1202 18:39:18.682487 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="725117cc-7ed9-47ca-aabe-1238f836ee93" containerName="heat-api" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.682493 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="725117cc-7ed9-47ca-aabe-1238f836ee93" containerName="heat-api" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.682766 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" containerName="sg-core" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.682802 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="725117cc-7ed9-47ca-aabe-1238f836ee93" containerName="heat-api" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.682818 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" containerName="ceilometer-central-agent" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.682831 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" containerName="ceilometer-notification-agent" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.682842 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6e18d8b-bd8c-420d-b9af-84762e4c808e" containerName="dnsmasq-dns" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.682852 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="144b91a5-fb67-4975-948b-33fc4d95a6ff" containerName="heat-cfnapi" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.682878 4878 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" containerName="proxy-httpd" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.686080 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.686159 4878 scope.go:117] "RemoveContainer" containerID="a3bd43e44fe64e459c5dc3590d96e256e95663d73bbca44c891253665d13899c" Dec 02 18:39:18 crc kubenswrapper[4878]: E1202 18:39:18.687046 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3bd43e44fe64e459c5dc3590d96e256e95663d73bbca44c891253665d13899c\": container with ID starting with a3bd43e44fe64e459c5dc3590d96e256e95663d73bbca44c891253665d13899c not found: ID does not exist" containerID="a3bd43e44fe64e459c5dc3590d96e256e95663d73bbca44c891253665d13899c" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.687710 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3bd43e44fe64e459c5dc3590d96e256e95663d73bbca44c891253665d13899c"} err="failed to get container status \"a3bd43e44fe64e459c5dc3590d96e256e95663d73bbca44c891253665d13899c\": rpc error: code = NotFound desc = could not find container \"a3bd43e44fe64e459c5dc3590d96e256e95663d73bbca44c891253665d13899c\": container with ID starting with a3bd43e44fe64e459c5dc3590d96e256e95663d73bbca44c891253665d13899c not found: ID does not exist" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.687761 4878 scope.go:117] "RemoveContainer" containerID="4d1bf8190f77a22f94f20700972387a57357ec1c0d42f45717ac77bed4cf3b6c" Dec 02 18:39:18 crc kubenswrapper[4878]: E1202 18:39:18.690270 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d1bf8190f77a22f94f20700972387a57357ec1c0d42f45717ac77bed4cf3b6c\": container with ID starting with 
4d1bf8190f77a22f94f20700972387a57357ec1c0d42f45717ac77bed4cf3b6c not found: ID does not exist" containerID="4d1bf8190f77a22f94f20700972387a57357ec1c0d42f45717ac77bed4cf3b6c" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.690326 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d1bf8190f77a22f94f20700972387a57357ec1c0d42f45717ac77bed4cf3b6c"} err="failed to get container status \"4d1bf8190f77a22f94f20700972387a57357ec1c0d42f45717ac77bed4cf3b6c\": rpc error: code = NotFound desc = could not find container \"4d1bf8190f77a22f94f20700972387a57357ec1c0d42f45717ac77bed4cf3b6c\": container with ID starting with 4d1bf8190f77a22f94f20700972387a57357ec1c0d42f45717ac77bed4cf3b6c not found: ID does not exist" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.690365 4878 scope.go:117] "RemoveContainer" containerID="aeaaf1994ceb641cbca5daefd15c47728246bc54334246266707223f32c8a905" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.690495 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 18:39:18 crc kubenswrapper[4878]: E1202 18:39:18.690653 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeaaf1994ceb641cbca5daefd15c47728246bc54334246266707223f32c8a905\": container with ID starting with aeaaf1994ceb641cbca5daefd15c47728246bc54334246266707223f32c8a905 not found: ID does not exist" containerID="aeaaf1994ceb641cbca5daefd15c47728246bc54334246266707223f32c8a905" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.690680 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeaaf1994ceb641cbca5daefd15c47728246bc54334246266707223f32c8a905"} err="failed to get container status \"aeaaf1994ceb641cbca5daefd15c47728246bc54334246266707223f32c8a905\": rpc error: code = NotFound desc = could not find container 
\"aeaaf1994ceb641cbca5daefd15c47728246bc54334246266707223f32c8a905\": container with ID starting with aeaaf1994ceb641cbca5daefd15c47728246bc54334246266707223f32c8a905 not found: ID does not exist" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.690699 4878 scope.go:117] "RemoveContainer" containerID="6288a227393f2577627b2d72f066aa0bc52342d1b27f20b6b58470222d4147e0" Dec 02 18:39:18 crc kubenswrapper[4878]: E1202 18:39:18.690928 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6288a227393f2577627b2d72f066aa0bc52342d1b27f20b6b58470222d4147e0\": container with ID starting with 6288a227393f2577627b2d72f066aa0bc52342d1b27f20b6b58470222d4147e0 not found: ID does not exist" containerID="6288a227393f2577627b2d72f066aa0bc52342d1b27f20b6b58470222d4147e0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.690961 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6288a227393f2577627b2d72f066aa0bc52342d1b27f20b6b58470222d4147e0"} err="failed to get container status \"6288a227393f2577627b2d72f066aa0bc52342d1b27f20b6b58470222d4147e0\": rpc error: code = NotFound desc = could not find container \"6288a227393f2577627b2d72f066aa0bc52342d1b27f20b6b58470222d4147e0\": container with ID starting with 6288a227393f2577627b2d72f066aa0bc52342d1b27f20b6b58470222d4147e0 not found: ID does not exist" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.694918 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.703396 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.830921 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/58d7b8ef-c077-4e32-9345-3b402def9fce-log-httpd\") pod \"ceilometer-0\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.830972 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.831100 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5vzs\" (UniqueName: \"kubernetes.io/projected/58d7b8ef-c077-4e32-9345-3b402def9fce-kube-api-access-h5vzs\") pod \"ceilometer-0\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.832368 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-config-data\") pod \"ceilometer-0\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.832572 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d7b8ef-c077-4e32-9345-3b402def9fce-run-httpd\") pod \"ceilometer-0\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.832607 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.832653 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-scripts\") pod \"ceilometer-0\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.934877 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-config-data\") pod \"ceilometer-0\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.934990 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d7b8ef-c077-4e32-9345-3b402def9fce-run-httpd\") pod \"ceilometer-0\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.935016 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.935049 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-scripts\") pod \"ceilometer-0\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.935072 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/58d7b8ef-c077-4e32-9345-3b402def9fce-log-httpd\") pod \"ceilometer-0\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.935089 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.935150 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5vzs\" (UniqueName: \"kubernetes.io/projected/58d7b8ef-c077-4e32-9345-3b402def9fce-kube-api-access-h5vzs\") pod \"ceilometer-0\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.936009 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d7b8ef-c077-4e32-9345-3b402def9fce-log-httpd\") pod \"ceilometer-0\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.936048 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d7b8ef-c077-4e32-9345-3b402def9fce-run-httpd\") pod \"ceilometer-0\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.940946 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 
18:39:18.941806 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.942838 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-config-data\") pod \"ceilometer-0\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.944820 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-scripts\") pod \"ceilometer-0\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.956345 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50bea8af-8ebe-4e0b-9e34-45f8f4dcc890" path="/var/lib/kubelet/pods/50bea8af-8ebe-4e0b-9e34-45f8f4dcc890/volumes" Dec 02 18:39:18 crc kubenswrapper[4878]: I1202 18:39:18.960161 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5vzs\" (UniqueName: \"kubernetes.io/projected/58d7b8ef-c077-4e32-9345-3b402def9fce-kube-api-access-h5vzs\") pod \"ceilometer-0\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " pod="openstack/ceilometer-0" Dec 02 18:39:19 crc kubenswrapper[4878]: I1202 18:39:19.046331 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:39:19 crc kubenswrapper[4878]: I1202 18:39:19.307938 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" Dec 02 18:39:19 crc kubenswrapper[4878]: I1202 18:39:19.308406 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" Dec 02 18:39:19 crc kubenswrapper[4878]: I1202 18:39:19.309606 4878 scope.go:117] "RemoveContainer" containerID="c718648b6e6d53ca2bd5fe7efc82dcc5a22fe7a68a44e9784d6978fd72286397" Dec 02 18:39:19 crc kubenswrapper[4878]: E1202 18:39:19.310033 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5b86f6dd56-s4crw_openstack(c2b1330f-94c6-4ea3-869c-02d6e52250ce)\"" pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" podUID="c2b1330f-94c6-4ea3-869c-02d6e52250ce" Dec 02 18:39:19 crc kubenswrapper[4878]: I1202 18:39:19.321721 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7c97596dc4-sl4jb" Dec 02 18:39:19 crc kubenswrapper[4878]: I1202 18:39:19.322902 4878 scope.go:117] "RemoveContainer" containerID="455a408a46b74f146eb30a79daacd3d8f1b0cde121ab2d2608f920d4fa99b91b" Dec 02 18:39:19 crc kubenswrapper[4878]: E1202 18:39:19.323224 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7c97596dc4-sl4jb_openstack(7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c)\"" pod="openstack/heat-api-7c97596dc4-sl4jb" podUID="7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c" Dec 02 18:39:19 crc kubenswrapper[4878]: I1202 18:39:19.325358 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-7c97596dc4-sl4jb" Dec 02 18:39:19 crc 
kubenswrapper[4878]: I1202 18:39:19.557761 4878 scope.go:117] "RemoveContainer" containerID="455a408a46b74f146eb30a79daacd3d8f1b0cde121ab2d2608f920d4fa99b91b" Dec 02 18:39:19 crc kubenswrapper[4878]: E1202 18:39:19.558143 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7c97596dc4-sl4jb_openstack(7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c)\"" pod="openstack/heat-api-7c97596dc4-sl4jb" podUID="7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c" Dec 02 18:39:19 crc kubenswrapper[4878]: I1202 18:39:19.596660 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.406969 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-jfrhp"] Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.409005 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jfrhp" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.451437 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-jfrhp"] Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.528578 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-64fc868fd9-8kp5j" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.534852 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0faa-account-create-update-tdxld"] Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.536981 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0faa-account-create-update-tdxld" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.541484 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.573724 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0faa-account-create-update-tdxld"] Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.591607 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2fcv\" (UniqueName: \"kubernetes.io/projected/ebe98b91-cced-4d28-b2cb-9e19d827a817-kube-api-access-r2fcv\") pod \"nova-api-db-create-jfrhp\" (UID: \"ebe98b91-cced-4d28-b2cb-9e19d827a817\") " pod="openstack/nova-api-db-create-jfrhp" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.591680 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebe98b91-cced-4d28-b2cb-9e19d827a817-operator-scripts\") pod \"nova-api-db-create-jfrhp\" (UID: \"ebe98b91-cced-4d28-b2cb-9e19d827a817\") " pod="openstack/nova-api-db-create-jfrhp" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.599484 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d7b8ef-c077-4e32-9345-3b402def9fce","Type":"ContainerStarted","Data":"bb0b37d40e8358eb50cadee67443b1c388d1beea2cfc1cebe06e4ee4a8786429"} Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.600745 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-49dbg"] Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.613008 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-49dbg" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.648066 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-49dbg"] Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.698200 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51a22bf8-3e2b-4bd7-bbda-2cf7301065f4-operator-scripts\") pod \"nova-cell0-db-create-49dbg\" (UID: \"51a22bf8-3e2b-4bd7-bbda-2cf7301065f4\") " pod="openstack/nova-cell0-db-create-49dbg" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.698577 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de5fe1e3-5ab4-40e9-a902-f1b444bd005f-operator-scripts\") pod \"nova-api-0faa-account-create-update-tdxld\" (UID: \"de5fe1e3-5ab4-40e9-a902-f1b444bd005f\") " pod="openstack/nova-api-0faa-account-create-update-tdxld" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.698744 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn29r\" (UniqueName: \"kubernetes.io/projected/de5fe1e3-5ab4-40e9-a902-f1b444bd005f-kube-api-access-zn29r\") pod \"nova-api-0faa-account-create-update-tdxld\" (UID: \"de5fe1e3-5ab4-40e9-a902-f1b444bd005f\") " pod="openstack/nova-api-0faa-account-create-update-tdxld" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.698824 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqngw\" (UniqueName: \"kubernetes.io/projected/51a22bf8-3e2b-4bd7-bbda-2cf7301065f4-kube-api-access-hqngw\") pod \"nova-cell0-db-create-49dbg\" (UID: \"51a22bf8-3e2b-4bd7-bbda-2cf7301065f4\") " pod="openstack/nova-cell0-db-create-49dbg" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 
18:39:20.698870 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2fcv\" (UniqueName: \"kubernetes.io/projected/ebe98b91-cced-4d28-b2cb-9e19d827a817-kube-api-access-r2fcv\") pod \"nova-api-db-create-jfrhp\" (UID: \"ebe98b91-cced-4d28-b2cb-9e19d827a817\") " pod="openstack/nova-api-db-create-jfrhp" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.698909 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebe98b91-cced-4d28-b2cb-9e19d827a817-operator-scripts\") pod \"nova-api-db-create-jfrhp\" (UID: \"ebe98b91-cced-4d28-b2cb-9e19d827a817\") " pod="openstack/nova-api-db-create-jfrhp" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.704416 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebe98b91-cced-4d28-b2cb-9e19d827a817-operator-scripts\") pod \"nova-api-db-create-jfrhp\" (UID: \"ebe98b91-cced-4d28-b2cb-9e19d827a817\") " pod="openstack/nova-api-db-create-jfrhp" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.734789 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2fcv\" (UniqueName: \"kubernetes.io/projected/ebe98b91-cced-4d28-b2cb-9e19d827a817-kube-api-access-r2fcv\") pod \"nova-api-db-create-jfrhp\" (UID: \"ebe98b91-cced-4d28-b2cb-9e19d827a817\") " pod="openstack/nova-api-db-create-jfrhp" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.736564 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jfrhp" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.767718 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-5vk6v"] Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.772229 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-5vk6v" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.791316 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5vk6v"] Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.801856 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51a22bf8-3e2b-4bd7-bbda-2cf7301065f4-operator-scripts\") pod \"nova-cell0-db-create-49dbg\" (UID: \"51a22bf8-3e2b-4bd7-bbda-2cf7301065f4\") " pod="openstack/nova-cell0-db-create-49dbg" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.801928 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtx6h\" (UniqueName: \"kubernetes.io/projected/b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b-kube-api-access-jtx6h\") pod \"nova-cell1-db-create-5vk6v\" (UID: \"b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b\") " pod="openstack/nova-cell1-db-create-5vk6v" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.802023 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b-operator-scripts\") pod \"nova-cell1-db-create-5vk6v\" (UID: \"b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b\") " pod="openstack/nova-cell1-db-create-5vk6v" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.802079 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de5fe1e3-5ab4-40e9-a902-f1b444bd005f-operator-scripts\") pod \"nova-api-0faa-account-create-update-tdxld\" (UID: \"de5fe1e3-5ab4-40e9-a902-f1b444bd005f\") " pod="openstack/nova-api-0faa-account-create-update-tdxld" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.802197 4878 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-zn29r\" (UniqueName: \"kubernetes.io/projected/de5fe1e3-5ab4-40e9-a902-f1b444bd005f-kube-api-access-zn29r\") pod \"nova-api-0faa-account-create-update-tdxld\" (UID: \"de5fe1e3-5ab4-40e9-a902-f1b444bd005f\") " pod="openstack/nova-api-0faa-account-create-update-tdxld" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.802277 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqngw\" (UniqueName: \"kubernetes.io/projected/51a22bf8-3e2b-4bd7-bbda-2cf7301065f4-kube-api-access-hqngw\") pod \"nova-cell0-db-create-49dbg\" (UID: \"51a22bf8-3e2b-4bd7-bbda-2cf7301065f4\") " pod="openstack/nova-cell0-db-create-49dbg" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.803501 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de5fe1e3-5ab4-40e9-a902-f1b444bd005f-operator-scripts\") pod \"nova-api-0faa-account-create-update-tdxld\" (UID: \"de5fe1e3-5ab4-40e9-a902-f1b444bd005f\") " pod="openstack/nova-api-0faa-account-create-update-tdxld" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.803823 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51a22bf8-3e2b-4bd7-bbda-2cf7301065f4-operator-scripts\") pod \"nova-cell0-db-create-49dbg\" (UID: \"51a22bf8-3e2b-4bd7-bbda-2cf7301065f4\") " pod="openstack/nova-cell0-db-create-49dbg" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.818090 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-9c84-account-create-update-jft6n"] Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.820370 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9c84-account-create-update-jft6n" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.829572 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.834047 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn29r\" (UniqueName: \"kubernetes.io/projected/de5fe1e3-5ab4-40e9-a902-f1b444bd005f-kube-api-access-zn29r\") pod \"nova-api-0faa-account-create-update-tdxld\" (UID: \"de5fe1e3-5ab4-40e9-a902-f1b444bd005f\") " pod="openstack/nova-api-0faa-account-create-update-tdxld" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.838680 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqngw\" (UniqueName: \"kubernetes.io/projected/51a22bf8-3e2b-4bd7-bbda-2cf7301065f4-kube-api-access-hqngw\") pod \"nova-cell0-db-create-49dbg\" (UID: \"51a22bf8-3e2b-4bd7-bbda-2cf7301065f4\") " pod="openstack/nova-cell0-db-create-49dbg" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.857745 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9c84-account-create-update-jft6n"] Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.895667 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0faa-account-create-update-tdxld" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.905760 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b863f92e-dc47-4a8e-b9ae-31baafb9ec79-operator-scripts\") pod \"nova-cell0-9c84-account-create-update-jft6n\" (UID: \"b863f92e-dc47-4a8e-b9ae-31baafb9ec79\") " pod="openstack/nova-cell0-9c84-account-create-update-jft6n" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.905877 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtx6h\" (UniqueName: \"kubernetes.io/projected/b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b-kube-api-access-jtx6h\") pod \"nova-cell1-db-create-5vk6v\" (UID: \"b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b\") " pod="openstack/nova-cell1-db-create-5vk6v" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.905969 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b-operator-scripts\") pod \"nova-cell1-db-create-5vk6v\" (UID: \"b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b\") " pod="openstack/nova-cell1-db-create-5vk6v" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.906093 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fprr\" (UniqueName: \"kubernetes.io/projected/b863f92e-dc47-4a8e-b9ae-31baafb9ec79-kube-api-access-9fprr\") pod \"nova-cell0-9c84-account-create-update-jft6n\" (UID: \"b863f92e-dc47-4a8e-b9ae-31baafb9ec79\") " pod="openstack/nova-cell0-9c84-account-create-update-jft6n" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.916614 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b-operator-scripts\") pod \"nova-cell1-db-create-5vk6v\" (UID: \"b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b\") " pod="openstack/nova-cell1-db-create-5vk6v" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.936098 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtx6h\" (UniqueName: \"kubernetes.io/projected/b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b-kube-api-access-jtx6h\") pod \"nova-cell1-db-create-5vk6v\" (UID: \"b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b\") " pod="openstack/nova-cell1-db-create-5vk6v" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.976115 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-49dbg" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.981988 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-d658-account-create-update-lrvbf"] Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.988547 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d658-account-create-update-lrvbf" Dec 02 18:39:20 crc kubenswrapper[4878]: I1202 18:39:20.994663 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 02 18:39:21 crc kubenswrapper[4878]: I1202 18:39:21.015515 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d658-account-create-update-lrvbf"] Dec 02 18:39:21 crc kubenswrapper[4878]: I1202 18:39:21.017922 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b863f92e-dc47-4a8e-b9ae-31baafb9ec79-operator-scripts\") pod \"nova-cell0-9c84-account-create-update-jft6n\" (UID: \"b863f92e-dc47-4a8e-b9ae-31baafb9ec79\") " pod="openstack/nova-cell0-9c84-account-create-update-jft6n" Dec 02 18:39:21 crc kubenswrapper[4878]: I1202 18:39:21.020920 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fprr\" (UniqueName: \"kubernetes.io/projected/b863f92e-dc47-4a8e-b9ae-31baafb9ec79-kube-api-access-9fprr\") pod \"nova-cell0-9c84-account-create-update-jft6n\" (UID: \"b863f92e-dc47-4a8e-b9ae-31baafb9ec79\") " pod="openstack/nova-cell0-9c84-account-create-update-jft6n" Dec 02 18:39:21 crc kubenswrapper[4878]: I1202 18:39:21.021750 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b863f92e-dc47-4a8e-b9ae-31baafb9ec79-operator-scripts\") pod \"nova-cell0-9c84-account-create-update-jft6n\" (UID: \"b863f92e-dc47-4a8e-b9ae-31baafb9ec79\") " pod="openstack/nova-cell0-9c84-account-create-update-jft6n" Dec 02 18:39:21 crc kubenswrapper[4878]: I1202 18:39:21.050384 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-5vk6v" Dec 02 18:39:21 crc kubenswrapper[4878]: I1202 18:39:21.124091 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8hf5\" (UniqueName: \"kubernetes.io/projected/0fae1bb0-8b82-44a5-871d-45252562e8a7-kube-api-access-d8hf5\") pod \"nova-cell1-d658-account-create-update-lrvbf\" (UID: \"0fae1bb0-8b82-44a5-871d-45252562e8a7\") " pod="openstack/nova-cell1-d658-account-create-update-lrvbf" Dec 02 18:39:21 crc kubenswrapper[4878]: I1202 18:39:21.130101 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fae1bb0-8b82-44a5-871d-45252562e8a7-operator-scripts\") pod \"nova-cell1-d658-account-create-update-lrvbf\" (UID: \"0fae1bb0-8b82-44a5-871d-45252562e8a7\") " pod="openstack/nova-cell1-d658-account-create-update-lrvbf" Dec 02 18:39:21 crc kubenswrapper[4878]: I1202 18:39:21.131627 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fprr\" (UniqueName: \"kubernetes.io/projected/b863f92e-dc47-4a8e-b9ae-31baafb9ec79-kube-api-access-9fprr\") pod \"nova-cell0-9c84-account-create-update-jft6n\" (UID: \"b863f92e-dc47-4a8e-b9ae-31baafb9ec79\") " pod="openstack/nova-cell0-9c84-account-create-update-jft6n" Dec 02 18:39:21 crc kubenswrapper[4878]: I1202 18:39:21.175735 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9c84-account-create-update-jft6n" Dec 02 18:39:21 crc kubenswrapper[4878]: I1202 18:39:21.314115 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8hf5\" (UniqueName: \"kubernetes.io/projected/0fae1bb0-8b82-44a5-871d-45252562e8a7-kube-api-access-d8hf5\") pod \"nova-cell1-d658-account-create-update-lrvbf\" (UID: \"0fae1bb0-8b82-44a5-871d-45252562e8a7\") " pod="openstack/nova-cell1-d658-account-create-update-lrvbf" Dec 02 18:39:21 crc kubenswrapper[4878]: I1202 18:39:21.314537 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fae1bb0-8b82-44a5-871d-45252562e8a7-operator-scripts\") pod \"nova-cell1-d658-account-create-update-lrvbf\" (UID: \"0fae1bb0-8b82-44a5-871d-45252562e8a7\") " pod="openstack/nova-cell1-d658-account-create-update-lrvbf" Dec 02 18:39:21 crc kubenswrapper[4878]: I1202 18:39:21.319402 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fae1bb0-8b82-44a5-871d-45252562e8a7-operator-scripts\") pod \"nova-cell1-d658-account-create-update-lrvbf\" (UID: \"0fae1bb0-8b82-44a5-871d-45252562e8a7\") " pod="openstack/nova-cell1-d658-account-create-update-lrvbf" Dec 02 18:39:21 crc kubenswrapper[4878]: I1202 18:39:21.367575 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8hf5\" (UniqueName: \"kubernetes.io/projected/0fae1bb0-8b82-44a5-871d-45252562e8a7-kube-api-access-d8hf5\") pod \"nova-cell1-d658-account-create-update-lrvbf\" (UID: \"0fae1bb0-8b82-44a5-871d-45252562e8a7\") " pod="openstack/nova-cell1-d658-account-create-update-lrvbf" Dec 02 18:39:21 crc kubenswrapper[4878]: I1202 18:39:21.610905 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d658-account-create-update-lrvbf" Dec 02 18:39:21 crc kubenswrapper[4878]: I1202 18:39:21.676670 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d7b8ef-c077-4e32-9345-3b402def9fce","Type":"ContainerStarted","Data":"dbfd6fcb4461b04362a2ccb1c131969624cad4e3fb350c3e497f97f80ead7749"} Dec 02 18:39:21 crc kubenswrapper[4878]: I1202 18:39:21.778152 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-jfrhp"] Dec 02 18:39:22 crc kubenswrapper[4878]: I1202 18:39:22.023607 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0faa-account-create-update-tdxld"] Dec 02 18:39:22 crc kubenswrapper[4878]: I1202 18:39:22.064513 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5vk6v"] Dec 02 18:39:22 crc kubenswrapper[4878]: I1202 18:39:22.253016 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-49dbg"] Dec 02 18:39:22 crc kubenswrapper[4878]: I1202 18:39:22.397262 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9c84-account-create-update-jft6n"] Dec 02 18:39:22 crc kubenswrapper[4878]: I1202 18:39:22.608423 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d658-account-create-update-lrvbf"] Dec 02 18:39:22 crc kubenswrapper[4878]: I1202 18:39:22.701390 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d658-account-create-update-lrvbf" event={"ID":"0fae1bb0-8b82-44a5-871d-45252562e8a7","Type":"ContainerStarted","Data":"04038dc62e16aa20b9394bac821d8d81e9ae5eac6c2580253884914aced8ceeb"} Dec 02 18:39:22 crc kubenswrapper[4878]: I1202 18:39:22.705711 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0faa-account-create-update-tdxld" 
event={"ID":"de5fe1e3-5ab4-40e9-a902-f1b444bd005f","Type":"ContainerStarted","Data":"8bda8446d680947e9b42c3ea59423ab562bb02282b0dcbe2843ce4b1e21c1933"} Dec 02 18:39:22 crc kubenswrapper[4878]: I1202 18:39:22.711341 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jfrhp" event={"ID":"ebe98b91-cced-4d28-b2cb-9e19d827a817","Type":"ContainerStarted","Data":"12efed16e60a9b04a5923054741b7dbf61a6d4ff4cd17d94560863667f113b69"} Dec 02 18:39:22 crc kubenswrapper[4878]: I1202 18:39:22.725046 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5vk6v" event={"ID":"b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b","Type":"ContainerStarted","Data":"cbf268c1bad30d1042f263fc83d37c17103b2a5dce3b74e1f46d6044274aa11a"} Dec 02 18:39:22 crc kubenswrapper[4878]: I1202 18:39:22.725095 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5vk6v" event={"ID":"b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b","Type":"ContainerStarted","Data":"626b98104382ee165994b905b0ea1fca802375a21f45eb5487255297df69e3c5"} Dec 02 18:39:22 crc kubenswrapper[4878]: I1202 18:39:22.746415 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-49dbg" event={"ID":"51a22bf8-3e2b-4bd7-bbda-2cf7301065f4","Type":"ContainerStarted","Data":"d1091808de4bec662b5993db3b85d70c4b744667b3459b4c98abbc7a8fa551c2"} Dec 02 18:39:22 crc kubenswrapper[4878]: I1202 18:39:22.755412 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-jfrhp" podStartSLOduration=2.7553878149999997 podStartE2EDuration="2.755387815s" podCreationTimestamp="2025-12-02 18:39:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:39:22.736818472 +0000 UTC m=+1472.426437353" watchObservedRunningTime="2025-12-02 18:39:22.755387815 +0000 UTC m=+1472.445006696" Dec 02 18:39:22 crc 
kubenswrapper[4878]: I1202 18:39:22.772345 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d7b8ef-c077-4e32-9345-3b402def9fce","Type":"ContainerStarted","Data":"c032305040d4a4f71922dfdd6e4e49068a6713b5664bd177a5292ef58c36f944"}
Dec 02 18:39:22 crc kubenswrapper[4878]: I1202 18:39:22.792708 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9c84-account-create-update-jft6n" event={"ID":"b863f92e-dc47-4a8e-b9ae-31baafb9ec79","Type":"ContainerStarted","Data":"f00039848379fa7b3ffa1db5465d6f1f20d9188d1ca1bfff9eb98224ceb37b4b"}
Dec 02 18:39:22 crc kubenswrapper[4878]: I1202 18:39:22.819457 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-5vk6v" podStartSLOduration=2.8194308230000003 podStartE2EDuration="2.819430823s" podCreationTimestamp="2025-12-02 18:39:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:39:22.764691466 +0000 UTC m=+1472.454310347" watchObservedRunningTime="2025-12-02 18:39:22.819430823 +0000 UTC m=+1472.509049704"
Dec 02 18:39:23 crc kubenswrapper[4878]: I1202 18:39:23.744656 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 18:39:23 crc kubenswrapper[4878]: I1202 18:39:23.746279 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 18:39:23 crc kubenswrapper[4878]: I1202 18:39:23.746423 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg"
Dec 02 18:39:23 crc kubenswrapper[4878]: I1202 18:39:23.747575 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f262fb1e8290073f98cb5506d7d41d0ed3eb91d64ad36acc8496dd8b9fa35544"} pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 18:39:23 crc kubenswrapper[4878]: I1202 18:39:23.747721 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://f262fb1e8290073f98cb5506d7d41d0ed3eb91d64ad36acc8496dd8b9fa35544" gracePeriod=600
Dec 02 18:39:23 crc kubenswrapper[4878]: I1202 18:39:23.909585 4878 generic.go:334] "Generic (PLEG): container finished" podID="b863f92e-dc47-4a8e-b9ae-31baafb9ec79" containerID="c902a8982e64a4d62748c1089097106615b250f89ab1e922cf98bb99f514e141" exitCode=0
Dec 02 18:39:23 crc kubenswrapper[4878]: I1202 18:39:23.909733 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9c84-account-create-update-jft6n" event={"ID":"b863f92e-dc47-4a8e-b9ae-31baafb9ec79","Type":"ContainerDied","Data":"c902a8982e64a4d62748c1089097106615b250f89ab1e922cf98bb99f514e141"}
Dec 02 18:39:23 crc kubenswrapper[4878]: I1202 18:39:23.929284 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d658-account-create-update-lrvbf" event={"ID":"0fae1bb0-8b82-44a5-871d-45252562e8a7","Type":"ContainerStarted","Data":"3ad46b90815ff249895d7a99e9f1cc105354805adaab707505502d3c335d3037"}
Dec 02 18:39:23 crc kubenswrapper[4878]: I1202 18:39:23.950068 4878 generic.go:334] "Generic (PLEG): container finished" podID="de5fe1e3-5ab4-40e9-a902-f1b444bd005f" containerID="39af2f3b2e4cb81d0db802b1c60b2ba35f1c3aa0de46d79548564efc545cdb50" exitCode=0
Dec 02 18:39:23 crc kubenswrapper[4878]: I1202 18:39:23.950554 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0faa-account-create-update-tdxld" event={"ID":"de5fe1e3-5ab4-40e9-a902-f1b444bd005f","Type":"ContainerDied","Data":"39af2f3b2e4cb81d0db802b1c60b2ba35f1c3aa0de46d79548564efc545cdb50"}
Dec 02 18:39:23 crc kubenswrapper[4878]: I1202 18:39:23.985407 4878 generic.go:334] "Generic (PLEG): container finished" podID="ebe98b91-cced-4d28-b2cb-9e19d827a817" containerID="73146bbc2ac059d567cfd3a3597b850aaae09a42e1da3f35e704ff5aafc41fb2" exitCode=0
Dec 02 18:39:23 crc kubenswrapper[4878]: I1202 18:39:23.985502 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jfrhp" event={"ID":"ebe98b91-cced-4d28-b2cb-9e19d827a817","Type":"ContainerDied","Data":"73146bbc2ac059d567cfd3a3597b850aaae09a42e1da3f35e704ff5aafc41fb2"}
Dec 02 18:39:24 crc kubenswrapper[4878]: I1202 18:39:24.010544 4878 generic.go:334] "Generic (PLEG): container finished" podID="b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b" containerID="cbf268c1bad30d1042f263fc83d37c17103b2a5dce3b74e1f46d6044274aa11a" exitCode=0
Dec 02 18:39:24 crc kubenswrapper[4878]: I1202 18:39:24.010653 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5vk6v" event={"ID":"b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b","Type":"ContainerDied","Data":"cbf268c1bad30d1042f263fc83d37c17103b2a5dce3b74e1f46d6044274aa11a"}
Dec 02 18:39:24 crc kubenswrapper[4878]: I1202 18:39:24.031969 4878 generic.go:334] "Generic (PLEG): container finished" podID="51a22bf8-3e2b-4bd7-bbda-2cf7301065f4" containerID="a6ded572eededc1ff51f73116a21e19978d27d32b12f5ab60352586d90251cf3" exitCode=0
Dec 02 18:39:24 crc kubenswrapper[4878]: I1202 18:39:24.032123 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-49dbg" event={"ID":"51a22bf8-3e2b-4bd7-bbda-2cf7301065f4","Type":"ContainerDied","Data":"a6ded572eededc1ff51f73116a21e19978d27d32b12f5ab60352586d90251cf3"}
Dec 02 18:39:24 crc kubenswrapper[4878]: I1202 18:39:24.103054 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d7b8ef-c077-4e32-9345-3b402def9fce","Type":"ContainerStarted","Data":"8927a329748c3e80e2c313cc2a89d3480ea0fa38674c8f9286fa7ab9d2a5e5a9"}
Dec 02 18:39:24 crc kubenswrapper[4878]: I1202 18:39:24.503195 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Dec 02 18:39:25 crc kubenswrapper[4878]: I1202 18:39:25.175001 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d7b8ef-c077-4e32-9345-3b402def9fce","Type":"ContainerStarted","Data":"e216f8dfe39e9b1c63a014c29c0a83014f45d4563db5eb82702e3add216e379f"}
Dec 02 18:39:25 crc kubenswrapper[4878]: I1202 18:39:25.177550 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 02 18:39:25 crc kubenswrapper[4878]: I1202 18:39:25.182366 4878 generic.go:334] "Generic (PLEG): container finished" podID="0fae1bb0-8b82-44a5-871d-45252562e8a7" containerID="3ad46b90815ff249895d7a99e9f1cc105354805adaab707505502d3c335d3037" exitCode=0
Dec 02 18:39:25 crc kubenswrapper[4878]: I1202 18:39:25.182419 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d658-account-create-update-lrvbf" event={"ID":"0fae1bb0-8b82-44a5-871d-45252562e8a7","Type":"ContainerDied","Data":"3ad46b90815ff249895d7a99e9f1cc105354805adaab707505502d3c335d3037"}
Dec 02 18:39:25 crc kubenswrapper[4878]: I1202 18:39:25.199229 4878 generic.go:334] "Generic (PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" containerID="f262fb1e8290073f98cb5506d7d41d0ed3eb91d64ad36acc8496dd8b9fa35544" exitCode=0
Dec 02 18:39:25 crc kubenswrapper[4878]: I1202 18:39:25.199507 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"f262fb1e8290073f98cb5506d7d41d0ed3eb91d64ad36acc8496dd8b9fa35544"}
Dec 02 18:39:25 crc kubenswrapper[4878]: I1202 18:39:25.199542 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f"}
Dec 02 18:39:25 crc kubenswrapper[4878]: I1202 18:39:25.199580 4878 scope.go:117] "RemoveContainer" containerID="26923c15a81965f0afaf8fe206a0c93db8beb9097433b36b1189c363d7056d26"
Dec 02 18:39:25 crc kubenswrapper[4878]: I1202 18:39:25.220213 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.38441222 podStartE2EDuration="7.220187172s" podCreationTimestamp="2025-12-02 18:39:18 +0000 UTC" firstStartedPulling="2025-12-02 18:39:19.611546119 +0000 UTC m=+1469.301165000" lastFinishedPulling="2025-12-02 18:39:24.447321071 +0000 UTC m=+1474.136939952" observedRunningTime="2025-12-02 18:39:25.20098661 +0000 UTC m=+1474.890605491" watchObservedRunningTime="2025-12-02 18:39:25.220187172 +0000 UTC m=+1474.909806053"
Dec 02 18:39:25 crc kubenswrapper[4878]: I1202 18:39:25.785372 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-6464b89f4f-f5xbb"
Dec 02 18:39:25 crc kubenswrapper[4878]: I1202 18:39:25.860938 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-6b48666bb6-kckkf"
Dec 02 18:39:25 crc kubenswrapper[4878]: I1202 18:39:25.893350 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7c97596dc4-sl4jb"]
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.035832 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5b86f6dd56-s4crw"]
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.145710 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d658-account-create-update-lrvbf"
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.227637 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d658-account-create-update-lrvbf" event={"ID":"0fae1bb0-8b82-44a5-871d-45252562e8a7","Type":"ContainerDied","Data":"04038dc62e16aa20b9394bac821d8d81e9ae5eac6c2580253884914aced8ceeb"}
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.228952 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04038dc62e16aa20b9394bac821d8d81e9ae5eac6c2580253884914aced8ceeb"
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.229023 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d658-account-create-update-lrvbf"
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.230337 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8hf5\" (UniqueName: \"kubernetes.io/projected/0fae1bb0-8b82-44a5-871d-45252562e8a7-kube-api-access-d8hf5\") pod \"0fae1bb0-8b82-44a5-871d-45252562e8a7\" (UID: \"0fae1bb0-8b82-44a5-871d-45252562e8a7\") "
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.230621 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fae1bb0-8b82-44a5-871d-45252562e8a7-operator-scripts\") pod \"0fae1bb0-8b82-44a5-871d-45252562e8a7\" (UID: \"0fae1bb0-8b82-44a5-871d-45252562e8a7\") "
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.231125 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fae1bb0-8b82-44a5-871d-45252562e8a7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0fae1bb0-8b82-44a5-871d-45252562e8a7" (UID: "0fae1bb0-8b82-44a5-871d-45252562e8a7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.240738 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fae1bb0-8b82-44a5-871d-45252562e8a7-kube-api-access-d8hf5" (OuterVolumeSpecName: "kube-api-access-d8hf5") pod "0fae1bb0-8b82-44a5-871d-45252562e8a7" (UID: "0fae1bb0-8b82-44a5-871d-45252562e8a7"). InnerVolumeSpecName "kube-api-access-d8hf5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.335188 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8hf5\" (UniqueName: \"kubernetes.io/projected/0fae1bb0-8b82-44a5-871d-45252562e8a7-kube-api-access-d8hf5\") on node \"crc\" DevicePath \"\""
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.335229 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fae1bb0-8b82-44a5-871d-45252562e8a7-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.513258 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5vk6v"
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.534053 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jfrhp"
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.549140 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b-operator-scripts\") pod \"b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b\" (UID: \"b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b\") "
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.549355 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtx6h\" (UniqueName: \"kubernetes.io/projected/b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b-kube-api-access-jtx6h\") pod \"b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b\" (UID: \"b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b\") "
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.556186 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-49dbg"
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.556423 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b-kube-api-access-jtx6h" (OuterVolumeSpecName: "kube-api-access-jtx6h") pod "b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b" (UID: "b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b"). InnerVolumeSpecName "kube-api-access-jtx6h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.552174 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b" (UID: "b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.557360 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtx6h\" (UniqueName: \"kubernetes.io/projected/b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b-kube-api-access-jtx6h\") on node \"crc\" DevicePath \"\""
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.557390 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.660926 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqngw\" (UniqueName: \"kubernetes.io/projected/51a22bf8-3e2b-4bd7-bbda-2cf7301065f4-kube-api-access-hqngw\") pod \"51a22bf8-3e2b-4bd7-bbda-2cf7301065f4\" (UID: \"51a22bf8-3e2b-4bd7-bbda-2cf7301065f4\") "
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.661343 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51a22bf8-3e2b-4bd7-bbda-2cf7301065f4-operator-scripts\") pod \"51a22bf8-3e2b-4bd7-bbda-2cf7301065f4\" (UID: \"51a22bf8-3e2b-4bd7-bbda-2cf7301065f4\") "
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.661456 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebe98b91-cced-4d28-b2cb-9e19d827a817-operator-scripts\") pod \"ebe98b91-cced-4d28-b2cb-9e19d827a817\" (UID: \"ebe98b91-cced-4d28-b2cb-9e19d827a817\") "
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.661646 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2fcv\" (UniqueName: \"kubernetes.io/projected/ebe98b91-cced-4d28-b2cb-9e19d827a817-kube-api-access-r2fcv\") pod \"ebe98b91-cced-4d28-b2cb-9e19d827a817\" (UID: \"ebe98b91-cced-4d28-b2cb-9e19d827a817\") "
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.663078 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51a22bf8-3e2b-4bd7-bbda-2cf7301065f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51a22bf8-3e2b-4bd7-bbda-2cf7301065f4" (UID: "51a22bf8-3e2b-4bd7-bbda-2cf7301065f4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.663587 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebe98b91-cced-4d28-b2cb-9e19d827a817-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ebe98b91-cced-4d28-b2cb-9e19d827a817" (UID: "ebe98b91-cced-4d28-b2cb-9e19d827a817"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.664230 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51a22bf8-3e2b-4bd7-bbda-2cf7301065f4-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.664278 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ebe98b91-cced-4d28-b2cb-9e19d827a817-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.675123 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a22bf8-3e2b-4bd7-bbda-2cf7301065f4-kube-api-access-hqngw" (OuterVolumeSpecName: "kube-api-access-hqngw") pod "51a22bf8-3e2b-4bd7-bbda-2cf7301065f4" (UID: "51a22bf8-3e2b-4bd7-bbda-2cf7301065f4"). InnerVolumeSpecName "kube-api-access-hqngw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.675320 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebe98b91-cced-4d28-b2cb-9e19d827a817-kube-api-access-r2fcv" (OuterVolumeSpecName: "kube-api-access-r2fcv") pod "ebe98b91-cced-4d28-b2cb-9e19d827a817" (UID: "ebe98b91-cced-4d28-b2cb-9e19d827a817"). InnerVolumeSpecName "kube-api-access-r2fcv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.766541 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqngw\" (UniqueName: \"kubernetes.io/projected/51a22bf8-3e2b-4bd7-bbda-2cf7301065f4-kube-api-access-hqngw\") on node \"crc\" DevicePath \"\""
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.766581 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2fcv\" (UniqueName: \"kubernetes.io/projected/ebe98b91-cced-4d28-b2cb-9e19d827a817-kube-api-access-r2fcv\") on node \"crc\" DevicePath \"\""
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.868716 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9c84-account-create-update-jft6n"
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.931390 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0faa-account-create-update-tdxld"
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.970682 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fprr\" (UniqueName: \"kubernetes.io/projected/b863f92e-dc47-4a8e-b9ae-31baafb9ec79-kube-api-access-9fprr\") pod \"b863f92e-dc47-4a8e-b9ae-31baafb9ec79\" (UID: \"b863f92e-dc47-4a8e-b9ae-31baafb9ec79\") "
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.970885 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b863f92e-dc47-4a8e-b9ae-31baafb9ec79-operator-scripts\") pod \"b863f92e-dc47-4a8e-b9ae-31baafb9ec79\" (UID: \"b863f92e-dc47-4a8e-b9ae-31baafb9ec79\") "
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.977386 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b863f92e-dc47-4a8e-b9ae-31baafb9ec79-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b863f92e-dc47-4a8e-b9ae-31baafb9ec79" (UID: "b863f92e-dc47-4a8e-b9ae-31baafb9ec79"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:39:26 crc kubenswrapper[4878]: I1202 18:39:26.981491 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b863f92e-dc47-4a8e-b9ae-31baafb9ec79-kube-api-access-9fprr" (OuterVolumeSpecName: "kube-api-access-9fprr") pod "b863f92e-dc47-4a8e-b9ae-31baafb9ec79" (UID: "b863f92e-dc47-4a8e-b9ae-31baafb9ec79"). InnerVolumeSpecName "kube-api-access-9fprr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.073803 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de5fe1e3-5ab4-40e9-a902-f1b444bd005f-operator-scripts\") pod \"de5fe1e3-5ab4-40e9-a902-f1b444bd005f\" (UID: \"de5fe1e3-5ab4-40e9-a902-f1b444bd005f\") "
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.073916 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn29r\" (UniqueName: \"kubernetes.io/projected/de5fe1e3-5ab4-40e9-a902-f1b444bd005f-kube-api-access-zn29r\") pod \"de5fe1e3-5ab4-40e9-a902-f1b444bd005f\" (UID: \"de5fe1e3-5ab4-40e9-a902-f1b444bd005f\") "
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.075370 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fprr\" (UniqueName: \"kubernetes.io/projected/b863f92e-dc47-4a8e-b9ae-31baafb9ec79-kube-api-access-9fprr\") on node \"crc\" DevicePath \"\""
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.075396 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b863f92e-dc47-4a8e-b9ae-31baafb9ec79-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.075528 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de5fe1e3-5ab4-40e9-a902-f1b444bd005f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "de5fe1e3-5ab4-40e9-a902-f1b444bd005f" (UID: "de5fe1e3-5ab4-40e9-a902-f1b444bd005f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.087825 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de5fe1e3-5ab4-40e9-a902-f1b444bd005f-kube-api-access-zn29r" (OuterVolumeSpecName: "kube-api-access-zn29r") pod "de5fe1e3-5ab4-40e9-a902-f1b444bd005f" (UID: "de5fe1e3-5ab4-40e9-a902-f1b444bd005f"). InnerVolumeSpecName "kube-api-access-zn29r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.177356 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de5fe1e3-5ab4-40e9-a902-f1b444bd005f-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.177392 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zn29r\" (UniqueName: \"kubernetes.io/projected/de5fe1e3-5ab4-40e9-a902-f1b444bd005f-kube-api-access-zn29r\") on node \"crc\" DevicePath \"\""
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.205176 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7c97596dc4-sl4jb"
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.241754 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5b86f6dd56-s4crw"
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.280994 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2b1330f-94c6-4ea3-869c-02d6e52250ce-config-data\") pod \"c2b1330f-94c6-4ea3-869c-02d6e52250ce\" (UID: \"c2b1330f-94c6-4ea3-869c-02d6e52250ce\") "
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.281144 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2b1330f-94c6-4ea3-869c-02d6e52250ce-combined-ca-bundle\") pod \"c2b1330f-94c6-4ea3-869c-02d6e52250ce\" (UID: \"c2b1330f-94c6-4ea3-869c-02d6e52250ce\") "
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.281279 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2b1330f-94c6-4ea3-869c-02d6e52250ce-config-data-custom\") pod \"c2b1330f-94c6-4ea3-869c-02d6e52250ce\" (UID: \"c2b1330f-94c6-4ea3-869c-02d6e52250ce\") "
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.281312 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-combined-ca-bundle\") pod \"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c\" (UID: \"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c\") "
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.281383 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s5rz\" (UniqueName: \"kubernetes.io/projected/c2b1330f-94c6-4ea3-869c-02d6e52250ce-kube-api-access-5s5rz\") pod \"c2b1330f-94c6-4ea3-869c-02d6e52250ce\" (UID: \"c2b1330f-94c6-4ea3-869c-02d6e52250ce\") "
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.281407 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-config-data\") pod \"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c\" (UID: \"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c\") "
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.281463 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-config-data-custom\") pod \"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c\" (UID: \"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c\") "
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.281509 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpg84\" (UniqueName: \"kubernetes.io/projected/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-kube-api-access-rpg84\") pod \"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c\" (UID: \"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c\") "
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.307512 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-kube-api-access-rpg84" (OuterVolumeSpecName: "kube-api-access-rpg84") pod "7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c" (UID: "7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c"). InnerVolumeSpecName "kube-api-access-rpg84". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.316306 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5vk6v" event={"ID":"b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b","Type":"ContainerDied","Data":"626b98104382ee165994b905b0ea1fca802375a21f45eb5487255297df69e3c5"}
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.316391 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="626b98104382ee165994b905b0ea1fca802375a21f45eb5487255297df69e3c5"
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.316525 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5vk6v"
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.322428 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-49dbg" event={"ID":"51a22bf8-3e2b-4bd7-bbda-2cf7301065f4","Type":"ContainerDied","Data":"d1091808de4bec662b5993db3b85d70c4b744667b3459b4c98abbc7a8fa551c2"}
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.322642 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1091808de4bec662b5993db3b85d70c4b744667b3459b4c98abbc7a8fa551c2"
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.322846 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-49dbg"
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.324932 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2b1330f-94c6-4ea3-869c-02d6e52250ce-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c2b1330f-94c6-4ea3-869c-02d6e52250ce" (UID: "c2b1330f-94c6-4ea3-869c-02d6e52250ce"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.332131 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c" (UID: "7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.333413 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9c84-account-create-update-jft6n" event={"ID":"b863f92e-dc47-4a8e-b9ae-31baafb9ec79","Type":"ContainerDied","Data":"f00039848379fa7b3ffa1db5465d6f1f20d9188d1ca1bfff9eb98224ceb37b4b"}
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.333517 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f00039848379fa7b3ffa1db5465d6f1f20d9188d1ca1bfff9eb98224ceb37b4b"
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.333628 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9c84-account-create-update-jft6n"
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.338848 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0faa-account-create-update-tdxld" event={"ID":"de5fe1e3-5ab4-40e9-a902-f1b444bd005f","Type":"ContainerDied","Data":"8bda8446d680947e9b42c3ea59423ab562bb02282b0dcbe2843ce4b1e21c1933"}
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.338927 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bda8446d680947e9b42c3ea59423ab562bb02282b0dcbe2843ce4b1e21c1933"
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.339072 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0faa-account-create-update-tdxld"
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.366025 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7c97596dc4-sl4jb" event={"ID":"7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c","Type":"ContainerDied","Data":"a679d362fd3db591e83031b244e5789630aa80c5c03997d2f612f57bdd283585"}
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.366089 4878 scope.go:117] "RemoveContainer" containerID="455a408a46b74f146eb30a79daacd3d8f1b0cde121ab2d2608f920d4fa99b91b"
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.366201 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7c97596dc4-sl4jb"
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.377375 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2b1330f-94c6-4ea3-869c-02d6e52250ce-kube-api-access-5s5rz" (OuterVolumeSpecName: "kube-api-access-5s5rz") pod "c2b1330f-94c6-4ea3-869c-02d6e52250ce" (UID: "c2b1330f-94c6-4ea3-869c-02d6e52250ce"). InnerVolumeSpecName "kube-api-access-5s5rz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.382674 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5b86f6dd56-s4crw" event={"ID":"c2b1330f-94c6-4ea3-869c-02d6e52250ce","Type":"ContainerDied","Data":"898162f3640f482d8836006e0163ffa5a9ae1e6fd0c55d3c1b14ccc5ba437bfb"}
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.382808 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5b86f6dd56-s4crw"
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.391436 4878 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2b1330f-94c6-4ea3-869c-02d6e52250ce-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.391483 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s5rz\" (UniqueName: \"kubernetes.io/projected/c2b1330f-94c6-4ea3-869c-02d6e52250ce-kube-api-access-5s5rz\") on node \"crc\" DevicePath \"\""
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.391498 4878 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.391508 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpg84\" (UniqueName: \"kubernetes.io/projected/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-kube-api-access-rpg84\") on node \"crc\" DevicePath \"\""
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.396727 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c" (UID: "7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.404591 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jfrhp"
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.404868 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jfrhp" event={"ID":"ebe98b91-cced-4d28-b2cb-9e19d827a817","Type":"ContainerDied","Data":"12efed16e60a9b04a5923054741b7dbf61a6d4ff4cd17d94560863667f113b69"}
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.404920 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12efed16e60a9b04a5923054741b7dbf61a6d4ff4cd17d94560863667f113b69"
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.417900 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2b1330f-94c6-4ea3-869c-02d6e52250ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2b1330f-94c6-4ea3-869c-02d6e52250ce" (UID: "c2b1330f-94c6-4ea3-869c-02d6e52250ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.489882 4878 scope.go:117] "RemoveContainer" containerID="c718648b6e6d53ca2bd5fe7efc82dcc5a22fe7a68a44e9784d6978fd72286397"
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.498052 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2b1330f-94c6-4ea3-869c-02d6e52250ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.498082 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.514604 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-config-data" (OuterVolumeSpecName:
"config-data") pod "7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c" (UID: "7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.519371 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-489zf" Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.529736 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2b1330f-94c6-4ea3-869c-02d6e52250ce-config-data" (OuterVolumeSpecName: "config-data") pod "c2b1330f-94c6-4ea3-869c-02d6e52250ce" (UID: "c2b1330f-94c6-4ea3-869c-02d6e52250ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.603520 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.603551 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2b1330f-94c6-4ea3-869c-02d6e52250ce-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.622108 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-489zf" Dec 02 18:39:27 crc kubenswrapper[4878]: I1202 18:39:27.783800 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-489zf"] Dec 02 18:39:28 crc kubenswrapper[4878]: I1202 18:39:28.211570 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7c97596dc4-sl4jb"] Dec 02 18:39:28 crc kubenswrapper[4878]: I1202 18:39:28.264932 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/heat-api-7c97596dc4-sl4jb"] Dec 02 18:39:28 crc kubenswrapper[4878]: I1202 18:39:28.289964 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5b86f6dd56-s4crw"] Dec 02 18:39:28 crc kubenswrapper[4878]: I1202 18:39:28.305045 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-5b86f6dd56-s4crw"] Dec 02 18:39:28 crc kubenswrapper[4878]: I1202 18:39:28.964128 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c" path="/var/lib/kubelet/pods/7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c/volumes" Dec 02 18:39:28 crc kubenswrapper[4878]: I1202 18:39:28.965370 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2b1330f-94c6-4ea3-869c-02d6e52250ce" path="/var/lib/kubelet/pods/c2b1330f-94c6-4ea3-869c-02d6e52250ce/volumes" Dec 02 18:39:29 crc kubenswrapper[4878]: I1202 18:39:29.202795 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-78bfbd4977-mcqqx" Dec 02 18:39:29 crc kubenswrapper[4878]: I1202 18:39:29.269206 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-64fc868fd9-8kp5j"] Dec 02 18:39:29 crc kubenswrapper[4878]: I1202 18:39:29.269487 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-64fc868fd9-8kp5j" podUID="4e1991d0-7abb-495c-acb9-682829e20961" containerName="heat-engine" containerID="cri-o://d7cb3603bfaa2c396a9b5f8aa10e3b622c6faca9f6b9937c4704c4d334830cab" gracePeriod=60 Dec 02 18:39:29 crc kubenswrapper[4878]: I1202 18:39:29.429822 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-489zf" podUID="158fd016-79a4-4bf1-90b3-20a4eae3d1d6" containerName="registry-server" containerID="cri-o://f43d73f53136a7c0092d78ca668204f2f1daa710c3c36ecf85fcbf97d6d9553e" gracePeriod=2 Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 
18:39:30.078989 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-489zf" Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.242996 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158fd016-79a4-4bf1-90b3-20a4eae3d1d6-utilities\") pod \"158fd016-79a4-4bf1-90b3-20a4eae3d1d6\" (UID: \"158fd016-79a4-4bf1-90b3-20a4eae3d1d6\") " Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.243113 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158fd016-79a4-4bf1-90b3-20a4eae3d1d6-catalog-content\") pod \"158fd016-79a4-4bf1-90b3-20a4eae3d1d6\" (UID: \"158fd016-79a4-4bf1-90b3-20a4eae3d1d6\") " Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.249657 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8n8x\" (UniqueName: \"kubernetes.io/projected/158fd016-79a4-4bf1-90b3-20a4eae3d1d6-kube-api-access-j8n8x\") pod \"158fd016-79a4-4bf1-90b3-20a4eae3d1d6\" (UID: \"158fd016-79a4-4bf1-90b3-20a4eae3d1d6\") " Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.258295 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/158fd016-79a4-4bf1-90b3-20a4eae3d1d6-utilities" (OuterVolumeSpecName: "utilities") pod "158fd016-79a4-4bf1-90b3-20a4eae3d1d6" (UID: "158fd016-79a4-4bf1-90b3-20a4eae3d1d6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.262600 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/158fd016-79a4-4bf1-90b3-20a4eae3d1d6-kube-api-access-j8n8x" (OuterVolumeSpecName: "kube-api-access-j8n8x") pod "158fd016-79a4-4bf1-90b3-20a4eae3d1d6" (UID: "158fd016-79a4-4bf1-90b3-20a4eae3d1d6"). InnerVolumeSpecName "kube-api-access-j8n8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.327607 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/158fd016-79a4-4bf1-90b3-20a4eae3d1d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "158fd016-79a4-4bf1-90b3-20a4eae3d1d6" (UID: "158fd016-79a4-4bf1-90b3-20a4eae3d1d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.353510 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158fd016-79a4-4bf1-90b3-20a4eae3d1d6-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.354392 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158fd016-79a4-4bf1-90b3-20a4eae3d1d6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.354478 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8n8x\" (UniqueName: \"kubernetes.io/projected/158fd016-79a4-4bf1-90b3-20a4eae3d1d6-kube-api-access-j8n8x\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:30 crc kubenswrapper[4878]: E1202 18:39:30.423968 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" containerID="d7cb3603bfaa2c396a9b5f8aa10e3b622c6faca9f6b9937c4704c4d334830cab" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 02 18:39:30 crc kubenswrapper[4878]: E1202 18:39:30.428343 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7cb3603bfaa2c396a9b5f8aa10e3b622c6faca9f6b9937c4704c4d334830cab" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 02 18:39:30 crc kubenswrapper[4878]: E1202 18:39:30.430398 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7cb3603bfaa2c396a9b5f8aa10e3b622c6faca9f6b9937c4704c4d334830cab" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 02 18:39:30 crc kubenswrapper[4878]: E1202 18:39:30.430490 4878 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-64fc868fd9-8kp5j" podUID="4e1991d0-7abb-495c-acb9-682829e20961" containerName="heat-engine" Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.443893 4878 generic.go:334] "Generic (PLEG): container finished" podID="158fd016-79a4-4bf1-90b3-20a4eae3d1d6" containerID="f43d73f53136a7c0092d78ca668204f2f1daa710c3c36ecf85fcbf97d6d9553e" exitCode=0 Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.443964 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-489zf" event={"ID":"158fd016-79a4-4bf1-90b3-20a4eae3d1d6","Type":"ContainerDied","Data":"f43d73f53136a7c0092d78ca668204f2f1daa710c3c36ecf85fcbf97d6d9553e"} Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.444011 4878 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-489zf" event={"ID":"158fd016-79a4-4bf1-90b3-20a4eae3d1d6","Type":"ContainerDied","Data":"ad42f95b714ca48e726cfa42f0e07d46275d091e5422358987885c87db39e592"} Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.444043 4878 scope.go:117] "RemoveContainer" containerID="f43d73f53136a7c0092d78ca668204f2f1daa710c3c36ecf85fcbf97d6d9553e" Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.444351 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-489zf" Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.494119 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-489zf"] Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.506787 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-489zf"] Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.511304 4878 scope.go:117] "RemoveContainer" containerID="1f9a6bf938441ded066058f943340eace971a2f338075d35a97684edc348aa36" Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.549444 4878 scope.go:117] "RemoveContainer" containerID="23898fd2dd467b06e258aafe090f7a3be7fa0bfaae403643d4c3094a82c98544" Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.614042 4878 scope.go:117] "RemoveContainer" containerID="f43d73f53136a7c0092d78ca668204f2f1daa710c3c36ecf85fcbf97d6d9553e" Dec 02 18:39:30 crc kubenswrapper[4878]: E1202 18:39:30.614823 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f43d73f53136a7c0092d78ca668204f2f1daa710c3c36ecf85fcbf97d6d9553e\": container with ID starting with f43d73f53136a7c0092d78ca668204f2f1daa710c3c36ecf85fcbf97d6d9553e not found: ID does not exist" containerID="f43d73f53136a7c0092d78ca668204f2f1daa710c3c36ecf85fcbf97d6d9553e" Dec 02 18:39:30 crc 
kubenswrapper[4878]: I1202 18:39:30.614908 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f43d73f53136a7c0092d78ca668204f2f1daa710c3c36ecf85fcbf97d6d9553e"} err="failed to get container status \"f43d73f53136a7c0092d78ca668204f2f1daa710c3c36ecf85fcbf97d6d9553e\": rpc error: code = NotFound desc = could not find container \"f43d73f53136a7c0092d78ca668204f2f1daa710c3c36ecf85fcbf97d6d9553e\": container with ID starting with f43d73f53136a7c0092d78ca668204f2f1daa710c3c36ecf85fcbf97d6d9553e not found: ID does not exist" Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.614957 4878 scope.go:117] "RemoveContainer" containerID="1f9a6bf938441ded066058f943340eace971a2f338075d35a97684edc348aa36" Dec 02 18:39:30 crc kubenswrapper[4878]: E1202 18:39:30.615362 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f9a6bf938441ded066058f943340eace971a2f338075d35a97684edc348aa36\": container with ID starting with 1f9a6bf938441ded066058f943340eace971a2f338075d35a97684edc348aa36 not found: ID does not exist" containerID="1f9a6bf938441ded066058f943340eace971a2f338075d35a97684edc348aa36" Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.615403 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f9a6bf938441ded066058f943340eace971a2f338075d35a97684edc348aa36"} err="failed to get container status \"1f9a6bf938441ded066058f943340eace971a2f338075d35a97684edc348aa36\": rpc error: code = NotFound desc = could not find container \"1f9a6bf938441ded066058f943340eace971a2f338075d35a97684edc348aa36\": container with ID starting with 1f9a6bf938441ded066058f943340eace971a2f338075d35a97684edc348aa36 not found: ID does not exist" Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.615428 4878 scope.go:117] "RemoveContainer" containerID="23898fd2dd467b06e258aafe090f7a3be7fa0bfaae403643d4c3094a82c98544" Dec 02 
18:39:30 crc kubenswrapper[4878]: E1202 18:39:30.615703 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23898fd2dd467b06e258aafe090f7a3be7fa0bfaae403643d4c3094a82c98544\": container with ID starting with 23898fd2dd467b06e258aafe090f7a3be7fa0bfaae403643d4c3094a82c98544 not found: ID does not exist" containerID="23898fd2dd467b06e258aafe090f7a3be7fa0bfaae403643d4c3094a82c98544" Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.615735 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23898fd2dd467b06e258aafe090f7a3be7fa0bfaae403643d4c3094a82c98544"} err="failed to get container status \"23898fd2dd467b06e258aafe090f7a3be7fa0bfaae403643d4c3094a82c98544\": rpc error: code = NotFound desc = could not find container \"23898fd2dd467b06e258aafe090f7a3be7fa0bfaae403643d4c3094a82c98544\": container with ID starting with 23898fd2dd467b06e258aafe090f7a3be7fa0bfaae403643d4c3094a82c98544 not found: ID does not exist" Dec 02 18:39:30 crc kubenswrapper[4878]: I1202 18:39:30.958671 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="158fd016-79a4-4bf1-90b3-20a4eae3d1d6" path="/var/lib/kubelet/pods/158fd016-79a4-4bf1-90b3-20a4eae3d1d6/volumes" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.143932 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-l8xrz"] Dec 02 18:39:31 crc kubenswrapper[4878]: E1202 18:39:31.144982 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe98b91-cced-4d28-b2cb-9e19d827a817" containerName="mariadb-database-create" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.145013 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe98b91-cced-4d28-b2cb-9e19d827a817" containerName="mariadb-database-create" Dec 02 18:39:31 crc kubenswrapper[4878]: E1202 18:39:31.145042 4878 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="158fd016-79a4-4bf1-90b3-20a4eae3d1d6" containerName="extract-content" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.145053 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="158fd016-79a4-4bf1-90b3-20a4eae3d1d6" containerName="extract-content" Dec 02 18:39:31 crc kubenswrapper[4878]: E1202 18:39:31.145077 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b1330f-94c6-4ea3-869c-02d6e52250ce" containerName="heat-cfnapi" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.145085 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b1330f-94c6-4ea3-869c-02d6e52250ce" containerName="heat-cfnapi" Dec 02 18:39:31 crc kubenswrapper[4878]: E1202 18:39:31.145097 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b863f92e-dc47-4a8e-b9ae-31baafb9ec79" containerName="mariadb-account-create-update" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.145103 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="b863f92e-dc47-4a8e-b9ae-31baafb9ec79" containerName="mariadb-account-create-update" Dec 02 18:39:31 crc kubenswrapper[4878]: E1202 18:39:31.145124 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158fd016-79a4-4bf1-90b3-20a4eae3d1d6" containerName="registry-server" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.145132 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="158fd016-79a4-4bf1-90b3-20a4eae3d1d6" containerName="registry-server" Dec 02 18:39:31 crc kubenswrapper[4878]: E1202 18:39:31.145146 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de5fe1e3-5ab4-40e9-a902-f1b444bd005f" containerName="mariadb-account-create-update" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.145154 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="de5fe1e3-5ab4-40e9-a902-f1b444bd005f" containerName="mariadb-account-create-update" Dec 02 18:39:31 crc kubenswrapper[4878]: E1202 18:39:31.145168 4878 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c" containerName="heat-api" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.145177 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c" containerName="heat-api" Dec 02 18:39:31 crc kubenswrapper[4878]: E1202 18:39:31.145188 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a22bf8-3e2b-4bd7-bbda-2cf7301065f4" containerName="mariadb-database-create" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.145197 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a22bf8-3e2b-4bd7-bbda-2cf7301065f4" containerName="mariadb-database-create" Dec 02 18:39:31 crc kubenswrapper[4878]: E1202 18:39:31.145229 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b" containerName="mariadb-database-create" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.145258 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b" containerName="mariadb-database-create" Dec 02 18:39:31 crc kubenswrapper[4878]: E1202 18:39:31.145290 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fae1bb0-8b82-44a5-871d-45252562e8a7" containerName="mariadb-account-create-update" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.145298 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fae1bb0-8b82-44a5-871d-45252562e8a7" containerName="mariadb-account-create-update" Dec 02 18:39:31 crc kubenswrapper[4878]: E1202 18:39:31.145315 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158fd016-79a4-4bf1-90b3-20a4eae3d1d6" containerName="extract-utilities" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.145323 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="158fd016-79a4-4bf1-90b3-20a4eae3d1d6" containerName="extract-utilities" Dec 02 18:39:31 crc 
kubenswrapper[4878]: I1202 18:39:31.145624 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="51a22bf8-3e2b-4bd7-bbda-2cf7301065f4" containerName="mariadb-database-create" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.145642 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fae1bb0-8b82-44a5-871d-45252562e8a7" containerName="mariadb-account-create-update" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.145663 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c" containerName="heat-api" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.145674 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2b1330f-94c6-4ea3-869c-02d6e52250ce" containerName="heat-cfnapi" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.145687 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2b1330f-94c6-4ea3-869c-02d6e52250ce" containerName="heat-cfnapi" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.145703 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe98b91-cced-4d28-b2cb-9e19d827a817" containerName="mariadb-database-create" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.145722 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b" containerName="mariadb-database-create" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.145733 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="b863f92e-dc47-4a8e-b9ae-31baafb9ec79" containerName="mariadb-account-create-update" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.145748 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="158fd016-79a4-4bf1-90b3-20a4eae3d1d6" containerName="registry-server" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.145759 4878 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="de5fe1e3-5ab4-40e9-a902-f1b444bd005f" containerName="mariadb-account-create-update" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.147121 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-l8xrz" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.150739 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.150863 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-j6vzb" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.151437 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.156955 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-l8xrz"] Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.282098 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-l8xrz\" (UID: \"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd\") " pod="openstack/nova-cell0-conductor-db-sync-l8xrz" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.282163 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-scripts\") pod \"nova-cell0-conductor-db-sync-l8xrz\" (UID: \"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd\") " pod="openstack/nova-cell0-conductor-db-sync-l8xrz" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.282208 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-config-data\") pod \"nova-cell0-conductor-db-sync-l8xrz\" (UID: \"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd\") " pod="openstack/nova-cell0-conductor-db-sync-l8xrz" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.282775 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwzfl\" (UniqueName: \"kubernetes.io/projected/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-kube-api-access-qwzfl\") pod \"nova-cell0-conductor-db-sync-l8xrz\" (UID: \"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd\") " pod="openstack/nova-cell0-conductor-db-sync-l8xrz" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.386367 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-l8xrz\" (UID: \"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd\") " pod="openstack/nova-cell0-conductor-db-sync-l8xrz" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.386426 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-scripts\") pod \"nova-cell0-conductor-db-sync-l8xrz\" (UID: \"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd\") " pod="openstack/nova-cell0-conductor-db-sync-l8xrz" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.386465 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-config-data\") pod \"nova-cell0-conductor-db-sync-l8xrz\" (UID: \"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd\") " pod="openstack/nova-cell0-conductor-db-sync-l8xrz" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.386573 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qwzfl\" (UniqueName: \"kubernetes.io/projected/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-kube-api-access-qwzfl\") pod \"nova-cell0-conductor-db-sync-l8xrz\" (UID: \"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd\") " pod="openstack/nova-cell0-conductor-db-sync-l8xrz" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.395263 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-config-data\") pod \"nova-cell0-conductor-db-sync-l8xrz\" (UID: \"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd\") " pod="openstack/nova-cell0-conductor-db-sync-l8xrz" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.399840 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-scripts\") pod \"nova-cell0-conductor-db-sync-l8xrz\" (UID: \"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd\") " pod="openstack/nova-cell0-conductor-db-sync-l8xrz" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.400215 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-l8xrz\" (UID: \"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd\") " pod="openstack/nova-cell0-conductor-db-sync-l8xrz" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.407150 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwzfl\" (UniqueName: \"kubernetes.io/projected/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-kube-api-access-qwzfl\") pod \"nova-cell0-conductor-db-sync-l8xrz\" (UID: \"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd\") " pod="openstack/nova-cell0-conductor-db-sync-l8xrz" Dec 02 18:39:31 crc kubenswrapper[4878]: I1202 18:39:31.470077 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-l8xrz" Dec 02 18:39:32 crc kubenswrapper[4878]: W1202 18:39:32.101353 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded70a9c4_6719_4a2f_a7b2_1e6cdfb370bd.slice/crio-e28d7c5f297f292ca91e71bb29bf0d5fb627071af5d945385b2d54e6f709eb53 WatchSource:0}: Error finding container e28d7c5f297f292ca91e71bb29bf0d5fb627071af5d945385b2d54e6f709eb53: Status 404 returned error can't find the container with id e28d7c5f297f292ca91e71bb29bf0d5fb627071af5d945385b2d54e6f709eb53 Dec 02 18:39:32 crc kubenswrapper[4878]: I1202 18:39:32.106147 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-l8xrz"] Dec 02 18:39:32 crc kubenswrapper[4878]: I1202 18:39:32.514743 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-l8xrz" event={"ID":"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd","Type":"ContainerStarted","Data":"e28d7c5f297f292ca91e71bb29bf0d5fb627071af5d945385b2d54e6f709eb53"} Dec 02 18:39:37 crc kubenswrapper[4878]: I1202 18:39:37.605557 4878 generic.go:334] "Generic (PLEG): container finished" podID="4e1991d0-7abb-495c-acb9-682829e20961" containerID="d7cb3603bfaa2c396a9b5f8aa10e3b622c6faca9f6b9937c4704c4d334830cab" exitCode=0 Dec 02 18:39:37 crc kubenswrapper[4878]: I1202 18:39:37.606136 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-64fc868fd9-8kp5j" event={"ID":"4e1991d0-7abb-495c-acb9-682829e20961","Type":"ContainerDied","Data":"d7cb3603bfaa2c396a9b5f8aa10e3b622c6faca9f6b9937c4704c4d334830cab"} Dec 02 18:39:40 crc kubenswrapper[4878]: E1202 18:39:40.421598 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d7cb3603bfaa2c396a9b5f8aa10e3b622c6faca9f6b9937c4704c4d334830cab is running failed: container process not 
found" containerID="d7cb3603bfaa2c396a9b5f8aa10e3b622c6faca9f6b9937c4704c4d334830cab" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 02 18:39:40 crc kubenswrapper[4878]: E1202 18:39:40.422693 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d7cb3603bfaa2c396a9b5f8aa10e3b622c6faca9f6b9937c4704c4d334830cab is running failed: container process not found" containerID="d7cb3603bfaa2c396a9b5f8aa10e3b622c6faca9f6b9937c4704c4d334830cab" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 02 18:39:40 crc kubenswrapper[4878]: E1202 18:39:40.424188 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d7cb3603bfaa2c396a9b5f8aa10e3b622c6faca9f6b9937c4704c4d334830cab is running failed: container process not found" containerID="d7cb3603bfaa2c396a9b5f8aa10e3b622c6faca9f6b9937c4704c4d334830cab" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 02 18:39:40 crc kubenswrapper[4878]: E1202 18:39:40.424228 4878 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d7cb3603bfaa2c396a9b5f8aa10e3b622c6faca9f6b9937c4704c4d334830cab is running failed: container process not found" probeType="Readiness" pod="openstack/heat-engine-64fc868fd9-8kp5j" podUID="4e1991d0-7abb-495c-acb9-682829e20961" containerName="heat-engine" Dec 02 18:39:41 crc kubenswrapper[4878]: I1202 18:39:41.914386 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-64fc868fd9-8kp5j" Dec 02 18:39:42 crc kubenswrapper[4878]: I1202 18:39:42.023668 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e1991d0-7abb-495c-acb9-682829e20961-config-data-custom\") pod \"4e1991d0-7abb-495c-acb9-682829e20961\" (UID: \"4e1991d0-7abb-495c-acb9-682829e20961\") " Dec 02 18:39:42 crc kubenswrapper[4878]: I1202 18:39:42.024545 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1991d0-7abb-495c-acb9-682829e20961-combined-ca-bundle\") pod \"4e1991d0-7abb-495c-acb9-682829e20961\" (UID: \"4e1991d0-7abb-495c-acb9-682829e20961\") " Dec 02 18:39:42 crc kubenswrapper[4878]: I1202 18:39:42.024794 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn9rh\" (UniqueName: \"kubernetes.io/projected/4e1991d0-7abb-495c-acb9-682829e20961-kube-api-access-fn9rh\") pod \"4e1991d0-7abb-495c-acb9-682829e20961\" (UID: \"4e1991d0-7abb-495c-acb9-682829e20961\") " Dec 02 18:39:42 crc kubenswrapper[4878]: I1202 18:39:42.024920 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e1991d0-7abb-495c-acb9-682829e20961-config-data\") pod \"4e1991d0-7abb-495c-acb9-682829e20961\" (UID: \"4e1991d0-7abb-495c-acb9-682829e20961\") " Dec 02 18:39:42 crc kubenswrapper[4878]: I1202 18:39:42.030451 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e1991d0-7abb-495c-acb9-682829e20961-kube-api-access-fn9rh" (OuterVolumeSpecName: "kube-api-access-fn9rh") pod "4e1991d0-7abb-495c-acb9-682829e20961" (UID: "4e1991d0-7abb-495c-acb9-682829e20961"). InnerVolumeSpecName "kube-api-access-fn9rh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:39:42 crc kubenswrapper[4878]: I1202 18:39:42.030636 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e1991d0-7abb-495c-acb9-682829e20961-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4e1991d0-7abb-495c-acb9-682829e20961" (UID: "4e1991d0-7abb-495c-acb9-682829e20961"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:42 crc kubenswrapper[4878]: I1202 18:39:42.058460 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e1991d0-7abb-495c-acb9-682829e20961-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e1991d0-7abb-495c-acb9-682829e20961" (UID: "4e1991d0-7abb-495c-acb9-682829e20961"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:42 crc kubenswrapper[4878]: I1202 18:39:42.113824 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e1991d0-7abb-495c-acb9-682829e20961-config-data" (OuterVolumeSpecName: "config-data") pod "4e1991d0-7abb-495c-acb9-682829e20961" (UID: "4e1991d0-7abb-495c-acb9-682829e20961"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:42 crc kubenswrapper[4878]: I1202 18:39:42.129217 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e1991d0-7abb-495c-acb9-682829e20961-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:42 crc kubenswrapper[4878]: I1202 18:39:42.129273 4878 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e1991d0-7abb-495c-acb9-682829e20961-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:42 crc kubenswrapper[4878]: I1202 18:39:42.129285 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1991d0-7abb-495c-acb9-682829e20961-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:42 crc kubenswrapper[4878]: I1202 18:39:42.129297 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn9rh\" (UniqueName: \"kubernetes.io/projected/4e1991d0-7abb-495c-acb9-682829e20961-kube-api-access-fn9rh\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:42 crc kubenswrapper[4878]: I1202 18:39:42.673392 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-l8xrz" event={"ID":"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd","Type":"ContainerStarted","Data":"187259752d082987f31dc2461a6ee01d1dc982a58df2030a3772cf41396018d2"} Dec 02 18:39:42 crc kubenswrapper[4878]: I1202 18:39:42.676072 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-64fc868fd9-8kp5j" event={"ID":"4e1991d0-7abb-495c-acb9-682829e20961","Type":"ContainerDied","Data":"64a981433cdd0e5af987ac096bb28a8a33f54c20087b11294364f4d1e74cd125"} Dec 02 18:39:42 crc kubenswrapper[4878]: I1202 18:39:42.676117 4878 scope.go:117] "RemoveContainer" containerID="d7cb3603bfaa2c396a9b5f8aa10e3b622c6faca9f6b9937c4704c4d334830cab" Dec 02 18:39:42 crc kubenswrapper[4878]: I1202 
18:39:42.676134 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-64fc868fd9-8kp5j" Dec 02 18:39:42 crc kubenswrapper[4878]: I1202 18:39:42.712730 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-l8xrz" podStartSLOduration=2.070521515 podStartE2EDuration="11.71270593s" podCreationTimestamp="2025-12-02 18:39:31 +0000 UTC" firstStartedPulling="2025-12-02 18:39:32.106074296 +0000 UTC m=+1481.795693177" lastFinishedPulling="2025-12-02 18:39:41.748258711 +0000 UTC m=+1491.437877592" observedRunningTime="2025-12-02 18:39:42.700969371 +0000 UTC m=+1492.390588252" watchObservedRunningTime="2025-12-02 18:39:42.71270593 +0000 UTC m=+1492.402324811" Dec 02 18:39:42 crc kubenswrapper[4878]: I1202 18:39:42.745834 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-64fc868fd9-8kp5j"] Dec 02 18:39:42 crc kubenswrapper[4878]: I1202 18:39:42.763013 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-64fc868fd9-8kp5j"] Dec 02 18:39:42 crc kubenswrapper[4878]: I1202 18:39:42.980740 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e1991d0-7abb-495c-acb9-682829e20961" path="/var/lib/kubelet/pods/4e1991d0-7abb-495c-acb9-682829e20961/volumes" Dec 02 18:39:44 crc kubenswrapper[4878]: I1202 18:39:44.338928 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:39:44 crc kubenswrapper[4878]: I1202 18:39:44.339687 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="58d7b8ef-c077-4e32-9345-3b402def9fce" containerName="ceilometer-central-agent" containerID="cri-o://dbfd6fcb4461b04362a2ccb1c131969624cad4e3fb350c3e497f97f80ead7749" gracePeriod=30 Dec 02 18:39:44 crc kubenswrapper[4878]: I1202 18:39:44.339929 4878 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="58d7b8ef-c077-4e32-9345-3b402def9fce" containerName="proxy-httpd" containerID="cri-o://e216f8dfe39e9b1c63a014c29c0a83014f45d4563db5eb82702e3add216e379f" gracePeriod=30 Dec 02 18:39:44 crc kubenswrapper[4878]: I1202 18:39:44.340009 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="58d7b8ef-c077-4e32-9345-3b402def9fce" containerName="ceilometer-notification-agent" containerID="cri-o://c032305040d4a4f71922dfdd6e4e49068a6713b5664bd177a5292ef58c36f944" gracePeriod=30 Dec 02 18:39:44 crc kubenswrapper[4878]: I1202 18:39:44.340092 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="58d7b8ef-c077-4e32-9345-3b402def9fce" containerName="sg-core" containerID="cri-o://8927a329748c3e80e2c313cc2a89d3480ea0fa38674c8f9286fa7ab9d2a5e5a9" gracePeriod=30 Dec 02 18:39:44 crc kubenswrapper[4878]: I1202 18:39:44.365339 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="58d7b8ef-c077-4e32-9345-3b402def9fce" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 02 18:39:44 crc kubenswrapper[4878]: I1202 18:39:44.706130 4878 generic.go:334] "Generic (PLEG): container finished" podID="58d7b8ef-c077-4e32-9345-3b402def9fce" containerID="e216f8dfe39e9b1c63a014c29c0a83014f45d4563db5eb82702e3add216e379f" exitCode=0 Dec 02 18:39:44 crc kubenswrapper[4878]: I1202 18:39:44.706169 4878 generic.go:334] "Generic (PLEG): container finished" podID="58d7b8ef-c077-4e32-9345-3b402def9fce" containerID="8927a329748c3e80e2c313cc2a89d3480ea0fa38674c8f9286fa7ab9d2a5e5a9" exitCode=2 Dec 02 18:39:44 crc kubenswrapper[4878]: I1202 18:39:44.706211 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"58d7b8ef-c077-4e32-9345-3b402def9fce","Type":"ContainerDied","Data":"e216f8dfe39e9b1c63a014c29c0a83014f45d4563db5eb82702e3add216e379f"} Dec 02 18:39:44 crc kubenswrapper[4878]: I1202 18:39:44.706282 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d7b8ef-c077-4e32-9345-3b402def9fce","Type":"ContainerDied","Data":"8927a329748c3e80e2c313cc2a89d3480ea0fa38674c8f9286fa7ab9d2a5e5a9"} Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.186060 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-5llm6"] Dec 02 18:39:45 crc kubenswrapper[4878]: E1202 18:39:45.187190 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b1330f-94c6-4ea3-869c-02d6e52250ce" containerName="heat-cfnapi" Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.187205 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b1330f-94c6-4ea3-869c-02d6e52250ce" containerName="heat-cfnapi" Dec 02 18:39:45 crc kubenswrapper[4878]: E1202 18:39:45.187259 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c" containerName="heat-api" Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.187271 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c" containerName="heat-api" Dec 02 18:39:45 crc kubenswrapper[4878]: E1202 18:39:45.187335 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1991d0-7abb-495c-acb9-682829e20961" containerName="heat-engine" Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.187346 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1991d0-7abb-495c-acb9-682829e20961" containerName="heat-engine" Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.187616 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd4f40d-417d-44a7-9b2f-b16f9dd7a64c" containerName="heat-api" Dec 02 18:39:45 crc kubenswrapper[4878]: 
I1202 18:39:45.187640 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e1991d0-7abb-495c-acb9-682829e20961" containerName="heat-engine" Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.188768 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-5llm6" Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.200646 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-5llm6"] Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.290839 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-e087-account-create-update-26jph"] Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.292524 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-e087-account-create-update-26jph" Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.295718 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.320164 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-e087-account-create-update-26jph"] Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.338847 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcceb81c-7764-496d-8695-70e73d5a6ce9-operator-scripts\") pod \"aodh-db-create-5llm6\" (UID: \"bcceb81c-7764-496d-8695-70e73d5a6ce9\") " pod="openstack/aodh-db-create-5llm6" Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.338958 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7qm7\" (UniqueName: \"kubernetes.io/projected/bcceb81c-7764-496d-8695-70e73d5a6ce9-kube-api-access-r7qm7\") pod \"aodh-db-create-5llm6\" (UID: \"bcceb81c-7764-496d-8695-70e73d5a6ce9\") " pod="openstack/aodh-db-create-5llm6" Dec 02 
18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.440855 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg9dk\" (UniqueName: \"kubernetes.io/projected/781a418d-080c-4487-b8df-9f33d7e2caa8-kube-api-access-hg9dk\") pod \"aodh-e087-account-create-update-26jph\" (UID: \"781a418d-080c-4487-b8df-9f33d7e2caa8\") " pod="openstack/aodh-e087-account-create-update-26jph" Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.441271 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcceb81c-7764-496d-8695-70e73d5a6ce9-operator-scripts\") pod \"aodh-db-create-5llm6\" (UID: \"bcceb81c-7764-496d-8695-70e73d5a6ce9\") " pod="openstack/aodh-db-create-5llm6" Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.441456 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/781a418d-080c-4487-b8df-9f33d7e2caa8-operator-scripts\") pod \"aodh-e087-account-create-update-26jph\" (UID: \"781a418d-080c-4487-b8df-9f33d7e2caa8\") " pod="openstack/aodh-e087-account-create-update-26jph" Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.441608 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7qm7\" (UniqueName: \"kubernetes.io/projected/bcceb81c-7764-496d-8695-70e73d5a6ce9-kube-api-access-r7qm7\") pod \"aodh-db-create-5llm6\" (UID: \"bcceb81c-7764-496d-8695-70e73d5a6ce9\") " pod="openstack/aodh-db-create-5llm6" Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.442373 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcceb81c-7764-496d-8695-70e73d5a6ce9-operator-scripts\") pod \"aodh-db-create-5llm6\" (UID: \"bcceb81c-7764-496d-8695-70e73d5a6ce9\") " pod="openstack/aodh-db-create-5llm6" Dec 
02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.463191 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7qm7\" (UniqueName: \"kubernetes.io/projected/bcceb81c-7764-496d-8695-70e73d5a6ce9-kube-api-access-r7qm7\") pod \"aodh-db-create-5llm6\" (UID: \"bcceb81c-7764-496d-8695-70e73d5a6ce9\") " pod="openstack/aodh-db-create-5llm6" Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.509167 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-5llm6" Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.543979 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/781a418d-080c-4487-b8df-9f33d7e2caa8-operator-scripts\") pod \"aodh-e087-account-create-update-26jph\" (UID: \"781a418d-080c-4487-b8df-9f33d7e2caa8\") " pod="openstack/aodh-e087-account-create-update-26jph" Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.544450 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg9dk\" (UniqueName: \"kubernetes.io/projected/781a418d-080c-4487-b8df-9f33d7e2caa8-kube-api-access-hg9dk\") pod \"aodh-e087-account-create-update-26jph\" (UID: \"781a418d-080c-4487-b8df-9f33d7e2caa8\") " pod="openstack/aodh-e087-account-create-update-26jph" Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.544764 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/781a418d-080c-4487-b8df-9f33d7e2caa8-operator-scripts\") pod \"aodh-e087-account-create-update-26jph\" (UID: \"781a418d-080c-4487-b8df-9f33d7e2caa8\") " pod="openstack/aodh-e087-account-create-update-26jph" Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.565922 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg9dk\" (UniqueName: 
\"kubernetes.io/projected/781a418d-080c-4487-b8df-9f33d7e2caa8-kube-api-access-hg9dk\") pod \"aodh-e087-account-create-update-26jph\" (UID: \"781a418d-080c-4487-b8df-9f33d7e2caa8\") " pod="openstack/aodh-e087-account-create-update-26jph" Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.618522 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-e087-account-create-update-26jph" Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.798593 4878 generic.go:334] "Generic (PLEG): container finished" podID="58d7b8ef-c077-4e32-9345-3b402def9fce" containerID="dbfd6fcb4461b04362a2ccb1c131969624cad4e3fb350c3e497f97f80ead7749" exitCode=0 Dec 02 18:39:45 crc kubenswrapper[4878]: I1202 18:39:45.798799 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d7b8ef-c077-4e32-9345-3b402def9fce","Type":"ContainerDied","Data":"dbfd6fcb4461b04362a2ccb1c131969624cad4e3fb350c3e497f97f80ead7749"} Dec 02 18:39:46 crc kubenswrapper[4878]: I1202 18:39:46.059639 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-5llm6"] Dec 02 18:39:46 crc kubenswrapper[4878]: W1202 18:39:46.237266 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod781a418d_080c_4487_b8df_9f33d7e2caa8.slice/crio-99351035a86a53c82a349f2f3c01d8676a5d181c5c32b5bbdfced940f3101bd3 WatchSource:0}: Error finding container 99351035a86a53c82a349f2f3c01d8676a5d181c5c32b5bbdfced940f3101bd3: Status 404 returned error can't find the container with id 99351035a86a53c82a349f2f3c01d8676a5d181c5c32b5bbdfced940f3101bd3 Dec 02 18:39:46 crc kubenswrapper[4878]: I1202 18:39:46.239895 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-e087-account-create-update-26jph"] Dec 02 18:39:46 crc kubenswrapper[4878]: I1202 18:39:46.824109 4878 generic.go:334] "Generic (PLEG): container finished" 
podID="bcceb81c-7764-496d-8695-70e73d5a6ce9" containerID="42d6183555157baf843267cc3d9359125e03c5ca496e039182e58bea08e3da02" exitCode=0 Dec 02 18:39:46 crc kubenswrapper[4878]: I1202 18:39:46.824188 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-5llm6" event={"ID":"bcceb81c-7764-496d-8695-70e73d5a6ce9","Type":"ContainerDied","Data":"42d6183555157baf843267cc3d9359125e03c5ca496e039182e58bea08e3da02"} Dec 02 18:39:46 crc kubenswrapper[4878]: I1202 18:39:46.824587 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-5llm6" event={"ID":"bcceb81c-7764-496d-8695-70e73d5a6ce9","Type":"ContainerStarted","Data":"726e26f973225e9b53cc659df7a17f581e14edd351ff0acd194320a99e49d07f"} Dec 02 18:39:46 crc kubenswrapper[4878]: I1202 18:39:46.831086 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-e087-account-create-update-26jph" event={"ID":"781a418d-080c-4487-b8df-9f33d7e2caa8","Type":"ContainerStarted","Data":"947f25a9149acfe163c64e8f3f3d5027fbf7f5140b7c4fd93eacf345ffdfb140"} Dec 02 18:39:46 crc kubenswrapper[4878]: I1202 18:39:46.831205 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-e087-account-create-update-26jph" event={"ID":"781a418d-080c-4487-b8df-9f33d7e2caa8","Type":"ContainerStarted","Data":"99351035a86a53c82a349f2f3c01d8676a5d181c5c32b5bbdfced940f3101bd3"} Dec 02 18:39:46 crc kubenswrapper[4878]: I1202 18:39:46.872028 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-e087-account-create-update-26jph" podStartSLOduration=1.8720029249999999 podStartE2EDuration="1.872002925s" podCreationTimestamp="2025-12-02 18:39:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:39:46.86995016 +0000 UTC m=+1496.559569041" watchObservedRunningTime="2025-12-02 18:39:46.872002925 +0000 UTC m=+1496.561621806" Dec 02 18:39:47 crc 
kubenswrapper[4878]: I1202 18:39:47.855604 4878 generic.go:334] "Generic (PLEG): container finished" podID="58d7b8ef-c077-4e32-9345-3b402def9fce" containerID="c032305040d4a4f71922dfdd6e4e49068a6713b5664bd177a5292ef58c36f944" exitCode=0 Dec 02 18:39:47 crc kubenswrapper[4878]: I1202 18:39:47.855707 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d7b8ef-c077-4e32-9345-3b402def9fce","Type":"ContainerDied","Data":"c032305040d4a4f71922dfdd6e4e49068a6713b5664bd177a5292ef58c36f944"} Dec 02 18:39:47 crc kubenswrapper[4878]: I1202 18:39:47.861392 4878 generic.go:334] "Generic (PLEG): container finished" podID="781a418d-080c-4487-b8df-9f33d7e2caa8" containerID="947f25a9149acfe163c64e8f3f3d5027fbf7f5140b7c4fd93eacf345ffdfb140" exitCode=0 Dec 02 18:39:47 crc kubenswrapper[4878]: I1202 18:39:47.862198 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-e087-account-create-update-26jph" event={"ID":"781a418d-080c-4487-b8df-9f33d7e2caa8","Type":"ContainerDied","Data":"947f25a9149acfe163c64e8f3f3d5027fbf7f5140b7c4fd93eacf345ffdfb140"} Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.281964 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.297783 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-5llm6" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.437790 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d7b8ef-c077-4e32-9345-3b402def9fce-log-httpd\") pod \"58d7b8ef-c077-4e32-9345-3b402def9fce\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.437865 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-scripts\") pod \"58d7b8ef-c077-4e32-9345-3b402def9fce\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.437896 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5vzs\" (UniqueName: \"kubernetes.io/projected/58d7b8ef-c077-4e32-9345-3b402def9fce-kube-api-access-h5vzs\") pod \"58d7b8ef-c077-4e32-9345-3b402def9fce\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.438159 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d7b8ef-c077-4e32-9345-3b402def9fce-run-httpd\") pod \"58d7b8ef-c077-4e32-9345-3b402def9fce\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.438193 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-sg-core-conf-yaml\") pod \"58d7b8ef-c077-4e32-9345-3b402def9fce\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.438221 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-combined-ca-bundle\") pod \"58d7b8ef-c077-4e32-9345-3b402def9fce\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.438289 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7qm7\" (UniqueName: \"kubernetes.io/projected/bcceb81c-7764-496d-8695-70e73d5a6ce9-kube-api-access-r7qm7\") pod \"bcceb81c-7764-496d-8695-70e73d5a6ce9\" (UID: \"bcceb81c-7764-496d-8695-70e73d5a6ce9\") " Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.438399 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-config-data\") pod \"58d7b8ef-c077-4e32-9345-3b402def9fce\" (UID: \"58d7b8ef-c077-4e32-9345-3b402def9fce\") " Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.438423 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcceb81c-7764-496d-8695-70e73d5a6ce9-operator-scripts\") pod \"bcceb81c-7764-496d-8695-70e73d5a6ce9\" (UID: \"bcceb81c-7764-496d-8695-70e73d5a6ce9\") " Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.439300 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcceb81c-7764-496d-8695-70e73d5a6ce9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bcceb81c-7764-496d-8695-70e73d5a6ce9" (UID: "bcceb81c-7764-496d-8695-70e73d5a6ce9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.439326 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58d7b8ef-c077-4e32-9345-3b402def9fce-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "58d7b8ef-c077-4e32-9345-3b402def9fce" (UID: "58d7b8ef-c077-4e32-9345-3b402def9fce"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.439698 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58d7b8ef-c077-4e32-9345-3b402def9fce-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "58d7b8ef-c077-4e32-9345-3b402def9fce" (UID: "58d7b8ef-c077-4e32-9345-3b402def9fce"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.450486 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-scripts" (OuterVolumeSpecName: "scripts") pod "58d7b8ef-c077-4e32-9345-3b402def9fce" (UID: "58d7b8ef-c077-4e32-9345-3b402def9fce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.450520 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcceb81c-7764-496d-8695-70e73d5a6ce9-kube-api-access-r7qm7" (OuterVolumeSpecName: "kube-api-access-r7qm7") pod "bcceb81c-7764-496d-8695-70e73d5a6ce9" (UID: "bcceb81c-7764-496d-8695-70e73d5a6ce9"). InnerVolumeSpecName "kube-api-access-r7qm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.450767 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58d7b8ef-c077-4e32-9345-3b402def9fce-kube-api-access-h5vzs" (OuterVolumeSpecName: "kube-api-access-h5vzs") pod "58d7b8ef-c077-4e32-9345-3b402def9fce" (UID: "58d7b8ef-c077-4e32-9345-3b402def9fce"). InnerVolumeSpecName "kube-api-access-h5vzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.478475 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "58d7b8ef-c077-4e32-9345-3b402def9fce" (UID: "58d7b8ef-c077-4e32-9345-3b402def9fce"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.547078 4878 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d7b8ef-c077-4e32-9345-3b402def9fce-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.547142 4878 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.547165 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7qm7\" (UniqueName: \"kubernetes.io/projected/bcceb81c-7764-496d-8695-70e73d5a6ce9-kube-api-access-r7qm7\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.547184 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/bcceb81c-7764-496d-8695-70e73d5a6ce9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.547204 4878 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58d7b8ef-c077-4e32-9345-3b402def9fce-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.547220 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.547260 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5vzs\" (UniqueName: \"kubernetes.io/projected/58d7b8ef-c077-4e32-9345-3b402def9fce-kube-api-access-h5vzs\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.558087 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58d7b8ef-c077-4e32-9345-3b402def9fce" (UID: "58d7b8ef-c077-4e32-9345-3b402def9fce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.583540 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-config-data" (OuterVolumeSpecName: "config-data") pod "58d7b8ef-c077-4e32-9345-3b402def9fce" (UID: "58d7b8ef-c077-4e32-9345-3b402def9fce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.649646 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.649687 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58d7b8ef-c077-4e32-9345-3b402def9fce-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.877695 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58d7b8ef-c077-4e32-9345-3b402def9fce","Type":"ContainerDied","Data":"bb0b37d40e8358eb50cadee67443b1c388d1beea2cfc1cebe06e4ee4a8786429"} Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.877748 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.877775 4878 scope.go:117] "RemoveContainer" containerID="e216f8dfe39e9b1c63a014c29c0a83014f45d4563db5eb82702e3add216e379f" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.880920 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-5llm6" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.882458 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-5llm6" event={"ID":"bcceb81c-7764-496d-8695-70e73d5a6ce9","Type":"ContainerDied","Data":"726e26f973225e9b53cc659df7a17f581e14edd351ff0acd194320a99e49d07f"} Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.882531 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="726e26f973225e9b53cc659df7a17f581e14edd351ff0acd194320a99e49d07f" Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.928924 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:39:48 crc kubenswrapper[4878]: I1202 18:39:48.934417 4878 scope.go:117] "RemoveContainer" containerID="8927a329748c3e80e2c313cc2a89d3480ea0fa38674c8f9286fa7ab9d2a5e5a9" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.020547 4878 scope.go:117] "RemoveContainer" containerID="c032305040d4a4f71922dfdd6e4e49068a6713b5664bd177a5292ef58c36f944" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.022671 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.022730 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:39:49 crc kubenswrapper[4878]: E1202 18:39:49.023304 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d7b8ef-c077-4e32-9345-3b402def9fce" containerName="sg-core" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.023322 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d7b8ef-c077-4e32-9345-3b402def9fce" containerName="sg-core" Dec 02 18:39:49 crc kubenswrapper[4878]: E1202 18:39:49.023520 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d7b8ef-c077-4e32-9345-3b402def9fce" containerName="ceilometer-notification-agent" Dec 02 18:39:49 
crc kubenswrapper[4878]: I1202 18:39:49.023534 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d7b8ef-c077-4e32-9345-3b402def9fce" containerName="ceilometer-notification-agent" Dec 02 18:39:49 crc kubenswrapper[4878]: E1202 18:39:49.023549 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcceb81c-7764-496d-8695-70e73d5a6ce9" containerName="mariadb-database-create" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.023556 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcceb81c-7764-496d-8695-70e73d5a6ce9" containerName="mariadb-database-create" Dec 02 18:39:49 crc kubenswrapper[4878]: E1202 18:39:49.023575 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d7b8ef-c077-4e32-9345-3b402def9fce" containerName="proxy-httpd" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.023586 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d7b8ef-c077-4e32-9345-3b402def9fce" containerName="proxy-httpd" Dec 02 18:39:49 crc kubenswrapper[4878]: E1202 18:39:49.023629 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d7b8ef-c077-4e32-9345-3b402def9fce" containerName="ceilometer-central-agent" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.023638 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d7b8ef-c077-4e32-9345-3b402def9fce" containerName="ceilometer-central-agent" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.023918 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="58d7b8ef-c077-4e32-9345-3b402def9fce" containerName="proxy-httpd" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.023938 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcceb81c-7764-496d-8695-70e73d5a6ce9" containerName="mariadb-database-create" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.023984 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="58d7b8ef-c077-4e32-9345-3b402def9fce" 
containerName="ceilometer-central-agent" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.024005 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="58d7b8ef-c077-4e32-9345-3b402def9fce" containerName="sg-core" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.024027 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="58d7b8ef-c077-4e32-9345-3b402def9fce" containerName="ceilometer-notification-agent" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.027343 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.027614 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.031442 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.031542 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.073427 4878 scope.go:117] "RemoveContainer" containerID="dbfd6fcb4461b04362a2ccb1c131969624cad4e3fb350c3e497f97f80ead7749" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.177732 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2rnz\" (UniqueName: \"kubernetes.io/projected/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-kube-api-access-r2rnz\") pod \"ceilometer-0\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.177797 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.177833 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-config-data\") pod \"ceilometer-0\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.177863 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-log-httpd\") pod \"ceilometer-0\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.177900 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.177936 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-run-httpd\") pod \"ceilometer-0\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.177995 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-scripts\") pod \"ceilometer-0\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.285747 4878 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-run-httpd\") pod \"ceilometer-0\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.286034 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-scripts\") pod \"ceilometer-0\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.286131 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2rnz\" (UniqueName: \"kubernetes.io/projected/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-kube-api-access-r2rnz\") pod \"ceilometer-0\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.286174 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.286208 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-config-data\") pod \"ceilometer-0\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.286267 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-log-httpd\") pod \"ceilometer-0\" (UID: 
\"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.286311 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.296494 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-log-httpd\") pod \"ceilometer-0\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.296573 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-run-httpd\") pod \"ceilometer-0\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.306059 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2rnz\" (UniqueName: \"kubernetes.io/projected/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-kube-api-access-r2rnz\") pod \"ceilometer-0\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.306664 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.306766 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-scripts\") pod \"ceilometer-0\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.309933 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.311284 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-config-data\") pod \"ceilometer-0\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.359008 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.520313 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-e087-account-create-update-26jph" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.711614 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/781a418d-080c-4487-b8df-9f33d7e2caa8-operator-scripts\") pod \"781a418d-080c-4487-b8df-9f33d7e2caa8\" (UID: \"781a418d-080c-4487-b8df-9f33d7e2caa8\") " Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.712091 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg9dk\" (UniqueName: \"kubernetes.io/projected/781a418d-080c-4487-b8df-9f33d7e2caa8-kube-api-access-hg9dk\") pod \"781a418d-080c-4487-b8df-9f33d7e2caa8\" (UID: \"781a418d-080c-4487-b8df-9f33d7e2caa8\") " Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.712696 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/781a418d-080c-4487-b8df-9f33d7e2caa8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "781a418d-080c-4487-b8df-9f33d7e2caa8" (UID: "781a418d-080c-4487-b8df-9f33d7e2caa8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.721504 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/781a418d-080c-4487-b8df-9f33d7e2caa8-kube-api-access-hg9dk" (OuterVolumeSpecName: "kube-api-access-hg9dk") pod "781a418d-080c-4487-b8df-9f33d7e2caa8" (UID: "781a418d-080c-4487-b8df-9f33d7e2caa8"). InnerVolumeSpecName "kube-api-access-hg9dk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.816591 4878 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/781a418d-080c-4487-b8df-9f33d7e2caa8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.816637 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg9dk\" (UniqueName: \"kubernetes.io/projected/781a418d-080c-4487-b8df-9f33d7e2caa8-kube-api-access-hg9dk\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.911777 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-e087-account-create-update-26jph" event={"ID":"781a418d-080c-4487-b8df-9f33d7e2caa8","Type":"ContainerDied","Data":"99351035a86a53c82a349f2f3c01d8676a5d181c5c32b5bbdfced940f3101bd3"} Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.911821 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-e087-account-create-update-26jph" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.911824 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99351035a86a53c82a349f2f3c01d8676a5d181c5c32b5bbdfced940f3101bd3" Dec 02 18:39:49 crc kubenswrapper[4878]: I1202 18:39:49.965992 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.570383 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-dfqz2"] Dec 02 18:39:50 crc kubenswrapper[4878]: E1202 18:39:50.571171 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="781a418d-080c-4487-b8df-9f33d7e2caa8" containerName="mariadb-account-create-update" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.571192 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="781a418d-080c-4487-b8df-9f33d7e2caa8" containerName="mariadb-account-create-update" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.571516 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="781a418d-080c-4487-b8df-9f33d7e2caa8" containerName="mariadb-account-create-update" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.574299 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-dfqz2" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.581344 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-f79q6" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.581413 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.581513 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.581753 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.594214 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-dfqz2"] Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.636713 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-config-data\") pod \"aodh-db-sync-dfqz2\" (UID: \"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff\") " pod="openstack/aodh-db-sync-dfqz2" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.637314 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-combined-ca-bundle\") pod \"aodh-db-sync-dfqz2\" (UID: \"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff\") " pod="openstack/aodh-db-sync-dfqz2" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.637457 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-scripts\") pod \"aodh-db-sync-dfqz2\" (UID: \"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff\") " 
pod="openstack/aodh-db-sync-dfqz2" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.637570 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg2r9\" (UniqueName: \"kubernetes.io/projected/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-kube-api-access-wg2r9\") pod \"aodh-db-sync-dfqz2\" (UID: \"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff\") " pod="openstack/aodh-db-sync-dfqz2" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.739038 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-combined-ca-bundle\") pod \"aodh-db-sync-dfqz2\" (UID: \"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff\") " pod="openstack/aodh-db-sync-dfqz2" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.739101 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-scripts\") pod \"aodh-db-sync-dfqz2\" (UID: \"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff\") " pod="openstack/aodh-db-sync-dfqz2" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.739152 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg2r9\" (UniqueName: \"kubernetes.io/projected/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-kube-api-access-wg2r9\") pod \"aodh-db-sync-dfqz2\" (UID: \"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff\") " pod="openstack/aodh-db-sync-dfqz2" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.739319 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-config-data\") pod \"aodh-db-sync-dfqz2\" (UID: \"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff\") " pod="openstack/aodh-db-sync-dfqz2" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.746167 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-scripts\") pod \"aodh-db-sync-dfqz2\" (UID: \"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff\") " pod="openstack/aodh-db-sync-dfqz2" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.747155 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-combined-ca-bundle\") pod \"aodh-db-sync-dfqz2\" (UID: \"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff\") " pod="openstack/aodh-db-sync-dfqz2" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.747931 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-config-data\") pod \"aodh-db-sync-dfqz2\" (UID: \"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff\") " pod="openstack/aodh-db-sync-dfqz2" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.766087 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg2r9\" (UniqueName: \"kubernetes.io/projected/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-kube-api-access-wg2r9\") pod \"aodh-db-sync-dfqz2\" (UID: \"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff\") " pod="openstack/aodh-db-sync-dfqz2" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.906048 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-f79q6" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.914814 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-dfqz2" Dec 02 18:39:50 crc kubenswrapper[4878]: I1202 18:39:50.926275 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81b2e402-c44e-414b-b0c5-fe49c0aaa75e","Type":"ContainerStarted","Data":"95f53b78e506c32b44282ca175883209aedbe0a4f04fd85a6b3f8de3911540b8"} Dec 02 18:39:51 crc kubenswrapper[4878]: I1202 18:39:51.006747 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58d7b8ef-c077-4e32-9345-3b402def9fce" path="/var/lib/kubelet/pods/58d7b8ef-c077-4e32-9345-3b402def9fce/volumes" Dec 02 18:39:51 crc kubenswrapper[4878]: I1202 18:39:51.521877 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-dfqz2"] Dec 02 18:39:51 crc kubenswrapper[4878]: I1202 18:39:51.939193 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-dfqz2" event={"ID":"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff","Type":"ContainerStarted","Data":"2a9fb6a1e0494f6eb71e10a9d481925a5b2b171d28a0f7221ad1e2dc15a5df7a"} Dec 02 18:39:51 crc kubenswrapper[4878]: I1202 18:39:51.940973 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81b2e402-c44e-414b-b0c5-fe49c0aaa75e","Type":"ContainerStarted","Data":"106302f7bf29364375d8a0797c569137b44908aebecd14827d3d15ab5f762391"} Dec 02 18:39:51 crc kubenswrapper[4878]: I1202 18:39:51.941021 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81b2e402-c44e-414b-b0c5-fe49c0aaa75e","Type":"ContainerStarted","Data":"a0491fee9a6ebce2288eb3b26e3f45bae1b1c0d466575cf3f6ff734a15620cb5"} Dec 02 18:39:52 crc kubenswrapper[4878]: I1202 18:39:52.960482 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81b2e402-c44e-414b-b0c5-fe49c0aaa75e","Type":"ContainerStarted","Data":"953bf45cca88eb554756200cbac15a3697e217fb23f916b13a9f08c6d0642291"} Dec 02 18:39:56 crc 
kubenswrapper[4878]: I1202 18:39:56.001894 4878 generic.go:334] "Generic (PLEG): container finished" podID="ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd" containerID="187259752d082987f31dc2461a6ee01d1dc982a58df2030a3772cf41396018d2" exitCode=0 Dec 02 18:39:56 crc kubenswrapper[4878]: I1202 18:39:56.001992 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-l8xrz" event={"ID":"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd","Type":"ContainerDied","Data":"187259752d082987f31dc2461a6ee01d1dc982a58df2030a3772cf41396018d2"} Dec 02 18:39:56 crc kubenswrapper[4878]: I1202 18:39:56.253788 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 18:39:57 crc kubenswrapper[4878]: I1202 18:39:57.017772 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81b2e402-c44e-414b-b0c5-fe49c0aaa75e","Type":"ContainerStarted","Data":"911e632cfb7f741304ab68ad63b33cc66ce688ea853af4754667e1002526973d"} Dec 02 18:39:57 crc kubenswrapper[4878]: I1202 18:39:57.019368 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 18:39:57 crc kubenswrapper[4878]: I1202 18:39:57.020493 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-dfqz2" event={"ID":"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff","Type":"ContainerStarted","Data":"fee7ab1681302243f395cd41adf57ea38404f7fed28ee9617864ad77e3739d16"} Dec 02 18:39:57 crc kubenswrapper[4878]: I1202 18:39:57.052441 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.767951181 podStartE2EDuration="9.052419431s" podCreationTimestamp="2025-12-02 18:39:48 +0000 UTC" firstStartedPulling="2025-12-02 18:39:49.966158522 +0000 UTC m=+1499.655777403" lastFinishedPulling="2025-12-02 18:39:56.250626762 +0000 UTC m=+1505.940245653" observedRunningTime="2025-12-02 18:39:57.043202751 +0000 UTC m=+1506.732821632" 
watchObservedRunningTime="2025-12-02 18:39:57.052419431 +0000 UTC m=+1506.742038312" Dec 02 18:39:57 crc kubenswrapper[4878]: I1202 18:39:57.084054 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-dfqz2" podStartSLOduration=2.392153223 podStartE2EDuration="7.084033362s" podCreationTimestamp="2025-12-02 18:39:50 +0000 UTC" firstStartedPulling="2025-12-02 18:39:51.558671541 +0000 UTC m=+1501.248290422" lastFinishedPulling="2025-12-02 18:39:56.25055167 +0000 UTC m=+1505.940170561" observedRunningTime="2025-12-02 18:39:57.074592366 +0000 UTC m=+1506.764211247" watchObservedRunningTime="2025-12-02 18:39:57.084033362 +0000 UTC m=+1506.773652243" Dec 02 18:39:57 crc kubenswrapper[4878]: I1202 18:39:57.595966 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-l8xrz" Dec 02 18:39:57 crc kubenswrapper[4878]: I1202 18:39:57.749894 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-config-data\") pod \"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd\" (UID: \"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd\") " Dec 02 18:39:57 crc kubenswrapper[4878]: I1202 18:39:57.750308 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwzfl\" (UniqueName: \"kubernetes.io/projected/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-kube-api-access-qwzfl\") pod \"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd\" (UID: \"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd\") " Dec 02 18:39:57 crc kubenswrapper[4878]: I1202 18:39:57.750492 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-scripts\") pod \"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd\" (UID: \"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd\") " Dec 02 18:39:57 crc kubenswrapper[4878]: I1202 
18:39:57.750640 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-combined-ca-bundle\") pod \"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd\" (UID: \"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd\") " Dec 02 18:39:57 crc kubenswrapper[4878]: I1202 18:39:57.758909 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-kube-api-access-qwzfl" (OuterVolumeSpecName: "kube-api-access-qwzfl") pod "ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd" (UID: "ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd"). InnerVolumeSpecName "kube-api-access-qwzfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:39:57 crc kubenswrapper[4878]: I1202 18:39:57.772710 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-scripts" (OuterVolumeSpecName: "scripts") pod "ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd" (UID: "ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:57 crc kubenswrapper[4878]: I1202 18:39:57.805789 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd" (UID: "ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:57 crc kubenswrapper[4878]: I1202 18:39:57.810698 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-config-data" (OuterVolumeSpecName: "config-data") pod "ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd" (UID: "ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:39:57 crc kubenswrapper[4878]: I1202 18:39:57.854042 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:57 crc kubenswrapper[4878]: I1202 18:39:57.854078 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwzfl\" (UniqueName: \"kubernetes.io/projected/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-kube-api-access-qwzfl\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:57 crc kubenswrapper[4878]: I1202 18:39:57.854090 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:57 crc kubenswrapper[4878]: I1202 18:39:57.854098 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:39:58 crc kubenswrapper[4878]: I1202 18:39:58.033031 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-l8xrz" event={"ID":"ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd","Type":"ContainerDied","Data":"e28d7c5f297f292ca91e71bb29bf0d5fb627071af5d945385b2d54e6f709eb53"} Dec 02 18:39:58 crc kubenswrapper[4878]: I1202 18:39:58.033400 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e28d7c5f297f292ca91e71bb29bf0d5fb627071af5d945385b2d54e6f709eb53" Dec 02 18:39:58 crc kubenswrapper[4878]: I1202 18:39:58.033130 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-l8xrz" Dec 02 18:39:58 crc kubenswrapper[4878]: I1202 18:39:58.288001 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 18:39:58 crc kubenswrapper[4878]: E1202 18:39:58.288525 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd" containerName="nova-cell0-conductor-db-sync" Dec 02 18:39:58 crc kubenswrapper[4878]: I1202 18:39:58.288541 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd" containerName="nova-cell0-conductor-db-sync" Dec 02 18:39:58 crc kubenswrapper[4878]: I1202 18:39:58.288778 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd" containerName="nova-cell0-conductor-db-sync" Dec 02 18:39:58 crc kubenswrapper[4878]: I1202 18:39:58.289654 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 18:39:58 crc kubenswrapper[4878]: I1202 18:39:58.292468 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 02 18:39:58 crc kubenswrapper[4878]: I1202 18:39:58.295345 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-j6vzb" Dec 02 18:39:58 crc kubenswrapper[4878]: I1202 18:39:58.309811 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 18:39:58 crc kubenswrapper[4878]: I1202 18:39:58.366338 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5108475-8a24-4bce-a285-f4a26785d6f9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b5108475-8a24-4bce-a285-f4a26785d6f9\") " pod="openstack/nova-cell0-conductor-0" Dec 02 18:39:58 crc kubenswrapper[4878]: 
I1202 18:39:58.366396 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn8gx\" (UniqueName: \"kubernetes.io/projected/b5108475-8a24-4bce-a285-f4a26785d6f9-kube-api-access-sn8gx\") pod \"nova-cell0-conductor-0\" (UID: \"b5108475-8a24-4bce-a285-f4a26785d6f9\") " pod="openstack/nova-cell0-conductor-0" Dec 02 18:39:58 crc kubenswrapper[4878]: I1202 18:39:58.366506 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5108475-8a24-4bce-a285-f4a26785d6f9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b5108475-8a24-4bce-a285-f4a26785d6f9\") " pod="openstack/nova-cell0-conductor-0" Dec 02 18:39:58 crc kubenswrapper[4878]: I1202 18:39:58.468676 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5108475-8a24-4bce-a285-f4a26785d6f9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b5108475-8a24-4bce-a285-f4a26785d6f9\") " pod="openstack/nova-cell0-conductor-0" Dec 02 18:39:58 crc kubenswrapper[4878]: I1202 18:39:58.468733 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn8gx\" (UniqueName: \"kubernetes.io/projected/b5108475-8a24-4bce-a285-f4a26785d6f9-kube-api-access-sn8gx\") pod \"nova-cell0-conductor-0\" (UID: \"b5108475-8a24-4bce-a285-f4a26785d6f9\") " pod="openstack/nova-cell0-conductor-0" Dec 02 18:39:58 crc kubenswrapper[4878]: I1202 18:39:58.468824 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5108475-8a24-4bce-a285-f4a26785d6f9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b5108475-8a24-4bce-a285-f4a26785d6f9\") " pod="openstack/nova-cell0-conductor-0" Dec 02 18:39:58 crc kubenswrapper[4878]: I1202 18:39:58.473463 4878 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5108475-8a24-4bce-a285-f4a26785d6f9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b5108475-8a24-4bce-a285-f4a26785d6f9\") " pod="openstack/nova-cell0-conductor-0" Dec 02 18:39:58 crc kubenswrapper[4878]: I1202 18:39:58.473630 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5108475-8a24-4bce-a285-f4a26785d6f9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b5108475-8a24-4bce-a285-f4a26785d6f9\") " pod="openstack/nova-cell0-conductor-0" Dec 02 18:39:58 crc kubenswrapper[4878]: I1202 18:39:58.489289 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn8gx\" (UniqueName: \"kubernetes.io/projected/b5108475-8a24-4bce-a285-f4a26785d6f9-kube-api-access-sn8gx\") pod \"nova-cell0-conductor-0\" (UID: \"b5108475-8a24-4bce-a285-f4a26785d6f9\") " pod="openstack/nova-cell0-conductor-0" Dec 02 18:39:58 crc kubenswrapper[4878]: I1202 18:39:58.616508 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 18:39:59 crc kubenswrapper[4878]: I1202 18:39:59.225633 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 18:40:00 crc kubenswrapper[4878]: I1202 18:40:00.083715 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b5108475-8a24-4bce-a285-f4a26785d6f9","Type":"ContainerStarted","Data":"814f8bfd655b7609c680c7244ca181ee66a925c3c3c6fcc693576c9c46e918ea"} Dec 02 18:40:00 crc kubenswrapper[4878]: I1202 18:40:00.083818 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b5108475-8a24-4bce-a285-f4a26785d6f9","Type":"ContainerStarted","Data":"04e881dbda5052ac73a04b3ae0b17b770bb7ccc147a3e422e4fdf6aa9d30619b"} Dec 02 18:40:00 crc kubenswrapper[4878]: I1202 18:40:00.083906 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 02 18:40:00 crc kubenswrapper[4878]: I1202 18:40:00.085202 4878 generic.go:334] "Generic (PLEG): container finished" podID="7e54a432-26a0-46eb-b4b2-8f9cc141f4ff" containerID="fee7ab1681302243f395cd41adf57ea38404f7fed28ee9617864ad77e3739d16" exitCode=0 Dec 02 18:40:00 crc kubenswrapper[4878]: I1202 18:40:00.085257 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-dfqz2" event={"ID":"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff","Type":"ContainerDied","Data":"fee7ab1681302243f395cd41adf57ea38404f7fed28ee9617864ad77e3739d16"} Dec 02 18:40:00 crc kubenswrapper[4878]: I1202 18:40:00.123169 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.123144393 podStartE2EDuration="2.123144393s" podCreationTimestamp="2025-12-02 18:39:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 
18:40:00.104672314 +0000 UTC m=+1509.794291235" watchObservedRunningTime="2025-12-02 18:40:00.123144393 +0000 UTC m=+1509.812763284" Dec 02 18:40:01 crc kubenswrapper[4878]: I1202 18:40:01.574120 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-dfqz2" Dec 02 18:40:01 crc kubenswrapper[4878]: I1202 18:40:01.706132 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-scripts\") pod \"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff\" (UID: \"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff\") " Dec 02 18:40:01 crc kubenswrapper[4878]: I1202 18:40:01.706811 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-combined-ca-bundle\") pod \"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff\" (UID: \"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff\") " Dec 02 18:40:01 crc kubenswrapper[4878]: I1202 18:40:01.706958 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-config-data\") pod \"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff\" (UID: \"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff\") " Dec 02 18:40:01 crc kubenswrapper[4878]: I1202 18:40:01.707213 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg2r9\" (UniqueName: \"kubernetes.io/projected/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-kube-api-access-wg2r9\") pod \"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff\" (UID: \"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff\") " Dec 02 18:40:01 crc kubenswrapper[4878]: I1202 18:40:01.713851 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-kube-api-access-wg2r9" (OuterVolumeSpecName: "kube-api-access-wg2r9") pod 
"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff" (UID: "7e54a432-26a0-46eb-b4b2-8f9cc141f4ff"). InnerVolumeSpecName "kube-api-access-wg2r9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:40:01 crc kubenswrapper[4878]: I1202 18:40:01.714578 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-scripts" (OuterVolumeSpecName: "scripts") pod "7e54a432-26a0-46eb-b4b2-8f9cc141f4ff" (UID: "7e54a432-26a0-46eb-b4b2-8f9cc141f4ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:01 crc kubenswrapper[4878]: I1202 18:40:01.742423 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e54a432-26a0-46eb-b4b2-8f9cc141f4ff" (UID: "7e54a432-26a0-46eb-b4b2-8f9cc141f4ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:01 crc kubenswrapper[4878]: I1202 18:40:01.745589 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-config-data" (OuterVolumeSpecName: "config-data") pod "7e54a432-26a0-46eb-b4b2-8f9cc141f4ff" (UID: "7e54a432-26a0-46eb-b4b2-8f9cc141f4ff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:01 crc kubenswrapper[4878]: I1202 18:40:01.810226 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:01 crc kubenswrapper[4878]: I1202 18:40:01.810540 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:01 crc kubenswrapper[4878]: I1202 18:40:01.810628 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:01 crc kubenswrapper[4878]: I1202 18:40:01.810709 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wg2r9\" (UniqueName: \"kubernetes.io/projected/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff-kube-api-access-wg2r9\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:02 crc kubenswrapper[4878]: I1202 18:40:02.118952 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-dfqz2" event={"ID":"7e54a432-26a0-46eb-b4b2-8f9cc141f4ff","Type":"ContainerDied","Data":"2a9fb6a1e0494f6eb71e10a9d481925a5b2b171d28a0f7221ad1e2dc15a5df7a"} Dec 02 18:40:02 crc kubenswrapper[4878]: I1202 18:40:02.119006 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a9fb6a1e0494f6eb71e10a9d481925a5b2b171d28a0f7221ad1e2dc15a5df7a" Dec 02 18:40:02 crc kubenswrapper[4878]: I1202 18:40:02.119106 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-dfqz2" Dec 02 18:40:05 crc kubenswrapper[4878]: I1202 18:40:05.754051 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 02 18:40:05 crc kubenswrapper[4878]: E1202 18:40:05.755204 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e54a432-26a0-46eb-b4b2-8f9cc141f4ff" containerName="aodh-db-sync" Dec 02 18:40:05 crc kubenswrapper[4878]: I1202 18:40:05.755220 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e54a432-26a0-46eb-b4b2-8f9cc141f4ff" containerName="aodh-db-sync" Dec 02 18:40:05 crc kubenswrapper[4878]: I1202 18:40:05.755525 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e54a432-26a0-46eb-b4b2-8f9cc141f4ff" containerName="aodh-db-sync" Dec 02 18:40:05 crc kubenswrapper[4878]: I1202 18:40:05.757762 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 02 18:40:05 crc kubenswrapper[4878]: I1202 18:40:05.763117 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-f79q6" Dec 02 18:40:05 crc kubenswrapper[4878]: I1202 18:40:05.763435 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 02 18:40:05 crc kubenswrapper[4878]: I1202 18:40:05.763650 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 02 18:40:05 crc kubenswrapper[4878]: I1202 18:40:05.780754 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 02 18:40:05 crc kubenswrapper[4878]: I1202 18:40:05.944826 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03c74212-8bb7-45e3-8110-cf65a7288caf-scripts\") pod \"aodh-0\" (UID: \"03c74212-8bb7-45e3-8110-cf65a7288caf\") " pod="openstack/aodh-0" Dec 02 18:40:05 crc kubenswrapper[4878]: I1202 
18:40:05.944865 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c74212-8bb7-45e3-8110-cf65a7288caf-combined-ca-bundle\") pod \"aodh-0\" (UID: \"03c74212-8bb7-45e3-8110-cf65a7288caf\") " pod="openstack/aodh-0" Dec 02 18:40:05 crc kubenswrapper[4878]: I1202 18:40:05.944972 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03c74212-8bb7-45e3-8110-cf65a7288caf-config-data\") pod \"aodh-0\" (UID: \"03c74212-8bb7-45e3-8110-cf65a7288caf\") " pod="openstack/aodh-0" Dec 02 18:40:05 crc kubenswrapper[4878]: I1202 18:40:05.944997 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpfgb\" (UniqueName: \"kubernetes.io/projected/03c74212-8bb7-45e3-8110-cf65a7288caf-kube-api-access-kpfgb\") pod \"aodh-0\" (UID: \"03c74212-8bb7-45e3-8110-cf65a7288caf\") " pod="openstack/aodh-0" Dec 02 18:40:06 crc kubenswrapper[4878]: I1202 18:40:06.047788 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpfgb\" (UniqueName: \"kubernetes.io/projected/03c74212-8bb7-45e3-8110-cf65a7288caf-kube-api-access-kpfgb\") pod \"aodh-0\" (UID: \"03c74212-8bb7-45e3-8110-cf65a7288caf\") " pod="openstack/aodh-0" Dec 02 18:40:06 crc kubenswrapper[4878]: I1202 18:40:06.048662 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03c74212-8bb7-45e3-8110-cf65a7288caf-scripts\") pod \"aodh-0\" (UID: \"03c74212-8bb7-45e3-8110-cf65a7288caf\") " pod="openstack/aodh-0" Dec 02 18:40:06 crc kubenswrapper[4878]: I1202 18:40:06.048766 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/03c74212-8bb7-45e3-8110-cf65a7288caf-combined-ca-bundle\") pod \"aodh-0\" (UID: \"03c74212-8bb7-45e3-8110-cf65a7288caf\") " pod="openstack/aodh-0" Dec 02 18:40:06 crc kubenswrapper[4878]: I1202 18:40:06.048998 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03c74212-8bb7-45e3-8110-cf65a7288caf-config-data\") pod \"aodh-0\" (UID: \"03c74212-8bb7-45e3-8110-cf65a7288caf\") " pod="openstack/aodh-0" Dec 02 18:40:06 crc kubenswrapper[4878]: I1202 18:40:06.181362 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c74212-8bb7-45e3-8110-cf65a7288caf-combined-ca-bundle\") pod \"aodh-0\" (UID: \"03c74212-8bb7-45e3-8110-cf65a7288caf\") " pod="openstack/aodh-0" Dec 02 18:40:06 crc kubenswrapper[4878]: I1202 18:40:06.192252 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03c74212-8bb7-45e3-8110-cf65a7288caf-config-data\") pod \"aodh-0\" (UID: \"03c74212-8bb7-45e3-8110-cf65a7288caf\") " pod="openstack/aodh-0" Dec 02 18:40:06 crc kubenswrapper[4878]: I1202 18:40:06.192642 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03c74212-8bb7-45e3-8110-cf65a7288caf-scripts\") pod \"aodh-0\" (UID: \"03c74212-8bb7-45e3-8110-cf65a7288caf\") " pod="openstack/aodh-0" Dec 02 18:40:06 crc kubenswrapper[4878]: I1202 18:40:06.198863 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpfgb\" (UniqueName: \"kubernetes.io/projected/03c74212-8bb7-45e3-8110-cf65a7288caf-kube-api-access-kpfgb\") pod \"aodh-0\" (UID: \"03c74212-8bb7-45e3-8110-cf65a7288caf\") " pod="openstack/aodh-0" Dec 02 18:40:06 crc kubenswrapper[4878]: I1202 18:40:06.402123 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 02 18:40:07 crc kubenswrapper[4878]: I1202 18:40:06.994414 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 02 18:40:07 crc kubenswrapper[4878]: I1202 18:40:07.210568 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"03c74212-8bb7-45e3-8110-cf65a7288caf","Type":"ContainerStarted","Data":"4bf935943d30c1cd0f803f72341ef44c92717eec2fa8c25c11008a2fc171bab3"} Dec 02 18:40:08 crc kubenswrapper[4878]: I1202 18:40:08.231853 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"03c74212-8bb7-45e3-8110-cf65a7288caf","Type":"ContainerStarted","Data":"93e46e345d332d1966e0deedca63a1b6e3abdade5eea210de41d63721339a601"} Dec 02 18:40:08 crc kubenswrapper[4878]: I1202 18:40:08.680045 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.599327 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-rj842"] Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.601492 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rj842" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.604906 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.607040 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.615839 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-rj842"] Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.724029 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrbfj\" (UniqueName: \"kubernetes.io/projected/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-kube-api-access-xrbfj\") pod \"nova-cell0-cell-mapping-rj842\" (UID: \"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a\") " pod="openstack/nova-cell0-cell-mapping-rj842" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.724099 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-scripts\") pod \"nova-cell0-cell-mapping-rj842\" (UID: \"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a\") " pod="openstack/nova-cell0-cell-mapping-rj842" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.724136 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rj842\" (UID: \"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a\") " pod="openstack/nova-cell0-cell-mapping-rj842" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.724592 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-config-data\") pod \"nova-cell0-cell-mapping-rj842\" (UID: \"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a\") " pod="openstack/nova-cell0-cell-mapping-rj842" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.758380 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.760611 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.770081 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.773682 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.828218 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-config-data\") pod \"nova-cell0-cell-mapping-rj842\" (UID: \"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a\") " pod="openstack/nova-cell0-cell-mapping-rj842" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.828454 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrbfj\" (UniqueName: \"kubernetes.io/projected/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-kube-api-access-xrbfj\") pod \"nova-cell0-cell-mapping-rj842\" (UID: \"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a\") " pod="openstack/nova-cell0-cell-mapping-rj842" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.828492 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-scripts\") pod \"nova-cell0-cell-mapping-rj842\" (UID: 
\"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a\") " pod="openstack/nova-cell0-cell-mapping-rj842" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.828520 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rj842\" (UID: \"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a\") " pod="openstack/nova-cell0-cell-mapping-rj842" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.836277 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.842447 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.843200 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rj842\" (UID: \"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a\") " pod="openstack/nova-cell0-cell-mapping-rj842" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.843577 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-scripts\") pod \"nova-cell0-cell-mapping-rj842\" (UID: \"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a\") " pod="openstack/nova-cell0-cell-mapping-rj842" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.850895 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.851729 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-config-data\") pod 
\"nova-cell0-cell-mapping-rj842\" (UID: \"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a\") " pod="openstack/nova-cell0-cell-mapping-rj842" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.887511 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrbfj\" (UniqueName: \"kubernetes.io/projected/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-kube-api-access-xrbfj\") pod \"nova-cell0-cell-mapping-rj842\" (UID: \"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a\") " pod="openstack/nova-cell0-cell-mapping-rj842" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.907365 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.938345 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfzf7\" (UniqueName: \"kubernetes.io/projected/da60f101-6061-4a42-9333-0c06d5f0e9b1-kube-api-access-nfzf7\") pod \"nova-cell1-novncproxy-0\" (UID: \"da60f101-6061-4a42-9333-0c06d5f0e9b1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.938397 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da60f101-6061-4a42-9333-0c06d5f0e9b1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"da60f101-6061-4a42-9333-0c06d5f0e9b1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.938505 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da60f101-6061-4a42-9333-0c06d5f0e9b1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"da60f101-6061-4a42-9333-0c06d5f0e9b1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.947625 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rj842" Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.989143 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 18:40:09 crc kubenswrapper[4878]: I1202 18:40:09.990978 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:09.999697 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.042112 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.043173 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da60f101-6061-4a42-9333-0c06d5f0e9b1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"da60f101-6061-4a42-9333-0c06d5f0e9b1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.043426 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a44ce8-08a9-4450-b78f-6fd4b53b090a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"94a44ce8-08a9-4450-b78f-6fd4b53b090a\") " pod="openstack/nova-metadata-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.043468 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94a44ce8-08a9-4450-b78f-6fd4b53b090a-config-data\") pod \"nova-metadata-0\" (UID: \"94a44ce8-08a9-4450-b78f-6fd4b53b090a\") " pod="openstack/nova-metadata-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.043520 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/94a44ce8-08a9-4450-b78f-6fd4b53b090a-logs\") pod \"nova-metadata-0\" (UID: \"94a44ce8-08a9-4450-b78f-6fd4b53b090a\") " pod="openstack/nova-metadata-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.043543 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnk88\" (UniqueName: \"kubernetes.io/projected/94a44ce8-08a9-4450-b78f-6fd4b53b090a-kube-api-access-xnk88\") pod \"nova-metadata-0\" (UID: \"94a44ce8-08a9-4450-b78f-6fd4b53b090a\") " pod="openstack/nova-metadata-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.043713 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfzf7\" (UniqueName: \"kubernetes.io/projected/da60f101-6061-4a42-9333-0c06d5f0e9b1-kube-api-access-nfzf7\") pod \"nova-cell1-novncproxy-0\" (UID: \"da60f101-6061-4a42-9333-0c06d5f0e9b1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.043788 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da60f101-6061-4a42-9333-0c06d5f0e9b1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"da60f101-6061-4a42-9333-0c06d5f0e9b1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.049587 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da60f101-6061-4a42-9333-0c06d5f0e9b1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"da60f101-6061-4a42-9333-0c06d5f0e9b1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.087566 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.088566 4878 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da60f101-6061-4a42-9333-0c06d5f0e9b1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"da60f101-6061-4a42-9333-0c06d5f0e9b1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.128306 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfzf7\" (UniqueName: \"kubernetes.io/projected/da60f101-6061-4a42-9333-0c06d5f0e9b1-kube-api-access-nfzf7\") pod \"nova-cell1-novncproxy-0\" (UID: \"da60f101-6061-4a42-9333-0c06d5f0e9b1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.149402 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a44ce8-08a9-4450-b78f-6fd4b53b090a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"94a44ce8-08a9-4450-b78f-6fd4b53b090a\") " pod="openstack/nova-metadata-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.149462 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94a44ce8-08a9-4450-b78f-6fd4b53b090a-config-data\") pod \"nova-metadata-0\" (UID: \"94a44ce8-08a9-4450-b78f-6fd4b53b090a\") " pod="openstack/nova-metadata-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.149491 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94a44ce8-08a9-4450-b78f-6fd4b53b090a-logs\") pod \"nova-metadata-0\" (UID: \"94a44ce8-08a9-4450-b78f-6fd4b53b090a\") " pod="openstack/nova-metadata-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.149517 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnk88\" (UniqueName: 
\"kubernetes.io/projected/94a44ce8-08a9-4450-b78f-6fd4b53b090a-kube-api-access-xnk88\") pod \"nova-metadata-0\" (UID: \"94a44ce8-08a9-4450-b78f-6fd4b53b090a\") " pod="openstack/nova-metadata-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.149630 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbbf6880-1398-4476-b0d0-d340f7231645-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cbbf6880-1398-4476-b0d0-d340f7231645\") " pod="openstack/nova-scheduler-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.149654 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbbf6880-1398-4476-b0d0-d340f7231645-config-data\") pod \"nova-scheduler-0\" (UID: \"cbbf6880-1398-4476-b0d0-d340f7231645\") " pod="openstack/nova-scheduler-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.149733 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qgnf\" (UniqueName: \"kubernetes.io/projected/cbbf6880-1398-4476-b0d0-d340f7231645-kube-api-access-2qgnf\") pod \"nova-scheduler-0\" (UID: \"cbbf6880-1398-4476-b0d0-d340f7231645\") " pod="openstack/nova-scheduler-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.154933 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94a44ce8-08a9-4450-b78f-6fd4b53b090a-logs\") pod \"nova-metadata-0\" (UID: \"94a44ce8-08a9-4450-b78f-6fd4b53b090a\") " pod="openstack/nova-metadata-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.196676 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a44ce8-08a9-4450-b78f-6fd4b53b090a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"94a44ce8-08a9-4450-b78f-6fd4b53b090a\") " pod="openstack/nova-metadata-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.199096 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94a44ce8-08a9-4450-b78f-6fd4b53b090a-config-data\") pod \"nova-metadata-0\" (UID: \"94a44ce8-08a9-4450-b78f-6fd4b53b090a\") " pod="openstack/nova-metadata-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.216792 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnk88\" (UniqueName: \"kubernetes.io/projected/94a44ce8-08a9-4450-b78f-6fd4b53b090a-kube-api-access-xnk88\") pod \"nova-metadata-0\" (UID: \"94a44ce8-08a9-4450-b78f-6fd4b53b090a\") " pod="openstack/nova-metadata-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.222097 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.224465 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.229767 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.252631 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qgnf\" (UniqueName: \"kubernetes.io/projected/cbbf6880-1398-4476-b0d0-d340f7231645-kube-api-access-2qgnf\") pod \"nova-scheduler-0\" (UID: \"cbbf6880-1398-4476-b0d0-d340f7231645\") " pod="openstack/nova-scheduler-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.252846 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbbf6880-1398-4476-b0d0-d340f7231645-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cbbf6880-1398-4476-b0d0-d340f7231645\") " pod="openstack/nova-scheduler-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.252873 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbbf6880-1398-4476-b0d0-d340f7231645-config-data\") pod \"nova-scheduler-0\" (UID: \"cbbf6880-1398-4476-b0d0-d340f7231645\") " pod="openstack/nova-scheduler-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.267599 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbbf6880-1398-4476-b0d0-d340f7231645-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cbbf6880-1398-4476-b0d0-d340f7231645\") " pod="openstack/nova-scheduler-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.268663 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbbf6880-1398-4476-b0d0-d340f7231645-config-data\") pod \"nova-scheduler-0\" (UID: \"cbbf6880-1398-4476-b0d0-d340f7231645\") " 
pod="openstack/nova-scheduler-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.273965 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-758d7bc895-6l69g"] Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.274939 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qgnf\" (UniqueName: \"kubernetes.io/projected/cbbf6880-1398-4476-b0d0-d340f7231645-kube-api-access-2qgnf\") pod \"nova-scheduler-0\" (UID: \"cbbf6880-1398-4476-b0d0-d340f7231645\") " pod="openstack/nova-scheduler-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.276218 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.296318 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.310688 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-758d7bc895-6l69g"] Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.343622 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.350982 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.355303 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkzx2\" (UniqueName: \"kubernetes.io/projected/6305987e-e2c7-4f75-b080-9bed005f003f-kube-api-access-pkzx2\") pod \"dnsmasq-dns-758d7bc895-6l69g\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") " pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.355381 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/369791de-d9df-4032-a66f-859fae1cbb28-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"369791de-d9df-4032-a66f-859fae1cbb28\") " pod="openstack/nova-api-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.355413 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-dns-swift-storage-0\") pod \"dnsmasq-dns-758d7bc895-6l69g\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") " pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.355697 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-config\") pod \"dnsmasq-dns-758d7bc895-6l69g\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") " pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.355921 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2k7v\" (UniqueName: \"kubernetes.io/projected/369791de-d9df-4032-a66f-859fae1cbb28-kube-api-access-n2k7v\") pod \"nova-api-0\" (UID: 
\"369791de-d9df-4032-a66f-859fae1cbb28\") " pod="openstack/nova-api-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.355991 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/369791de-d9df-4032-a66f-859fae1cbb28-config-data\") pod \"nova-api-0\" (UID: \"369791de-d9df-4032-a66f-859fae1cbb28\") " pod="openstack/nova-api-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.356049 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-ovsdbserver-sb\") pod \"dnsmasq-dns-758d7bc895-6l69g\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") " pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.356157 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-dns-svc\") pod \"dnsmasq-dns-758d7bc895-6l69g\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") " pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.356300 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-ovsdbserver-nb\") pod \"dnsmasq-dns-758d7bc895-6l69g\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") " pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.356419 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/369791de-d9df-4032-a66f-859fae1cbb28-logs\") pod \"nova-api-0\" (UID: \"369791de-d9df-4032-a66f-859fae1cbb28\") " pod="openstack/nova-api-0" Dec 
02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.403780 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.459204 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-ovsdbserver-nb\") pod \"dnsmasq-dns-758d7bc895-6l69g\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") " pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.459404 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/369791de-d9df-4032-a66f-859fae1cbb28-logs\") pod \"nova-api-0\" (UID: \"369791de-d9df-4032-a66f-859fae1cbb28\") " pod="openstack/nova-api-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.459489 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkzx2\" (UniqueName: \"kubernetes.io/projected/6305987e-e2c7-4f75-b080-9bed005f003f-kube-api-access-pkzx2\") pod \"dnsmasq-dns-758d7bc895-6l69g\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") " pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.459593 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/369791de-d9df-4032-a66f-859fae1cbb28-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"369791de-d9df-4032-a66f-859fae1cbb28\") " pod="openstack/nova-api-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.459647 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-dns-swift-storage-0\") pod \"dnsmasq-dns-758d7bc895-6l69g\" (UID: 
\"6305987e-e2c7-4f75-b080-9bed005f003f\") " pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.459700 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-config\") pod \"dnsmasq-dns-758d7bc895-6l69g\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") " pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.459785 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2k7v\" (UniqueName: \"kubernetes.io/projected/369791de-d9df-4032-a66f-859fae1cbb28-kube-api-access-n2k7v\") pod \"nova-api-0\" (UID: \"369791de-d9df-4032-a66f-859fae1cbb28\") " pod="openstack/nova-api-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.459841 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/369791de-d9df-4032-a66f-859fae1cbb28-config-data\") pod \"nova-api-0\" (UID: \"369791de-d9df-4032-a66f-859fae1cbb28\") " pod="openstack/nova-api-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.459897 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-ovsdbserver-sb\") pod \"dnsmasq-dns-758d7bc895-6l69g\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") " pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.459949 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-dns-svc\") pod \"dnsmasq-dns-758d7bc895-6l69g\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") " pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.461397 
4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-dns-svc\") pod \"dnsmasq-dns-758d7bc895-6l69g\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") " pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.461491 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-config\") pod \"dnsmasq-dns-758d7bc895-6l69g\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") " pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.462012 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/369791de-d9df-4032-a66f-859fae1cbb28-logs\") pod \"nova-api-0\" (UID: \"369791de-d9df-4032-a66f-859fae1cbb28\") " pod="openstack/nova-api-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.462133 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-ovsdbserver-sb\") pod \"dnsmasq-dns-758d7bc895-6l69g\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") " pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.464488 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/369791de-d9df-4032-a66f-859fae1cbb28-config-data\") pod \"nova-api-0\" (UID: \"369791de-d9df-4032-a66f-859fae1cbb28\") " pod="openstack/nova-api-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.465531 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-ovsdbserver-nb\") pod \"dnsmasq-dns-758d7bc895-6l69g\" 
(UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") " pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.465576 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-dns-swift-storage-0\") pod \"dnsmasq-dns-758d7bc895-6l69g\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") " pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.474001 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/369791de-d9df-4032-a66f-859fae1cbb28-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"369791de-d9df-4032-a66f-859fae1cbb28\") " pod="openstack/nova-api-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.488017 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkzx2\" (UniqueName: \"kubernetes.io/projected/6305987e-e2c7-4f75-b080-9bed005f003f-kube-api-access-pkzx2\") pod \"dnsmasq-dns-758d7bc895-6l69g\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") " pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.501971 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2k7v\" (UniqueName: \"kubernetes.io/projected/369791de-d9df-4032-a66f-859fae1cbb28-kube-api-access-n2k7v\") pod \"nova-api-0\" (UID: \"369791de-d9df-4032-a66f-859fae1cbb28\") " pod="openstack/nova-api-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.602436 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 18:40:10 crc kubenswrapper[4878]: I1202 18:40:10.645068 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:11 crc kubenswrapper[4878]: I1202 18:40:11.714785 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-rj842"] Dec 02 18:40:11 crc kubenswrapper[4878]: I1202 18:40:11.750882 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.011389 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.297785 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.298492 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" containerName="ceilometer-central-agent" containerID="cri-o://a0491fee9a6ebce2288eb3b26e3f45bae1b1c0d466575cf3f6ff734a15620cb5" gracePeriod=30 Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.299456 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" containerName="proxy-httpd" containerID="cri-o://911e632cfb7f741304ab68ad63b33cc66ce688ea853af4754667e1002526973d" gracePeriod=30 Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.299612 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" containerName="sg-core" containerID="cri-o://953bf45cca88eb554756200cbac15a3697e217fb23f916b13a9f08c6d0642291" gracePeriod=30 Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.299702 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" containerName="ceilometer-notification-agent" 
containerID="cri-o://106302f7bf29364375d8a0797c569137b44908aebecd14827d3d15ab5f762391" gracePeriod=30 Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.324150 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.386783 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rj842" event={"ID":"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a","Type":"ContainerStarted","Data":"c71c2f290ea3c39e21bca870922144c336e0f472d32c341c42049e2f5691e7ec"} Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.387268 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rj842" event={"ID":"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a","Type":"ContainerStarted","Data":"60f831f0fefddcec79ab945768e24c075b29fdcfc46955708cb8207d8b8d75ef"} Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.420146 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-758d7bc895-6l69g"] Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.432134 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"03c74212-8bb7-45e3-8110-cf65a7288caf","Type":"ContainerStarted","Data":"53dafd16d64aa4851c8a415495e8215ec1788f1e234a098ddcf8b1f2e59a0e98"} Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.457141 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.457191 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"da60f101-6061-4a42-9333-0c06d5f0e9b1","Type":"ContainerStarted","Data":"7f5c90cdc211acae02247089c4afd439a65f3b37ea0d00d9ca5a1e7df4971dca"} Dec 02 18:40:12 crc kubenswrapper[4878]: W1202 18:40:12.481765 4878 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbbf6880_1398_4476_b0d0_d340f7231645.slice/crio-ac49eb0a234473802caf6e51b08afe97e4d5ed5661330fb91096dab1e57fddc5 WatchSource:0}: Error finding container ac49eb0a234473802caf6e51b08afe97e4d5ed5661330fb91096dab1e57fddc5: Status 404 returned error can't find the container with id ac49eb0a234473802caf6e51b08afe97e4d5ed5661330fb91096dab1e57fddc5 Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.482119 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"369791de-d9df-4032-a66f-859fae1cbb28","Type":"ContainerStarted","Data":"b657bca485decce35adb659e949a388b4083085ff02b140d573ea570ecc23648"} Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.486471 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-rj842" podStartSLOduration=3.486345402 podStartE2EDuration="3.486345402s" podCreationTimestamp="2025-12-02 18:40:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:40:12.435169636 +0000 UTC m=+1522.124788507" watchObservedRunningTime="2025-12-02 18:40:12.486345402 +0000 UTC m=+1522.175964283" Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.559735 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.749076 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-w76q8"] Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.751209 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-w76q8" Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.758872 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.759365 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.784957 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-w76q8"] Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.925121 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dda2e4cb-b62a-40c0-a3d8-5a427b609472-scripts\") pod \"nova-cell1-conductor-db-sync-w76q8\" (UID: \"dda2e4cb-b62a-40c0-a3d8-5a427b609472\") " pod="openstack/nova-cell1-conductor-db-sync-w76q8" Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.925313 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dda2e4cb-b62a-40c0-a3d8-5a427b609472-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-w76q8\" (UID: \"dda2e4cb-b62a-40c0-a3d8-5a427b609472\") " pod="openstack/nova-cell1-conductor-db-sync-w76q8" Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.925402 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dda2e4cb-b62a-40c0-a3d8-5a427b609472-config-data\") pod \"nova-cell1-conductor-db-sync-w76q8\" (UID: \"dda2e4cb-b62a-40c0-a3d8-5a427b609472\") " pod="openstack/nova-cell1-conductor-db-sync-w76q8" Dec 02 18:40:12 crc kubenswrapper[4878]: I1202 18:40:12.925438 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rmrch\" (UniqueName: \"kubernetes.io/projected/dda2e4cb-b62a-40c0-a3d8-5a427b609472-kube-api-access-rmrch\") pod \"nova-cell1-conductor-db-sync-w76q8\" (UID: \"dda2e4cb-b62a-40c0-a3d8-5a427b609472\") " pod="openstack/nova-cell1-conductor-db-sync-w76q8" Dec 02 18:40:13 crc kubenswrapper[4878]: I1202 18:40:13.027521 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dda2e4cb-b62a-40c0-a3d8-5a427b609472-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-w76q8\" (UID: \"dda2e4cb-b62a-40c0-a3d8-5a427b609472\") " pod="openstack/nova-cell1-conductor-db-sync-w76q8" Dec 02 18:40:13 crc kubenswrapper[4878]: I1202 18:40:13.027625 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dda2e4cb-b62a-40c0-a3d8-5a427b609472-config-data\") pod \"nova-cell1-conductor-db-sync-w76q8\" (UID: \"dda2e4cb-b62a-40c0-a3d8-5a427b609472\") " pod="openstack/nova-cell1-conductor-db-sync-w76q8" Dec 02 18:40:13 crc kubenswrapper[4878]: I1202 18:40:13.027660 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmrch\" (UniqueName: \"kubernetes.io/projected/dda2e4cb-b62a-40c0-a3d8-5a427b609472-kube-api-access-rmrch\") pod \"nova-cell1-conductor-db-sync-w76q8\" (UID: \"dda2e4cb-b62a-40c0-a3d8-5a427b609472\") " pod="openstack/nova-cell1-conductor-db-sync-w76q8" Dec 02 18:40:13 crc kubenswrapper[4878]: I1202 18:40:13.027749 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dda2e4cb-b62a-40c0-a3d8-5a427b609472-scripts\") pod \"nova-cell1-conductor-db-sync-w76q8\" (UID: \"dda2e4cb-b62a-40c0-a3d8-5a427b609472\") " pod="openstack/nova-cell1-conductor-db-sync-w76q8" Dec 02 18:40:13 crc kubenswrapper[4878]: I1202 18:40:13.036403 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dda2e4cb-b62a-40c0-a3d8-5a427b609472-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-w76q8\" (UID: \"dda2e4cb-b62a-40c0-a3d8-5a427b609472\") " pod="openstack/nova-cell1-conductor-db-sync-w76q8" Dec 02 18:40:13 crc kubenswrapper[4878]: I1202 18:40:13.052787 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dda2e4cb-b62a-40c0-a3d8-5a427b609472-scripts\") pod \"nova-cell1-conductor-db-sync-w76q8\" (UID: \"dda2e4cb-b62a-40c0-a3d8-5a427b609472\") " pod="openstack/nova-cell1-conductor-db-sync-w76q8" Dec 02 18:40:13 crc kubenswrapper[4878]: I1202 18:40:13.054399 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dda2e4cb-b62a-40c0-a3d8-5a427b609472-config-data\") pod \"nova-cell1-conductor-db-sync-w76q8\" (UID: \"dda2e4cb-b62a-40c0-a3d8-5a427b609472\") " pod="openstack/nova-cell1-conductor-db-sync-w76q8" Dec 02 18:40:13 crc kubenswrapper[4878]: I1202 18:40:13.066080 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmrch\" (UniqueName: \"kubernetes.io/projected/dda2e4cb-b62a-40c0-a3d8-5a427b609472-kube-api-access-rmrch\") pod \"nova-cell1-conductor-db-sync-w76q8\" (UID: \"dda2e4cb-b62a-40c0-a3d8-5a427b609472\") " pod="openstack/nova-cell1-conductor-db-sync-w76q8" Dec 02 18:40:13 crc kubenswrapper[4878]: I1202 18:40:13.086422 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-w76q8" Dec 02 18:40:13 crc kubenswrapper[4878]: I1202 18:40:13.526611 4878 generic.go:334] "Generic (PLEG): container finished" podID="6305987e-e2c7-4f75-b080-9bed005f003f" containerID="eb5e420f5ed70b768be40c7798b2dc4ad33736bacd88cc8069b44b2ee10f35cd" exitCode=0 Dec 02 18:40:13 crc kubenswrapper[4878]: I1202 18:40:13.527013 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758d7bc895-6l69g" event={"ID":"6305987e-e2c7-4f75-b080-9bed005f003f","Type":"ContainerDied","Data":"eb5e420f5ed70b768be40c7798b2dc4ad33736bacd88cc8069b44b2ee10f35cd"} Dec 02 18:40:13 crc kubenswrapper[4878]: I1202 18:40:13.527047 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758d7bc895-6l69g" event={"ID":"6305987e-e2c7-4f75-b080-9bed005f003f","Type":"ContainerStarted","Data":"0859d0b5fabd8d35fbaa8ec14e1db0448c3881ce8f9d8b410dd0de448ef9d6f1"} Dec 02 18:40:13 crc kubenswrapper[4878]: I1202 18:40:13.583185 4878 generic.go:334] "Generic (PLEG): container finished" podID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" containerID="911e632cfb7f741304ab68ad63b33cc66ce688ea853af4754667e1002526973d" exitCode=0 Dec 02 18:40:13 crc kubenswrapper[4878]: I1202 18:40:13.583731 4878 generic.go:334] "Generic (PLEG): container finished" podID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" containerID="953bf45cca88eb554756200cbac15a3697e217fb23f916b13a9f08c6d0642291" exitCode=2 Dec 02 18:40:13 crc kubenswrapper[4878]: I1202 18:40:13.583995 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81b2e402-c44e-414b-b0c5-fe49c0aaa75e","Type":"ContainerDied","Data":"911e632cfb7f741304ab68ad63b33cc66ce688ea853af4754667e1002526973d"} Dec 02 18:40:13 crc kubenswrapper[4878]: I1202 18:40:13.584511 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"81b2e402-c44e-414b-b0c5-fe49c0aaa75e","Type":"ContainerDied","Data":"953bf45cca88eb554756200cbac15a3697e217fb23f916b13a9f08c6d0642291"} Dec 02 18:40:13 crc kubenswrapper[4878]: I1202 18:40:13.595519 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cbbf6880-1398-4476-b0d0-d340f7231645","Type":"ContainerStarted","Data":"ac49eb0a234473802caf6e51b08afe97e4d5ed5661330fb91096dab1e57fddc5"} Dec 02 18:40:13 crc kubenswrapper[4878]: I1202 18:40:13.600082 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"94a44ce8-08a9-4450-b78f-6fd4b53b090a","Type":"ContainerStarted","Data":"70243cc4246a4e3080f1363be2221379e0e559ad06fc47517b2771482ea0fa35"} Dec 02 18:40:13 crc kubenswrapper[4878]: I1202 18:40:13.937980 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-w76q8"] Dec 02 18:40:14 crc kubenswrapper[4878]: I1202 18:40:14.614376 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 18:40:14 crc kubenswrapper[4878]: I1202 18:40:14.631928 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-w76q8" event={"ID":"dda2e4cb-b62a-40c0-a3d8-5a427b609472","Type":"ContainerStarted","Data":"9b5a8a7e2fd581e2c6f7082289e64b5fdc9d48505b6ef1a828d6d42788b56404"} Dec 02 18:40:14 crc kubenswrapper[4878]: I1202 18:40:14.631986 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-w76q8" event={"ID":"dda2e4cb-b62a-40c0-a3d8-5a427b609472","Type":"ContainerStarted","Data":"d89510e39bdaa3cf02e0a4f7487f58ec7e1a7e88d1a36be02f37e76d26ceffcc"} Dec 02 18:40:14 crc kubenswrapper[4878]: I1202 18:40:14.637950 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 18:40:14 crc kubenswrapper[4878]: I1202 18:40:14.660562 4878 generic.go:334] "Generic (PLEG): container finished" 
podID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" containerID="a0491fee9a6ebce2288eb3b26e3f45bae1b1c0d466575cf3f6ff734a15620cb5" exitCode=0 Dec 02 18:40:14 crc kubenswrapper[4878]: I1202 18:40:14.660615 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81b2e402-c44e-414b-b0c5-fe49c0aaa75e","Type":"ContainerDied","Data":"a0491fee9a6ebce2288eb3b26e3f45bae1b1c0d466575cf3f6ff734a15620cb5"} Dec 02 18:40:14 crc kubenswrapper[4878]: I1202 18:40:14.662970 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-w76q8" podStartSLOduration=2.66294697 podStartE2EDuration="2.66294697s" podCreationTimestamp="2025-12-02 18:40:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:40:14.655884019 +0000 UTC m=+1524.345502900" watchObservedRunningTime="2025-12-02 18:40:14.66294697 +0000 UTC m=+1524.352565851" Dec 02 18:40:15 crc kubenswrapper[4878]: I1202 18:40:15.676260 4878 generic.go:334] "Generic (PLEG): container finished" podID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" containerID="106302f7bf29364375d8a0797c569137b44908aebecd14827d3d15ab5f762391" exitCode=0 Dec 02 18:40:15 crc kubenswrapper[4878]: I1202 18:40:15.676342 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81b2e402-c44e-414b-b0c5-fe49c0aaa75e","Type":"ContainerDied","Data":"106302f7bf29364375d8a0797c569137b44908aebecd14827d3d15ab5f762391"} Dec 02 18:40:15 crc kubenswrapper[4878]: I1202 18:40:15.682439 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758d7bc895-6l69g" event={"ID":"6305987e-e2c7-4f75-b080-9bed005f003f","Type":"ContainerStarted","Data":"0932c8c715307b869e16674e1569432e6dedd6e9a2f8ed169b4f99ff68b55052"} Dec 02 18:40:15 crc kubenswrapper[4878]: I1202 18:40:15.705123 4878 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/dnsmasq-dns-758d7bc895-6l69g" podStartSLOduration=5.705094957 podStartE2EDuration="5.705094957s" podCreationTimestamp="2025-12-02 18:40:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:40:15.699413038 +0000 UTC m=+1525.389031919" watchObservedRunningTime="2025-12-02 18:40:15.705094957 +0000 UTC m=+1525.394713838" Dec 02 18:40:16 crc kubenswrapper[4878]: I1202 18:40:16.696334 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:18 crc kubenswrapper[4878]: I1202 18:40:18.725292 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81b2e402-c44e-414b-b0c5-fe49c0aaa75e","Type":"ContainerDied","Data":"95f53b78e506c32b44282ca175883209aedbe0a4f04fd85a6b3f8de3911540b8"} Dec 02 18:40:18 crc kubenswrapper[4878]: I1202 18:40:18.725867 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95f53b78e506c32b44282ca175883209aedbe0a4f04fd85a6b3f8de3911540b8" Dec 02 18:40:19 crc kubenswrapper[4878]: I1202 18:40:19.086638 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:40:19 crc kubenswrapper[4878]: I1202 18:40:19.229132 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-config-data\") pod \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " Dec 02 18:40:19 crc kubenswrapper[4878]: I1202 18:40:19.229303 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-scripts\") pod \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " Dec 02 18:40:19 crc kubenswrapper[4878]: I1202 18:40:19.229448 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-combined-ca-bundle\") pod \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " Dec 02 18:40:19 crc kubenswrapper[4878]: I1202 18:40:19.229487 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-run-httpd\") pod \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " Dec 02 18:40:19 crc kubenswrapper[4878]: I1202 18:40:19.229571 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-sg-core-conf-yaml\") pod \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " Dec 02 18:40:19 crc kubenswrapper[4878]: I1202 18:40:19.229665 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2rnz\" (UniqueName: 
\"kubernetes.io/projected/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-kube-api-access-r2rnz\") pod \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " Dec 02 18:40:19 crc kubenswrapper[4878]: I1202 18:40:19.229750 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-log-httpd\") pod \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\" (UID: \"81b2e402-c44e-414b-b0c5-fe49c0aaa75e\") " Dec 02 18:40:19 crc kubenswrapper[4878]: I1202 18:40:19.231549 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "81b2e402-c44e-414b-b0c5-fe49c0aaa75e" (UID: "81b2e402-c44e-414b-b0c5-fe49c0aaa75e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:40:19 crc kubenswrapper[4878]: I1202 18:40:19.235593 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "81b2e402-c44e-414b-b0c5-fe49c0aaa75e" (UID: "81b2e402-c44e-414b-b0c5-fe49c0aaa75e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:40:19 crc kubenswrapper[4878]: I1202 18:40:19.278208 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-scripts" (OuterVolumeSpecName: "scripts") pod "81b2e402-c44e-414b-b0c5-fe49c0aaa75e" (UID: "81b2e402-c44e-414b-b0c5-fe49c0aaa75e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:19 crc kubenswrapper[4878]: I1202 18:40:19.279408 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-kube-api-access-r2rnz" (OuterVolumeSpecName: "kube-api-access-r2rnz") pod "81b2e402-c44e-414b-b0c5-fe49c0aaa75e" (UID: "81b2e402-c44e-414b-b0c5-fe49c0aaa75e"). InnerVolumeSpecName "kube-api-access-r2rnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:40:19 crc kubenswrapper[4878]: I1202 18:40:19.334904 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2rnz\" (UniqueName: \"kubernetes.io/projected/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-kube-api-access-r2rnz\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:19 crc kubenswrapper[4878]: I1202 18:40:19.334991 4878 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:19 crc kubenswrapper[4878]: I1202 18:40:19.335003 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:19 crc kubenswrapper[4878]: I1202 18:40:19.335020 4878 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:19 crc kubenswrapper[4878]: I1202 18:40:19.737265 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.031432 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "81b2e402-c44e-414b-b0c5-fe49c0aaa75e" (UID: "81b2e402-c44e-414b-b0c5-fe49c0aaa75e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.062372 4878 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.121458 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81b2e402-c44e-414b-b0c5-fe49c0aaa75e" (UID: "81b2e402-c44e-414b-b0c5-fe49c0aaa75e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.128159 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-config-data" (OuterVolumeSpecName: "config-data") pod "81b2e402-c44e-414b-b0c5-fe49c0aaa75e" (UID: "81b2e402-c44e-414b-b0c5-fe49c0aaa75e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.164862 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.164909 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81b2e402-c44e-414b-b0c5-fe49c0aaa75e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.410594 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.443938 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.462820 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:40:20 crc kubenswrapper[4878]: E1202 18:40:20.463538 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" containerName="ceilometer-central-agent" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.463560 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" containerName="ceilometer-central-agent" Dec 02 18:40:20 crc kubenswrapper[4878]: E1202 18:40:20.463605 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" containerName="ceilometer-notification-agent" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.463612 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" containerName="ceilometer-notification-agent" Dec 02 18:40:20 crc kubenswrapper[4878]: E1202 18:40:20.463625 4878 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" containerName="proxy-httpd" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.463633 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" containerName="proxy-httpd" Dec 02 18:40:20 crc kubenswrapper[4878]: E1202 18:40:20.463646 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" containerName="sg-core" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.463652 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" containerName="sg-core" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.463919 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" containerName="sg-core" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.463940 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" containerName="ceilometer-notification-agent" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.463956 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" containerName="proxy-httpd" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.463974 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" containerName="ceilometer-central-agent" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.466411 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.469620 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.470011 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.494263 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.582115 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr2kk\" (UniqueName: \"kubernetes.io/projected/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-kube-api-access-xr2kk\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.582172 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-scripts\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.582196 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-run-httpd\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.582217 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-config-data\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " 
pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.582275 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.582349 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-log-httpd\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.582393 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.647421 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-758d7bc895-6l69g" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.685608 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-scripts\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.685652 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-run-httpd\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " 
pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.685684 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-config-data\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.686402 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-run-httpd\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.685719 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.686706 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-log-httpd\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.686776 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.687101 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr2kk\" (UniqueName: 
\"kubernetes.io/projected/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-kube-api-access-xr2kk\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.687161 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-log-httpd\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.697255 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-scripts\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.700707 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-config-data\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.728578 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.728992 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.737207 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr2kk\" (UniqueName: \"kubernetes.io/projected/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-kube-api-access-xr2kk\") pod \"ceilometer-0\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.841552 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.846520 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5db8f85467-fpfmd"] Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.847080 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" podUID="843928c8-a35c-43b5-a1e6-88199ac743cc" containerName="dnsmasq-dns" containerID="cri-o://81287d7ab892f600a3d3636669f2e13032d25a2a9f29f73b230c6f9dce77378a" gracePeriod=10 Dec 02 18:40:20 crc kubenswrapper[4878]: I1202 18:40:20.931170 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"03c74212-8bb7-45e3-8110-cf65a7288caf","Type":"ContainerStarted","Data":"588b83fede1aace8241b9039de9115d1211dd4865a8071f52c33d39b53355a88"} Dec 02 18:40:21 crc kubenswrapper[4878]: I1202 18:40:21.081008 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="94a44ce8-08a9-4450-b78f-6fd4b53b090a" containerName="nova-metadata-log" containerID="cri-o://e45a554f8a6aaa8a881fbafd9cf2ba4ebe921a5881acad4d3b5a899d9422ce75" gracePeriod=30 Dec 02 18:40:21 crc kubenswrapper[4878]: I1202 18:40:21.081204 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="da60f101-6061-4a42-9333-0c06d5f0e9b1" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://3ba806db5a3e223d7cede3e77dd56a6bcc72d2c6a1ddec1bf3736efb50760a00" 
gracePeriod=30 Dec 02 18:40:21 crc kubenswrapper[4878]: I1202 18:40:21.081326 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="94a44ce8-08a9-4450-b78f-6fd4b53b090a" containerName="nova-metadata-metadata" containerID="cri-o://d0ea1247fbbf818dd4a6371bf20b1a42dece706dd27e4cbde88b3300ed50a9ca" gracePeriod=30 Dec 02 18:40:21 crc kubenswrapper[4878]: I1202 18:40:21.246802 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=6.025518572 podStartE2EDuration="12.24677557s" podCreationTimestamp="2025-12-02 18:40:09 +0000 UTC" firstStartedPulling="2025-12-02 18:40:12.518642134 +0000 UTC m=+1522.208261015" lastFinishedPulling="2025-12-02 18:40:18.739899132 +0000 UTC m=+1528.429518013" observedRunningTime="2025-12-02 18:40:21.224508592 +0000 UTC m=+1530.914127473" watchObservedRunningTime="2025-12-02 18:40:21.24677557 +0000 UTC m=+1530.936394451" Dec 02 18:40:21 crc kubenswrapper[4878]: I1202 18:40:21.270184 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=6.113771859 podStartE2EDuration="12.270162283s" podCreationTimestamp="2025-12-02 18:40:09 +0000 UTC" firstStartedPulling="2025-12-02 18:40:12.587109892 +0000 UTC m=+1522.276728773" lastFinishedPulling="2025-12-02 18:40:18.743500316 +0000 UTC m=+1528.433119197" observedRunningTime="2025-12-02 18:40:21.257067463 +0000 UTC m=+1530.946686344" watchObservedRunningTime="2025-12-02 18:40:21.270162283 +0000 UTC m=+1530.959781184" Dec 02 18:40:21 crc kubenswrapper[4878]: I1202 18:40:21.324470 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.6326996430000005 podStartE2EDuration="11.324429596s" podCreationTimestamp="2025-12-02 18:40:10 +0000 UTC" firstStartedPulling="2025-12-02 18:40:12.053004281 +0000 UTC m=+1521.742623162" 
lastFinishedPulling="2025-12-02 18:40:18.744734214 +0000 UTC m=+1528.434353115" observedRunningTime="2025-12-02 18:40:21.293583408 +0000 UTC m=+1530.983202289" watchObservedRunningTime="2025-12-02 18:40:21.324429596 +0000 UTC m=+1531.014048477" Dec 02 18:40:21 crc kubenswrapper[4878]: I1202 18:40:21.341684 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=5.386987105 podStartE2EDuration="12.341663926s" podCreationTimestamp="2025-12-02 18:40:09 +0000 UTC" firstStartedPulling="2025-12-02 18:40:11.785224831 +0000 UTC m=+1521.474843712" lastFinishedPulling="2025-12-02 18:40:18.739901652 +0000 UTC m=+1528.429520533" observedRunningTime="2025-12-02 18:40:21.325102207 +0000 UTC m=+1531.014721078" watchObservedRunningTime="2025-12-02 18:40:21.341663926 +0000 UTC m=+1531.031282807" Dec 02 18:40:21 crc kubenswrapper[4878]: I1202 18:40:21.496348 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81b2e402-c44e-414b-b0c5-fe49c0aaa75e" path="/var/lib/kubelet/pods/81b2e402-c44e-414b-b0c5-fe49c0aaa75e/volumes" Dec 02 18:40:21 crc kubenswrapper[4878]: I1202 18:40:21.497642 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"da60f101-6061-4a42-9333-0c06d5f0e9b1","Type":"ContainerStarted","Data":"3ba806db5a3e223d7cede3e77dd56a6bcc72d2c6a1ddec1bf3736efb50760a00"} Dec 02 18:40:21 crc kubenswrapper[4878]: I1202 18:40:21.497694 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cbbf6880-1398-4476-b0d0-d340f7231645","Type":"ContainerStarted","Data":"ec4e50f4367ee2a9c5db68738f890879716269b99487ceb25acedd68ca5c4e8a"} Dec 02 18:40:21 crc kubenswrapper[4878]: I1202 18:40:21.497717 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"94a44ce8-08a9-4450-b78f-6fd4b53b090a","Type":"ContainerStarted","Data":"d0ea1247fbbf818dd4a6371bf20b1a42dece706dd27e4cbde88b3300ed50a9ca"} Dec 02 18:40:21 crc kubenswrapper[4878]: I1202 18:40:21.497740 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"94a44ce8-08a9-4450-b78f-6fd4b53b090a","Type":"ContainerStarted","Data":"e45a554f8a6aaa8a881fbafd9cf2ba4ebe921a5881acad4d3b5a899d9422ce75"} Dec 02 18:40:21 crc kubenswrapper[4878]: I1202 18:40:21.497752 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"369791de-d9df-4032-a66f-859fae1cbb28","Type":"ContainerStarted","Data":"4ac5d5a156e7bdc249352dcfd68f7e5e258384342bf737889657029e25fabade"} Dec 02 18:40:21 crc kubenswrapper[4878]: I1202 18:40:21.497762 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"369791de-d9df-4032-a66f-859fae1cbb28","Type":"ContainerStarted","Data":"44ec18be225657ae3d2140416ba9735e96c96130414f8e9b43343a167f351fe1"} Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.002535 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.114712 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.126525 4878 generic.go:334] "Generic (PLEG): container finished" podID="843928c8-a35c-43b5-a1e6-88199ac743cc" containerID="81287d7ab892f600a3d3636669f2e13032d25a2a9f29f73b230c6f9dce77378a" exitCode=0 Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.126650 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" event={"ID":"843928c8-a35c-43b5-a1e6-88199ac743cc","Type":"ContainerDied","Data":"81287d7ab892f600a3d3636669f2e13032d25a2a9f29f73b230c6f9dce77378a"} Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.126683 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" event={"ID":"843928c8-a35c-43b5-a1e6-88199ac743cc","Type":"ContainerDied","Data":"7eda2f73d2784113e40295ab0f4cf641e0f8ab3a28da8ee57b34eeca9a5b053e"} Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.126706 4878 scope.go:117] "RemoveContainer" containerID="81287d7ab892f600a3d3636669f2e13032d25a2a9f29f73b230c6f9dce77378a" Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.144263 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19","Type":"ContainerStarted","Data":"adcb4e3fe946a97c2bf9ae3545126a5377a091c4ac71d379b38518915dd62001"} Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.146278 4878 generic.go:334] "Generic (PLEG): container finished" podID="94a44ce8-08a9-4450-b78f-6fd4b53b090a" containerID="e45a554f8a6aaa8a881fbafd9cf2ba4ebe921a5881acad4d3b5a899d9422ce75" exitCode=143 Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.147434 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"94a44ce8-08a9-4450-b78f-6fd4b53b090a","Type":"ContainerDied","Data":"e45a554f8a6aaa8a881fbafd9cf2ba4ebe921a5881acad4d3b5a899d9422ce75"} Dec 02 
18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.220318 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-ovsdbserver-sb\") pod \"843928c8-a35c-43b5-a1e6-88199ac743cc\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.220620 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-ovsdbserver-nb\") pod \"843928c8-a35c-43b5-a1e6-88199ac743cc\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.220816 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-config\") pod \"843928c8-a35c-43b5-a1e6-88199ac743cc\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.220924 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw258\" (UniqueName: \"kubernetes.io/projected/843928c8-a35c-43b5-a1e6-88199ac743cc-kube-api-access-qw258\") pod \"843928c8-a35c-43b5-a1e6-88199ac743cc\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.220996 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-dns-svc\") pod \"843928c8-a35c-43b5-a1e6-88199ac743cc\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.221096 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-dns-swift-storage-0\") pod \"843928c8-a35c-43b5-a1e6-88199ac743cc\" (UID: \"843928c8-a35c-43b5-a1e6-88199ac743cc\") " Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.245649 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/843928c8-a35c-43b5-a1e6-88199ac743cc-kube-api-access-qw258" (OuterVolumeSpecName: "kube-api-access-qw258") pod "843928c8-a35c-43b5-a1e6-88199ac743cc" (UID: "843928c8-a35c-43b5-a1e6-88199ac743cc"). InnerVolumeSpecName "kube-api-access-qw258". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.260438 4878 scope.go:117] "RemoveContainer" containerID="7a28009b49985ef5e625fae9c33c7c1cb36fc9bee9a84bd5558e87d300441382" Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.338366 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qw258\" (UniqueName: \"kubernetes.io/projected/843928c8-a35c-43b5-a1e6-88199ac743cc-kube-api-access-qw258\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.394547 4878 scope.go:117] "RemoveContainer" containerID="81287d7ab892f600a3d3636669f2e13032d25a2a9f29f73b230c6f9dce77378a" Dec 02 18:40:22 crc kubenswrapper[4878]: E1202 18:40:22.408170 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81287d7ab892f600a3d3636669f2e13032d25a2a9f29f73b230c6f9dce77378a\": container with ID starting with 81287d7ab892f600a3d3636669f2e13032d25a2a9f29f73b230c6f9dce77378a not found: ID does not exist" containerID="81287d7ab892f600a3d3636669f2e13032d25a2a9f29f73b230c6f9dce77378a" Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.408273 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81287d7ab892f600a3d3636669f2e13032d25a2a9f29f73b230c6f9dce77378a"} err="failed to get 
container status \"81287d7ab892f600a3d3636669f2e13032d25a2a9f29f73b230c6f9dce77378a\": rpc error: code = NotFound desc = could not find container \"81287d7ab892f600a3d3636669f2e13032d25a2a9f29f73b230c6f9dce77378a\": container with ID starting with 81287d7ab892f600a3d3636669f2e13032d25a2a9f29f73b230c6f9dce77378a not found: ID does not exist" Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.408306 4878 scope.go:117] "RemoveContainer" containerID="7a28009b49985ef5e625fae9c33c7c1cb36fc9bee9a84bd5558e87d300441382" Dec 02 18:40:22 crc kubenswrapper[4878]: E1202 18:40:22.419373 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a28009b49985ef5e625fae9c33c7c1cb36fc9bee9a84bd5558e87d300441382\": container with ID starting with 7a28009b49985ef5e625fae9c33c7c1cb36fc9bee9a84bd5558e87d300441382 not found: ID does not exist" containerID="7a28009b49985ef5e625fae9c33c7c1cb36fc9bee9a84bd5558e87d300441382" Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.419426 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a28009b49985ef5e625fae9c33c7c1cb36fc9bee9a84bd5558e87d300441382"} err="failed to get container status \"7a28009b49985ef5e625fae9c33c7c1cb36fc9bee9a84bd5558e87d300441382\": rpc error: code = NotFound desc = could not find container \"7a28009b49985ef5e625fae9c33c7c1cb36fc9bee9a84bd5558e87d300441382\": container with ID starting with 7a28009b49985ef5e625fae9c33c7c1cb36fc9bee9a84bd5558e87d300441382 not found: ID does not exist" Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.427042 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "843928c8-a35c-43b5-a1e6-88199ac743cc" (UID: "843928c8-a35c-43b5-a1e6-88199ac743cc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.436310 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "843928c8-a35c-43b5-a1e6-88199ac743cc" (UID: "843928c8-a35c-43b5-a1e6-88199ac743cc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.438591 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "843928c8-a35c-43b5-a1e6-88199ac743cc" (UID: "843928c8-a35c-43b5-a1e6-88199ac743cc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.441653 4878 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.441678 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.441690 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.450113 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod 
"843928c8-a35c-43b5-a1e6-88199ac743cc" (UID: "843928c8-a35c-43b5-a1e6-88199ac743cc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.490855 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-config" (OuterVolumeSpecName: "config") pod "843928c8-a35c-43b5-a1e6-88199ac743cc" (UID: "843928c8-a35c-43b5-a1e6-88199ac743cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.546619 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.546887 4878 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/843928c8-a35c-43b5-a1e6-88199ac743cc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.801510 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 18:40:22 crc kubenswrapper[4878]: I1202 18:40:22.804116 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="3bc2339a-313d-485a-b67f-d18b597c36e5" containerName="kube-state-metrics" containerID="cri-o://bd37ed7af3c342730a3178bc613dab9f0723342a4e89cdeb62d8567806f3f9e2" gracePeriod=30 Dec 02 18:40:23 crc kubenswrapper[4878]: I1202 18:40:23.036477 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 02 18:40:23 crc kubenswrapper[4878]: I1202 18:40:23.036762 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" 
podUID="a5cf94af-3f03-4d2f-9f7b-bb91d0322c54" containerName="mysqld-exporter" containerID="cri-o://97376565e713d1ae53c4c2732e707a792e928bf838763d84edd735934089b938" gracePeriod=30 Dec 02 18:40:23 crc kubenswrapper[4878]: I1202 18:40:23.271945 4878 generic.go:334] "Generic (PLEG): container finished" podID="3bc2339a-313d-485a-b67f-d18b597c36e5" containerID="bd37ed7af3c342730a3178bc613dab9f0723342a4e89cdeb62d8567806f3f9e2" exitCode=2 Dec 02 18:40:23 crc kubenswrapper[4878]: I1202 18:40:23.272313 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3bc2339a-313d-485a-b67f-d18b597c36e5","Type":"ContainerDied","Data":"bd37ed7af3c342730a3178bc613dab9f0723342a4e89cdeb62d8567806f3f9e2"} Dec 02 18:40:23 crc kubenswrapper[4878]: I1202 18:40:23.288802 4878 generic.go:334] "Generic (PLEG): container finished" podID="94a44ce8-08a9-4450-b78f-6fd4b53b090a" containerID="d0ea1247fbbf818dd4a6371bf20b1a42dece706dd27e4cbde88b3300ed50a9ca" exitCode=0 Dec 02 18:40:23 crc kubenswrapper[4878]: I1202 18:40:23.288930 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"94a44ce8-08a9-4450-b78f-6fd4b53b090a","Type":"ContainerDied","Data":"d0ea1247fbbf818dd4a6371bf20b1a42dece706dd27e4cbde88b3300ed50a9ca"} Dec 02 18:40:23 crc kubenswrapper[4878]: I1202 18:40:23.318095 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5db8f85467-fpfmd" Dec 02 18:40:23 crc kubenswrapper[4878]: I1202 18:40:23.328034 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19","Type":"ContainerStarted","Data":"7ece543cc46d2bcf68613e78d4d588c5caf1eeeb94490b86729156f34f24c52f"} Dec 02 18:40:23 crc kubenswrapper[4878]: I1202 18:40:23.374650 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5db8f85467-fpfmd"] Dec 02 18:40:23 crc kubenswrapper[4878]: I1202 18:40:23.386039 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5db8f85467-fpfmd"] Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.369270 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3bc2339a-313d-485a-b67f-d18b597c36e5","Type":"ContainerDied","Data":"11176b2624068f064077e9bf2f2268b44fe86d9cc335351bf51914d6031a3905"} Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.369599 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11176b2624068f064077e9bf2f2268b44fe86d9cc335351bf51914d6031a3905" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.384199 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"94a44ce8-08a9-4450-b78f-6fd4b53b090a","Type":"ContainerDied","Data":"70243cc4246a4e3080f1363be2221379e0e559ad06fc47517b2771482ea0fa35"} Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.384273 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70243cc4246a4e3080f1363be2221379e0e559ad06fc47517b2771482ea0fa35" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.389077 4878 generic.go:334] "Generic (PLEG): container finished" podID="a5cf94af-3f03-4d2f-9f7b-bb91d0322c54" containerID="97376565e713d1ae53c4c2732e707a792e928bf838763d84edd735934089b938" exitCode=2 Dec 02 
18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.389137 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"a5cf94af-3f03-4d2f-9f7b-bb91d0322c54","Type":"ContainerDied","Data":"97376565e713d1ae53c4c2732e707a792e928bf838763d84edd735934089b938"} Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.389172 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"a5cf94af-3f03-4d2f-9f7b-bb91d0322c54","Type":"ContainerDied","Data":"c7df923115c99927b6e20585f5379acc66689bb69eed251c72d5631e52112ac5"} Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.389187 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7df923115c99927b6e20585f5379acc66689bb69eed251c72d5631e52112ac5" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.435778 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.453028 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.457923 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.555623 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5cf94af-3f03-4d2f-9f7b-bb91d0322c54-config-data\") pod \"a5cf94af-3f03-4d2f-9f7b-bb91d0322c54\" (UID: \"a5cf94af-3f03-4d2f-9f7b-bb91d0322c54\") " Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.555697 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnk88\" (UniqueName: \"kubernetes.io/projected/94a44ce8-08a9-4450-b78f-6fd4b53b090a-kube-api-access-xnk88\") pod \"94a44ce8-08a9-4450-b78f-6fd4b53b090a\" (UID: \"94a44ce8-08a9-4450-b78f-6fd4b53b090a\") " Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.555767 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94a44ce8-08a9-4450-b78f-6fd4b53b090a-config-data\") pod \"94a44ce8-08a9-4450-b78f-6fd4b53b090a\" (UID: \"94a44ce8-08a9-4450-b78f-6fd4b53b090a\") " Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.555787 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk6r5\" (UniqueName: \"kubernetes.io/projected/a5cf94af-3f03-4d2f-9f7b-bb91d0322c54-kube-api-access-wk6r5\") pod \"a5cf94af-3f03-4d2f-9f7b-bb91d0322c54\" (UID: \"a5cf94af-3f03-4d2f-9f7b-bb91d0322c54\") " Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.555923 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a44ce8-08a9-4450-b78f-6fd4b53b090a-combined-ca-bundle\") pod \"94a44ce8-08a9-4450-b78f-6fd4b53b090a\" (UID: \"94a44ce8-08a9-4450-b78f-6fd4b53b090a\") " Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.555971 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/94a44ce8-08a9-4450-b78f-6fd4b53b090a-logs\") pod \"94a44ce8-08a9-4450-b78f-6fd4b53b090a\" (UID: \"94a44ce8-08a9-4450-b78f-6fd4b53b090a\") " Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.556032 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hlzc\" (UniqueName: \"kubernetes.io/projected/3bc2339a-313d-485a-b67f-d18b597c36e5-kube-api-access-5hlzc\") pod \"3bc2339a-313d-485a-b67f-d18b597c36e5\" (UID: \"3bc2339a-313d-485a-b67f-d18b597c36e5\") " Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.556083 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5cf94af-3f03-4d2f-9f7b-bb91d0322c54-combined-ca-bundle\") pod \"a5cf94af-3f03-4d2f-9f7b-bb91d0322c54\" (UID: \"a5cf94af-3f03-4d2f-9f7b-bb91d0322c54\") " Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.559709 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a44ce8-08a9-4450-b78f-6fd4b53b090a-logs" (OuterVolumeSpecName: "logs") pod "94a44ce8-08a9-4450-b78f-6fd4b53b090a" (UID: "94a44ce8-08a9-4450-b78f-6fd4b53b090a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.570045 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5cf94af-3f03-4d2f-9f7b-bb91d0322c54-kube-api-access-wk6r5" (OuterVolumeSpecName: "kube-api-access-wk6r5") pod "a5cf94af-3f03-4d2f-9f7b-bb91d0322c54" (UID: "a5cf94af-3f03-4d2f-9f7b-bb91d0322c54"). InnerVolumeSpecName "kube-api-access-wk6r5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.597991 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a44ce8-08a9-4450-b78f-6fd4b53b090a-kube-api-access-xnk88" (OuterVolumeSpecName: "kube-api-access-xnk88") pod "94a44ce8-08a9-4450-b78f-6fd4b53b090a" (UID: "94a44ce8-08a9-4450-b78f-6fd4b53b090a"). InnerVolumeSpecName "kube-api-access-xnk88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.625682 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bc2339a-313d-485a-b67f-d18b597c36e5-kube-api-access-5hlzc" (OuterVolumeSpecName: "kube-api-access-5hlzc") pod "3bc2339a-313d-485a-b67f-d18b597c36e5" (UID: "3bc2339a-313d-485a-b67f-d18b597c36e5"). InnerVolumeSpecName "kube-api-access-5hlzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.661194 4878 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94a44ce8-08a9-4450-b78f-6fd4b53b090a-logs\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.661227 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hlzc\" (UniqueName: \"kubernetes.io/projected/3bc2339a-313d-485a-b67f-d18b597c36e5-kube-api-access-5hlzc\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.661254 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnk88\" (UniqueName: \"kubernetes.io/projected/94a44ce8-08a9-4450-b78f-6fd4b53b090a-kube-api-access-xnk88\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.661265 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk6r5\" (UniqueName: 
\"kubernetes.io/projected/a5cf94af-3f03-4d2f-9f7b-bb91d0322c54-kube-api-access-wk6r5\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.669813 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5cf94af-3f03-4d2f-9f7b-bb91d0322c54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5cf94af-3f03-4d2f-9f7b-bb91d0322c54" (UID: "a5cf94af-3f03-4d2f-9f7b-bb91d0322c54"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.716275 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94a44ce8-08a9-4450-b78f-6fd4b53b090a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94a44ce8-08a9-4450-b78f-6fd4b53b090a" (UID: "94a44ce8-08a9-4450-b78f-6fd4b53b090a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.746652 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94a44ce8-08a9-4450-b78f-6fd4b53b090a-config-data" (OuterVolumeSpecName: "config-data") pod "94a44ce8-08a9-4450-b78f-6fd4b53b090a" (UID: "94a44ce8-08a9-4450-b78f-6fd4b53b090a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.783138 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5cf94af-3f03-4d2f-9f7b-bb91d0322c54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.783176 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94a44ce8-08a9-4450-b78f-6fd4b53b090a-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.791616 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a44ce8-08a9-4450-b78f-6fd4b53b090a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.800546 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5cf94af-3f03-4d2f-9f7b-bb91d0322c54-config-data" (OuterVolumeSpecName: "config-data") pod "a5cf94af-3f03-4d2f-9f7b-bb91d0322c54" (UID: "a5cf94af-3f03-4d2f-9f7b-bb91d0322c54"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.894716 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5cf94af-3f03-4d2f-9f7b-bb91d0322c54-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:24 crc kubenswrapper[4878]: I1202 18:40:24.973389 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="843928c8-a35c-43b5-a1e6-88199ac743cc" path="/var/lib/kubelet/pods/843928c8-a35c-43b5-a1e6-88199ac743cc/volumes" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.351982 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.404198 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.406528 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"03c74212-8bb7-45e3-8110-cf65a7288caf","Type":"ContainerStarted","Data":"615060e1af2dfe7fbd02b0d8cce6b59ed021604872494c07d588aefe025b7b19"} Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.406829 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="03c74212-8bb7-45e3-8110-cf65a7288caf" containerName="aodh-api" containerID="cri-o://93e46e345d332d1966e0deedca63a1b6e3abdade5eea210de41d63721339a601" gracePeriod=30 Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.407593 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="03c74212-8bb7-45e3-8110-cf65a7288caf" containerName="aodh-listener" containerID="cri-o://615060e1af2dfe7fbd02b0d8cce6b59ed021604872494c07d588aefe025b7b19" gracePeriod=30 Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.407715 4878 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/aodh-0" podUID="03c74212-8bb7-45e3-8110-cf65a7288caf" containerName="aodh-notifier" containerID="cri-o://588b83fede1aace8241b9039de9115d1211dd4865a8071f52c33d39b53355a88" gracePeriod=30 Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.407826 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="03c74212-8bb7-45e3-8110-cf65a7288caf" containerName="aodh-evaluator" containerID="cri-o://53dafd16d64aa4851c8a415495e8215ec1788f1e234a098ddcf8b1f2e59a0e98" gracePeriod=30 Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.417513 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.417612 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19","Type":"ContainerStarted","Data":"2a6593d91b422dc79f67891cd29f8ded1580ae110ad3f86841d09b8b168b0fd8"} Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.417667 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.417747 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.436197 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=3.149122105 podStartE2EDuration="20.436172519s" podCreationTimestamp="2025-12-02 18:40:05 +0000 UTC" firstStartedPulling="2025-12-02 18:40:07.017492892 +0000 UTC m=+1516.707111773" lastFinishedPulling="2025-12-02 18:40:24.304543306 +0000 UTC m=+1533.994162187" observedRunningTime="2025-12-02 18:40:25.432156284 +0000 UTC m=+1535.121775185" watchObservedRunningTime="2025-12-02 18:40:25.436172519 +0000 UTC m=+1535.125791400" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.518703 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.562339 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.610502 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.635376 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.652812 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 18:40:25 crc kubenswrapper[4878]: E1202 18:40:25.653627 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="843928c8-a35c-43b5-a1e6-88199ac743cc" containerName="dnsmasq-dns" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.653648 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="843928c8-a35c-43b5-a1e6-88199ac743cc" containerName="dnsmasq-dns" Dec 02 18:40:25 crc kubenswrapper[4878]: E1202 18:40:25.653701 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="843928c8-a35c-43b5-a1e6-88199ac743cc" 
containerName="init" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.653711 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="843928c8-a35c-43b5-a1e6-88199ac743cc" containerName="init" Dec 02 18:40:25 crc kubenswrapper[4878]: E1202 18:40:25.653729 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5cf94af-3f03-4d2f-9f7b-bb91d0322c54" containerName="mysqld-exporter" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.653741 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5cf94af-3f03-4d2f-9f7b-bb91d0322c54" containerName="mysqld-exporter" Dec 02 18:40:25 crc kubenswrapper[4878]: E1202 18:40:25.653755 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc2339a-313d-485a-b67f-d18b597c36e5" containerName="kube-state-metrics" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.653764 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc2339a-313d-485a-b67f-d18b597c36e5" containerName="kube-state-metrics" Dec 02 18:40:25 crc kubenswrapper[4878]: E1202 18:40:25.653794 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a44ce8-08a9-4450-b78f-6fd4b53b090a" containerName="nova-metadata-metadata" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.653802 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a44ce8-08a9-4450-b78f-6fd4b53b090a" containerName="nova-metadata-metadata" Dec 02 18:40:25 crc kubenswrapper[4878]: E1202 18:40:25.653821 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a44ce8-08a9-4450-b78f-6fd4b53b090a" containerName="nova-metadata-log" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.653828 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a44ce8-08a9-4450-b78f-6fd4b53b090a" containerName="nova-metadata-log" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.655763 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5cf94af-3f03-4d2f-9f7b-bb91d0322c54" 
containerName="mysqld-exporter" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.655792 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="94a44ce8-08a9-4450-b78f-6fd4b53b090a" containerName="nova-metadata-log" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.655799 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="843928c8-a35c-43b5-a1e6-88199ac743cc" containerName="dnsmasq-dns" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.655817 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc2339a-313d-485a-b67f-d18b597c36e5" containerName="kube-state-metrics" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.655833 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="94a44ce8-08a9-4450-b78f-6fd4b53b090a" containerName="nova-metadata-metadata" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.657328 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.665693 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.665886 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.669282 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.670911 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.673544 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.674964 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.689943 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.705134 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.724921 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.742452 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.766670 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.768723 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.773616 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.773849 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.781020 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.841525 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef40d82-36b6-4b25-879e-93b3fcefe72d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\") " pod="openstack/nova-metadata-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.841635 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/763df430-6f7f-4642-9452-1fcc5d47d283-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"763df430-6f7f-4642-9452-1fcc5d47d283\") " pod="openstack/kube-state-metrics-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.841684 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef40d82-36b6-4b25-879e-93b3fcefe72d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\") " pod="openstack/nova-metadata-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.841750 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzpj5\" (UniqueName: 
\"kubernetes.io/projected/6ef40d82-36b6-4b25-879e-93b3fcefe72d-kube-api-access-dzpj5\") pod \"nova-metadata-0\" (UID: \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\") " pod="openstack/nova-metadata-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.841780 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763df430-6f7f-4642-9452-1fcc5d47d283-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"763df430-6f7f-4642-9452-1fcc5d47d283\") " pod="openstack/kube-state-metrics-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.841881 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef40d82-36b6-4b25-879e-93b3fcefe72d-logs\") pod \"nova-metadata-0\" (UID: \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\") " pod="openstack/nova-metadata-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.841983 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/763df430-6f7f-4642-9452-1fcc5d47d283-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"763df430-6f7f-4642-9452-1fcc5d47d283\") " pod="openstack/kube-state-metrics-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.842058 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfvs4\" (UniqueName: \"kubernetes.io/projected/763df430-6f7f-4642-9452-1fcc5d47d283-kube-api-access-kfvs4\") pod \"kube-state-metrics-0\" (UID: \"763df430-6f7f-4642-9452-1fcc5d47d283\") " pod="openstack/kube-state-metrics-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.842341 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6ef40d82-36b6-4b25-879e-93b3fcefe72d-config-data\") pod \"nova-metadata-0\" (UID: \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\") " pod="openstack/nova-metadata-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.948165 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bbb9528-81ba-487f-bf86-44276e8ac969-config-data\") pod \"mysqld-exporter-0\" (UID: \"1bbb9528-81ba-487f-bf86-44276e8ac969\") " pod="openstack/mysqld-exporter-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.948529 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bbb9528-81ba-487f-bf86-44276e8ac969-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"1bbb9528-81ba-487f-bf86-44276e8ac969\") " pod="openstack/mysqld-exporter-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.948614 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef40d82-36b6-4b25-879e-93b3fcefe72d-config-data\") pod \"nova-metadata-0\" (UID: \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\") " pod="openstack/nova-metadata-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.951394 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef40d82-36b6-4b25-879e-93b3fcefe72d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\") " pod="openstack/nova-metadata-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.951488 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/763df430-6f7f-4642-9452-1fcc5d47d283-kube-state-metrics-tls-config\") pod 
\"kube-state-metrics-0\" (UID: \"763df430-6f7f-4642-9452-1fcc5d47d283\") " pod="openstack/kube-state-metrics-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.951523 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef40d82-36b6-4b25-879e-93b3fcefe72d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\") " pod="openstack/nova-metadata-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.951564 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzpj5\" (UniqueName: \"kubernetes.io/projected/6ef40d82-36b6-4b25-879e-93b3fcefe72d-kube-api-access-dzpj5\") pod \"nova-metadata-0\" (UID: \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\") " pod="openstack/nova-metadata-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.951631 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763df430-6f7f-4642-9452-1fcc5d47d283-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"763df430-6f7f-4642-9452-1fcc5d47d283\") " pod="openstack/kube-state-metrics-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.951709 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef40d82-36b6-4b25-879e-93b3fcefe72d-logs\") pod \"nova-metadata-0\" (UID: \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\") " pod="openstack/nova-metadata-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.951804 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4lwh\" (UniqueName: \"kubernetes.io/projected/1bbb9528-81ba-487f-bf86-44276e8ac969-kube-api-access-s4lwh\") pod \"mysqld-exporter-0\" (UID: \"1bbb9528-81ba-487f-bf86-44276e8ac969\") " pod="openstack/mysqld-exporter-0" Dec 02 18:40:25 crc 
kubenswrapper[4878]: I1202 18:40:25.951872 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/763df430-6f7f-4642-9452-1fcc5d47d283-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"763df430-6f7f-4642-9452-1fcc5d47d283\") " pod="openstack/kube-state-metrics-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.951911 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bbb9528-81ba-487f-bf86-44276e8ac969-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"1bbb9528-81ba-487f-bf86-44276e8ac969\") " pod="openstack/mysqld-exporter-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.951955 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfvs4\" (UniqueName: \"kubernetes.io/projected/763df430-6f7f-4642-9452-1fcc5d47d283-kube-api-access-kfvs4\") pod \"kube-state-metrics-0\" (UID: \"763df430-6f7f-4642-9452-1fcc5d47d283\") " pod="openstack/kube-state-metrics-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.956769 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef40d82-36b6-4b25-879e-93b3fcefe72d-logs\") pod \"nova-metadata-0\" (UID: \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\") " pod="openstack/nova-metadata-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.961328 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef40d82-36b6-4b25-879e-93b3fcefe72d-config-data\") pod \"nova-metadata-0\" (UID: \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\") " pod="openstack/nova-metadata-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.962121 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/763df430-6f7f-4642-9452-1fcc5d47d283-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"763df430-6f7f-4642-9452-1fcc5d47d283\") " pod="openstack/kube-state-metrics-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.962311 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef40d82-36b6-4b25-879e-93b3fcefe72d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\") " pod="openstack/nova-metadata-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.962591 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef40d82-36b6-4b25-879e-93b3fcefe72d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\") " pod="openstack/nova-metadata-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.963539 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/763df430-6f7f-4642-9452-1fcc5d47d283-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"763df430-6f7f-4642-9452-1fcc5d47d283\") " pod="openstack/kube-state-metrics-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.968888 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763df430-6f7f-4642-9452-1fcc5d47d283-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"763df430-6f7f-4642-9452-1fcc5d47d283\") " pod="openstack/kube-state-metrics-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.989862 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfvs4\" (UniqueName: 
\"kubernetes.io/projected/763df430-6f7f-4642-9452-1fcc5d47d283-kube-api-access-kfvs4\") pod \"kube-state-metrics-0\" (UID: \"763df430-6f7f-4642-9452-1fcc5d47d283\") " pod="openstack/kube-state-metrics-0" Dec 02 18:40:25 crc kubenswrapper[4878]: I1202 18:40:25.993867 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzpj5\" (UniqueName: \"kubernetes.io/projected/6ef40d82-36b6-4b25-879e-93b3fcefe72d-kube-api-access-dzpj5\") pod \"nova-metadata-0\" (UID: \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\") " pod="openstack/nova-metadata-0" Dec 02 18:40:26 crc kubenswrapper[4878]: I1202 18:40:26.054500 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4lwh\" (UniqueName: \"kubernetes.io/projected/1bbb9528-81ba-487f-bf86-44276e8ac969-kube-api-access-s4lwh\") pod \"mysqld-exporter-0\" (UID: \"1bbb9528-81ba-487f-bf86-44276e8ac969\") " pod="openstack/mysqld-exporter-0" Dec 02 18:40:26 crc kubenswrapper[4878]: I1202 18:40:26.054581 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bbb9528-81ba-487f-bf86-44276e8ac969-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"1bbb9528-81ba-487f-bf86-44276e8ac969\") " pod="openstack/mysqld-exporter-0" Dec 02 18:40:26 crc kubenswrapper[4878]: I1202 18:40:26.054695 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bbb9528-81ba-487f-bf86-44276e8ac969-config-data\") pod \"mysqld-exporter-0\" (UID: \"1bbb9528-81ba-487f-bf86-44276e8ac969\") " pod="openstack/mysqld-exporter-0" Dec 02 18:40:26 crc kubenswrapper[4878]: I1202 18:40:26.054719 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bbb9528-81ba-487f-bf86-44276e8ac969-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: 
\"1bbb9528-81ba-487f-bf86-44276e8ac969\") " pod="openstack/mysqld-exporter-0" Dec 02 18:40:26 crc kubenswrapper[4878]: I1202 18:40:26.061087 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bbb9528-81ba-487f-bf86-44276e8ac969-config-data\") pod \"mysqld-exporter-0\" (UID: \"1bbb9528-81ba-487f-bf86-44276e8ac969\") " pod="openstack/mysqld-exporter-0" Dec 02 18:40:26 crc kubenswrapper[4878]: I1202 18:40:26.061544 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bbb9528-81ba-487f-bf86-44276e8ac969-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"1bbb9528-81ba-487f-bf86-44276e8ac969\") " pod="openstack/mysqld-exporter-0" Dec 02 18:40:26 crc kubenswrapper[4878]: I1202 18:40:26.061857 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bbb9528-81ba-487f-bf86-44276e8ac969-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"1bbb9528-81ba-487f-bf86-44276e8ac969\") " pod="openstack/mysqld-exporter-0" Dec 02 18:40:26 crc kubenswrapper[4878]: I1202 18:40:26.080024 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4lwh\" (UniqueName: \"kubernetes.io/projected/1bbb9528-81ba-487f-bf86-44276e8ac969-kube-api-access-s4lwh\") pod \"mysqld-exporter-0\" (UID: \"1bbb9528-81ba-487f-bf86-44276e8ac969\") " pod="openstack/mysqld-exporter-0" Dec 02 18:40:26 crc kubenswrapper[4878]: I1202 18:40:26.144325 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 18:40:26 crc kubenswrapper[4878]: I1202 18:40:26.251272 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 18:40:26 crc kubenswrapper[4878]: I1202 18:40:26.368366 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 02 18:40:26 crc kubenswrapper[4878]: I1202 18:40:26.481902 4878 generic.go:334] "Generic (PLEG): container finished" podID="03c74212-8bb7-45e3-8110-cf65a7288caf" containerID="53dafd16d64aa4851c8a415495e8215ec1788f1e234a098ddcf8b1f2e59a0e98" exitCode=0 Dec 02 18:40:26 crc kubenswrapper[4878]: I1202 18:40:26.481945 4878 generic.go:334] "Generic (PLEG): container finished" podID="03c74212-8bb7-45e3-8110-cf65a7288caf" containerID="93e46e345d332d1966e0deedca63a1b6e3abdade5eea210de41d63721339a601" exitCode=0 Dec 02 18:40:26 crc kubenswrapper[4878]: I1202 18:40:26.481998 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"03c74212-8bb7-45e3-8110-cf65a7288caf","Type":"ContainerDied","Data":"53dafd16d64aa4851c8a415495e8215ec1788f1e234a098ddcf8b1f2e59a0e98"} Dec 02 18:40:26 crc kubenswrapper[4878]: I1202 18:40:26.482032 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"03c74212-8bb7-45e3-8110-cf65a7288caf","Type":"ContainerDied","Data":"93e46e345d332d1966e0deedca63a1b6e3abdade5eea210de41d63721339a601"} Dec 02 18:40:26 crc kubenswrapper[4878]: I1202 18:40:26.499012 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19","Type":"ContainerStarted","Data":"85b18c21ba35742b2b11f4209d43400b0de25e8ac1ece7c40ef08faf02430cfe"} Dec 02 18:40:26 crc kubenswrapper[4878]: I1202 18:40:26.782367 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 18:40:26 crc kubenswrapper[4878]: I1202 18:40:26.973718 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bc2339a-313d-485a-b67f-d18b597c36e5" 
path="/var/lib/kubelet/pods/3bc2339a-313d-485a-b67f-d18b597c36e5/volumes" Dec 02 18:40:26 crc kubenswrapper[4878]: I1202 18:40:26.980737 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94a44ce8-08a9-4450-b78f-6fd4b53b090a" path="/var/lib/kubelet/pods/94a44ce8-08a9-4450-b78f-6fd4b53b090a/volumes" Dec 02 18:40:26 crc kubenswrapper[4878]: I1202 18:40:26.984438 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5cf94af-3f03-4d2f-9f7b-bb91d0322c54" path="/var/lib/kubelet/pods/a5cf94af-3f03-4d2f-9f7b-bb91d0322c54/volumes" Dec 02 18:40:26 crc kubenswrapper[4878]: W1202 18:40:26.986424 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod763df430_6f7f_4642_9452_1fcc5d47d283.slice/crio-4c221b85b1ba5d35c854442905ab5c20c0b52d0d108ca695ec70c55686d28f9a WatchSource:0}: Error finding container 4c221b85b1ba5d35c854442905ab5c20c0b52d0d108ca695ec70c55686d28f9a: Status 404 returned error can't find the container with id 4c221b85b1ba5d35c854442905ab5c20c0b52d0d108ca695ec70c55686d28f9a Dec 02 18:40:27 crc kubenswrapper[4878]: I1202 18:40:27.010722 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 18:40:27 crc kubenswrapper[4878]: I1202 18:40:27.043332 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:40:27 crc kubenswrapper[4878]: I1202 18:40:27.143952 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 02 18:40:27 crc kubenswrapper[4878]: I1202 18:40:27.540122 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"763df430-6f7f-4642-9452-1fcc5d47d283","Type":"ContainerStarted","Data":"4c221b85b1ba5d35c854442905ab5c20c0b52d0d108ca695ec70c55686d28f9a"} Dec 02 18:40:27 crc kubenswrapper[4878]: I1202 18:40:27.551874 4878 generic.go:334] "Generic (PLEG): container finished" 
podID="dda2e4cb-b62a-40c0-a3d8-5a427b609472" containerID="9b5a8a7e2fd581e2c6f7082289e64b5fdc9d48505b6ef1a828d6d42788b56404" exitCode=0 Dec 02 18:40:27 crc kubenswrapper[4878]: I1202 18:40:27.552245 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-w76q8" event={"ID":"dda2e4cb-b62a-40c0-a3d8-5a427b609472","Type":"ContainerDied","Data":"9b5a8a7e2fd581e2c6f7082289e64b5fdc9d48505b6ef1a828d6d42788b56404"} Dec 02 18:40:27 crc kubenswrapper[4878]: I1202 18:40:27.559999 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"1bbb9528-81ba-487f-bf86-44276e8ac969","Type":"ContainerStarted","Data":"41f49389dfa2861a60cb7760ac6b7c667bc8e6c477cdb87f271c73d080d85eed"} Dec 02 18:40:27 crc kubenswrapper[4878]: I1202 18:40:27.561687 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ef40d82-36b6-4b25-879e-93b3fcefe72d","Type":"ContainerStarted","Data":"33572526f3ef15439e62cfe7c335b28c1e4cf0eb4faf0128208ba2f5659a7862"} Dec 02 18:40:27 crc kubenswrapper[4878]: I1202 18:40:27.561717 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ef40d82-36b6-4b25-879e-93b3fcefe72d","Type":"ContainerStarted","Data":"fd3a8c98a60318e1c3c124023b3e53fdd9d109a344a446749d1e928699913730"} Dec 02 18:40:28 crc kubenswrapper[4878]: I1202 18:40:28.582201 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"1bbb9528-81ba-487f-bf86-44276e8ac969","Type":"ContainerStarted","Data":"73b5c822e8877eaf2e59514c04d9ee81857545e9b234b3f19750d68b5cecc6fa"} Dec 02 18:40:28 crc kubenswrapper[4878]: I1202 18:40:28.586472 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ef40d82-36b6-4b25-879e-93b3fcefe72d","Type":"ContainerStarted","Data":"265f25a3abea0aa99e20c89e1f13482b6c630d931bb8cc97492b162dbea2b849"} Dec 02 18:40:28 crc 
kubenswrapper[4878]: I1202 18:40:28.590930 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"763df430-6f7f-4642-9452-1fcc5d47d283","Type":"ContainerStarted","Data":"39bda99dfc5df6651570f5d8bb946e0b680ad375ffd9eb769b252cf5ff69a01d"} Dec 02 18:40:28 crc kubenswrapper[4878]: I1202 18:40:28.591080 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 02 18:40:28 crc kubenswrapper[4878]: I1202 18:40:28.596063 4878 generic.go:334] "Generic (PLEG): container finished" podID="e1569008-ec65-4f71-bde2-2ee2ea8c2e7a" containerID="c71c2f290ea3c39e21bca870922144c336e0f472d32c341c42049e2f5691e7ec" exitCode=0 Dec 02 18:40:28 crc kubenswrapper[4878]: I1202 18:40:28.596160 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rj842" event={"ID":"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a","Type":"ContainerDied","Data":"c71c2f290ea3c39e21bca870922144c336e0f472d32c341c42049e2f5691e7ec"} Dec 02 18:40:28 crc kubenswrapper[4878]: I1202 18:40:28.600496 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19","Type":"ContainerStarted","Data":"8bfc16f4aa576478f10a317116d84307e1967c9da49cc336ebdc615fdf9e7a74"} Dec 02 18:40:28 crc kubenswrapper[4878]: I1202 18:40:28.600693 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" containerName="ceilometer-central-agent" containerID="cri-o://7ece543cc46d2bcf68613e78d4d588c5caf1eeeb94490b86729156f34f24c52f" gracePeriod=30 Dec 02 18:40:28 crc kubenswrapper[4878]: I1202 18:40:28.600991 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 18:40:28 crc kubenswrapper[4878]: I1202 18:40:28.601035 4878 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" containerName="proxy-httpd" containerID="cri-o://8bfc16f4aa576478f10a317116d84307e1967c9da49cc336ebdc615fdf9e7a74" gracePeriod=30 Dec 02 18:40:28 crc kubenswrapper[4878]: I1202 18:40:28.601083 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" containerName="sg-core" containerID="cri-o://85b18c21ba35742b2b11f4209d43400b0de25e8ac1ece7c40ef08faf02430cfe" gracePeriod=30 Dec 02 18:40:28 crc kubenswrapper[4878]: I1202 18:40:28.601124 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" containerName="ceilometer-notification-agent" containerID="cri-o://2a6593d91b422dc79f67891cd29f8ded1580ae110ad3f86841d09b8b168b0fd8" gracePeriod=30 Dec 02 18:40:28 crc kubenswrapper[4878]: I1202 18:40:28.624216 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=3.019579697 podStartE2EDuration="3.624190551s" podCreationTimestamp="2025-12-02 18:40:25 +0000 UTC" firstStartedPulling="2025-12-02 18:40:27.149086724 +0000 UTC m=+1536.838705605" lastFinishedPulling="2025-12-02 18:40:27.753697578 +0000 UTC m=+1537.443316459" observedRunningTime="2025-12-02 18:40:28.606928079 +0000 UTC m=+1538.296546980" watchObservedRunningTime="2025-12-02 18:40:28.624190551 +0000 UTC m=+1538.313809432" Dec 02 18:40:28 crc kubenswrapper[4878]: I1202 18:40:28.669153 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.66910143 podStartE2EDuration="3.66910143s" podCreationTimestamp="2025-12-02 18:40:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:40:28.663307288 +0000 UTC m=+1538.352926179" 
watchObservedRunningTime="2025-12-02 18:40:28.66910143 +0000 UTC m=+1538.358720311" Dec 02 18:40:28 crc kubenswrapper[4878]: I1202 18:40:28.687389 4878 generic.go:334] "Generic (PLEG): container finished" podID="03c74212-8bb7-45e3-8110-cf65a7288caf" containerID="588b83fede1aace8241b9039de9115d1211dd4865a8071f52c33d39b53355a88" exitCode=0 Dec 02 18:40:28 crc kubenswrapper[4878]: I1202 18:40:28.687707 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"03c74212-8bb7-45e3-8110-cf65a7288caf","Type":"ContainerDied","Data":"588b83fede1aace8241b9039de9115d1211dd4865a8071f52c33d39b53355a88"} Dec 02 18:40:28 crc kubenswrapper[4878]: I1202 18:40:28.701971 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.225419713 podStartE2EDuration="3.701944589s" podCreationTimestamp="2025-12-02 18:40:25 +0000 UTC" firstStartedPulling="2025-12-02 18:40:26.988538459 +0000 UTC m=+1536.678157340" lastFinishedPulling="2025-12-02 18:40:27.465063335 +0000 UTC m=+1537.154682216" observedRunningTime="2025-12-02 18:40:28.686782883 +0000 UTC m=+1538.376401764" watchObservedRunningTime="2025-12-02 18:40:28.701944589 +0000 UTC m=+1538.391563470" Dec 02 18:40:28 crc kubenswrapper[4878]: I1202 18:40:28.798305 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.395961908 podStartE2EDuration="8.7982714s" podCreationTimestamp="2025-12-02 18:40:20 +0000 UTC" firstStartedPulling="2025-12-02 18:40:22.01801188 +0000 UTC m=+1531.707630761" lastFinishedPulling="2025-12-02 18:40:27.420321372 +0000 UTC m=+1537.109940253" observedRunningTime="2025-12-02 18:40:28.711404746 +0000 UTC m=+1538.401023647" watchObservedRunningTime="2025-12-02 18:40:28.7982714 +0000 UTC m=+1538.487890281" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.236831 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-w76q8" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.387459 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dda2e4cb-b62a-40c0-a3d8-5a427b609472-config-data\") pod \"dda2e4cb-b62a-40c0-a3d8-5a427b609472\" (UID: \"dda2e4cb-b62a-40c0-a3d8-5a427b609472\") " Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.387898 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dda2e4cb-b62a-40c0-a3d8-5a427b609472-combined-ca-bundle\") pod \"dda2e4cb-b62a-40c0-a3d8-5a427b609472\" (UID: \"dda2e4cb-b62a-40c0-a3d8-5a427b609472\") " Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.388445 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dda2e4cb-b62a-40c0-a3d8-5a427b609472-scripts\") pod \"dda2e4cb-b62a-40c0-a3d8-5a427b609472\" (UID: \"dda2e4cb-b62a-40c0-a3d8-5a427b609472\") " Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.388588 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmrch\" (UniqueName: \"kubernetes.io/projected/dda2e4cb-b62a-40c0-a3d8-5a427b609472-kube-api-access-rmrch\") pod \"dda2e4cb-b62a-40c0-a3d8-5a427b609472\" (UID: \"dda2e4cb-b62a-40c0-a3d8-5a427b609472\") " Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.394159 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda2e4cb-b62a-40c0-a3d8-5a427b609472-scripts" (OuterVolumeSpecName: "scripts") pod "dda2e4cb-b62a-40c0-a3d8-5a427b609472" (UID: "dda2e4cb-b62a-40c0-a3d8-5a427b609472"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.394963 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dda2e4cb-b62a-40c0-a3d8-5a427b609472-kube-api-access-rmrch" (OuterVolumeSpecName: "kube-api-access-rmrch") pod "dda2e4cb-b62a-40c0-a3d8-5a427b609472" (UID: "dda2e4cb-b62a-40c0-a3d8-5a427b609472"). InnerVolumeSpecName "kube-api-access-rmrch". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.427220 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda2e4cb-b62a-40c0-a3d8-5a427b609472-config-data" (OuterVolumeSpecName: "config-data") pod "dda2e4cb-b62a-40c0-a3d8-5a427b609472" (UID: "dda2e4cb-b62a-40c0-a3d8-5a427b609472"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.440948 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda2e4cb-b62a-40c0-a3d8-5a427b609472-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dda2e4cb-b62a-40c0-a3d8-5a427b609472" (UID: "dda2e4cb-b62a-40c0-a3d8-5a427b609472"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.492427 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dda2e4cb-b62a-40c0-a3d8-5a427b609472-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.492471 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dda2e4cb-b62a-40c0-a3d8-5a427b609472-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.492485 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dda2e4cb-b62a-40c0-a3d8-5a427b609472-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.492497 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmrch\" (UniqueName: \"kubernetes.io/projected/dda2e4cb-b62a-40c0-a3d8-5a427b609472-kube-api-access-rmrch\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.652052 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 18:40:29 crc kubenswrapper[4878]: E1202 18:40:29.652672 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda2e4cb-b62a-40c0-a3d8-5a427b609472" containerName="nova-cell1-conductor-db-sync" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.652688 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda2e4cb-b62a-40c0-a3d8-5a427b609472" containerName="nova-cell1-conductor-db-sync" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.652972 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="dda2e4cb-b62a-40c0-a3d8-5a427b609472" containerName="nova-cell1-conductor-db-sync" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.654042 4878 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.668202 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.745115 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-w76q8" event={"ID":"dda2e4cb-b62a-40c0-a3d8-5a427b609472","Type":"ContainerDied","Data":"d89510e39bdaa3cf02e0a4f7487f58ec7e1a7e88d1a36be02f37e76d26ceffcc"} Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.745169 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d89510e39bdaa3cf02e0a4f7487f58ec7e1a7e88d1a36be02f37e76d26ceffcc" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.745274 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-w76q8" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.756149 4878 generic.go:334] "Generic (PLEG): container finished" podID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" containerID="8bfc16f4aa576478f10a317116d84307e1967c9da49cc336ebdc615fdf9e7a74" exitCode=0 Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.756193 4878 generic.go:334] "Generic (PLEG): container finished" podID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" containerID="85b18c21ba35742b2b11f4209d43400b0de25e8ac1ece7c40ef08faf02430cfe" exitCode=2 Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.756203 4878 generic.go:334] "Generic (PLEG): container finished" podID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" containerID="2a6593d91b422dc79f67891cd29f8ded1580ae110ad3f86841d09b8b168b0fd8" exitCode=0 Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.757550 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19","Type":"ContainerDied","Data":"8bfc16f4aa576478f10a317116d84307e1967c9da49cc336ebdc615fdf9e7a74"} Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.757588 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19","Type":"ContainerDied","Data":"85b18c21ba35742b2b11f4209d43400b0de25e8ac1ece7c40ef08faf02430cfe"} Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.757600 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19","Type":"ContainerDied","Data":"2a6593d91b422dc79f67891cd29f8ded1580ae110ad3f86841d09b8b168b0fd8"} Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.812250 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49118006-cedb-4f41-a752-c635108e2bf7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"49118006-cedb-4f41-a752-c635108e2bf7\") " pod="openstack/nova-cell1-conductor-0" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.812643 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49118006-cedb-4f41-a752-c635108e2bf7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"49118006-cedb-4f41-a752-c635108e2bf7\") " pod="openstack/nova-cell1-conductor-0" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.812813 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqpg5\" (UniqueName: \"kubernetes.io/projected/49118006-cedb-4f41-a752-c635108e2bf7-kube-api-access-bqpg5\") pod \"nova-cell1-conductor-0\" (UID: \"49118006-cedb-4f41-a752-c635108e2bf7\") " pod="openstack/nova-cell1-conductor-0" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 
18:40:29.915223 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49118006-cedb-4f41-a752-c635108e2bf7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"49118006-cedb-4f41-a752-c635108e2bf7\") " pod="openstack/nova-cell1-conductor-0" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.915320 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqpg5\" (UniqueName: \"kubernetes.io/projected/49118006-cedb-4f41-a752-c635108e2bf7-kube-api-access-bqpg5\") pod \"nova-cell1-conductor-0\" (UID: \"49118006-cedb-4f41-a752-c635108e2bf7\") " pod="openstack/nova-cell1-conductor-0" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.915500 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49118006-cedb-4f41-a752-c635108e2bf7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"49118006-cedb-4f41-a752-c635108e2bf7\") " pod="openstack/nova-cell1-conductor-0" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.934080 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49118006-cedb-4f41-a752-c635108e2bf7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"49118006-cedb-4f41-a752-c635108e2bf7\") " pod="openstack/nova-cell1-conductor-0" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.938988 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49118006-cedb-4f41-a752-c635108e2bf7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"49118006-cedb-4f41-a752-c635108e2bf7\") " pod="openstack/nova-cell1-conductor-0" Dec 02 18:40:29 crc kubenswrapper[4878]: I1202 18:40:29.943225 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqpg5\" 
(UniqueName: \"kubernetes.io/projected/49118006-cedb-4f41-a752-c635108e2bf7-kube-api-access-bqpg5\") pod \"nova-cell1-conductor-0\" (UID: \"49118006-cedb-4f41-a752-c635108e2bf7\") " pod="openstack/nova-cell1-conductor-0" Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.035632 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.173688 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rj842" Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.325825 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-combined-ca-bundle\") pod \"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a\" (UID: \"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a\") " Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.325894 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrbfj\" (UniqueName: \"kubernetes.io/projected/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-kube-api-access-xrbfj\") pod \"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a\" (UID: \"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a\") " Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.326077 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-config-data\") pod \"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a\" (UID: \"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a\") " Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.326153 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-scripts\") pod \"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a\" (UID: \"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a\") " 
Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.331793 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-scripts" (OuterVolumeSpecName: "scripts") pod "e1569008-ec65-4f71-bde2-2ee2ea8c2e7a" (UID: "e1569008-ec65-4f71-bde2-2ee2ea8c2e7a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.332591 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-kube-api-access-xrbfj" (OuterVolumeSpecName: "kube-api-access-xrbfj") pod "e1569008-ec65-4f71-bde2-2ee2ea8c2e7a" (UID: "e1569008-ec65-4f71-bde2-2ee2ea8c2e7a"). InnerVolumeSpecName "kube-api-access-xrbfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.351648 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.376567 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-config-data" (OuterVolumeSpecName: "config-data") pod "e1569008-ec65-4f71-bde2-2ee2ea8c2e7a" (UID: "e1569008-ec65-4f71-bde2-2ee2ea8c2e7a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.380328 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1569008-ec65-4f71-bde2-2ee2ea8c2e7a" (UID: "e1569008-ec65-4f71-bde2-2ee2ea8c2e7a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.410981 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.431194 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrbfj\" (UniqueName: \"kubernetes.io/projected/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-kube-api-access-xrbfj\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.431248 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.431260 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.431269 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.573270 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.604383 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.604448 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.806713 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"49118006-cedb-4f41-a752-c635108e2bf7","Type":"ContainerStarted","Data":"1b5a7594e720fabcd00338deda85f89bbbfb7aa94a1532f1b1ff3e316b1c3797"} Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.821158 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.821674 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rj842" event={"ID":"e1569008-ec65-4f71-bde2-2ee2ea8c2e7a","Type":"ContainerDied","Data":"60f831f0fefddcec79ab945768e24c075b29fdcfc46955708cb8207d8b8d75ef"} Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.821755 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60f831f0fefddcec79ab945768e24c075b29fdcfc46955708cb8207d8b8d75ef" Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.821858 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rj842" Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.822700 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="369791de-d9df-4032-a66f-859fae1cbb28" containerName="nova-api-log" containerID="cri-o://44ec18be225657ae3d2140416ba9735e96c96130414f8e9b43343a167f351fe1" gracePeriod=30 Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.823073 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="369791de-d9df-4032-a66f-859fae1cbb28" containerName="nova-api-api" containerID="cri-o://4ac5d5a156e7bdc249352dcfd68f7e5e258384342bf737889657029e25fabade" gracePeriod=30 Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.835219 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="369791de-d9df-4032-a66f-859fae1cbb28" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.241:8774/\": EOF" Dec 
02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.850968 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="369791de-d9df-4032-a66f-859fae1cbb28" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.241:8774/\": EOF" Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.889514 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.903021 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.909796 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.910007 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6ef40d82-36b6-4b25-879e-93b3fcefe72d" containerName="nova-metadata-log" containerID="cri-o://33572526f3ef15439e62cfe7c335b28c1e4cf0eb4faf0128208ba2f5659a7862" gracePeriod=30 Dec 02 18:40:30 crc kubenswrapper[4878]: I1202 18:40:30.910112 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6ef40d82-36b6-4b25-879e-93b3fcefe72d" containerName="nova-metadata-metadata" containerID="cri-o://265f25a3abea0aa99e20c89e1f13482b6c630d931bb8cc97492b162dbea2b849" gracePeriod=30 Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.146543 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.146949 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.796285 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.835443 4878 generic.go:334] "Generic (PLEG): container finished" podID="369791de-d9df-4032-a66f-859fae1cbb28" containerID="44ec18be225657ae3d2140416ba9735e96c96130414f8e9b43343a167f351fe1" exitCode=143 Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.835526 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"369791de-d9df-4032-a66f-859fae1cbb28","Type":"ContainerDied","Data":"44ec18be225657ae3d2140416ba9735e96c96130414f8e9b43343a167f351fe1"} Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.838016 4878 generic.go:334] "Generic (PLEG): container finished" podID="6ef40d82-36b6-4b25-879e-93b3fcefe72d" containerID="265f25a3abea0aa99e20c89e1f13482b6c630d931bb8cc97492b162dbea2b849" exitCode=0 Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.838052 4878 generic.go:334] "Generic (PLEG): container finished" podID="6ef40d82-36b6-4b25-879e-93b3fcefe72d" containerID="33572526f3ef15439e62cfe7c335b28c1e4cf0eb4faf0128208ba2f5659a7862" exitCode=143 Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.838111 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ef40d82-36b6-4b25-879e-93b3fcefe72d","Type":"ContainerDied","Data":"265f25a3abea0aa99e20c89e1f13482b6c630d931bb8cc97492b162dbea2b849"} Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.838154 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ef40d82-36b6-4b25-879e-93b3fcefe72d","Type":"ContainerDied","Data":"33572526f3ef15439e62cfe7c335b28c1e4cf0eb4faf0128208ba2f5659a7862"} Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.838170 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"6ef40d82-36b6-4b25-879e-93b3fcefe72d","Type":"ContainerDied","Data":"fd3a8c98a60318e1c3c124023b3e53fdd9d109a344a446749d1e928699913730"} Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.838190 4878 scope.go:117] "RemoveContainer" containerID="265f25a3abea0aa99e20c89e1f13482b6c630d931bb8cc97492b162dbea2b849" Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.838514 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.889958 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="cbbf6880-1398-4476-b0d0-d340f7231645" containerName="nova-scheduler-scheduler" containerID="cri-o://ec4e50f4367ee2a9c5db68738f890879716269b99487ceb25acedd68ca5c4e8a" gracePeriod=30 Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.890756 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"49118006-cedb-4f41-a752-c635108e2bf7","Type":"ContainerStarted","Data":"3b10a40a344255a41dc19d9c8a0034fa24091880e0da13b3f822990a693d6a3e"} Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.891985 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.901598 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef40d82-36b6-4b25-879e-93b3fcefe72d-combined-ca-bundle\") pod \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\" (UID: \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\") " Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.901664 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef40d82-36b6-4b25-879e-93b3fcefe72d-logs\") pod \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\" (UID: 
\"6ef40d82-36b6-4b25-879e-93b3fcefe72d\") " Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.901831 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef40d82-36b6-4b25-879e-93b3fcefe72d-config-data\") pod \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\" (UID: \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\") " Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.901856 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef40d82-36b6-4b25-879e-93b3fcefe72d-nova-metadata-tls-certs\") pod \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\" (UID: \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\") " Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.902015 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzpj5\" (UniqueName: \"kubernetes.io/projected/6ef40d82-36b6-4b25-879e-93b3fcefe72d-kube-api-access-dzpj5\") pod \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\" (UID: \"6ef40d82-36b6-4b25-879e-93b3fcefe72d\") " Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.902711 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ef40d82-36b6-4b25-879e-93b3fcefe72d-logs" (OuterVolumeSpecName: "logs") pod "6ef40d82-36b6-4b25-879e-93b3fcefe72d" (UID: "6ef40d82-36b6-4b25-879e-93b3fcefe72d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.919465 4878 scope.go:117] "RemoveContainer" containerID="33572526f3ef15439e62cfe7c335b28c1e4cf0eb4faf0128208ba2f5659a7862" Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.920703 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef40d82-36b6-4b25-879e-93b3fcefe72d-kube-api-access-dzpj5" (OuterVolumeSpecName: "kube-api-access-dzpj5") pod "6ef40d82-36b6-4b25-879e-93b3fcefe72d" (UID: "6ef40d82-36b6-4b25-879e-93b3fcefe72d"). InnerVolumeSpecName "kube-api-access-dzpj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.956280 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.956250449 podStartE2EDuration="2.956250449s" podCreationTimestamp="2025-12-02 18:40:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:40:31.929640685 +0000 UTC m=+1541.619259566" watchObservedRunningTime="2025-12-02 18:40:31.956250449 +0000 UTC m=+1541.645869330" Dec 02 18:40:31 crc kubenswrapper[4878]: I1202 18:40:31.971351 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef40d82-36b6-4b25-879e-93b3fcefe72d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ef40d82-36b6-4b25-879e-93b3fcefe72d" (UID: "6ef40d82-36b6-4b25-879e-93b3fcefe72d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.041858 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzpj5\" (UniqueName: \"kubernetes.io/projected/6ef40d82-36b6-4b25-879e-93b3fcefe72d-kube-api-access-dzpj5\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.042573 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef40d82-36b6-4b25-879e-93b3fcefe72d-config-data" (OuterVolumeSpecName: "config-data") pod "6ef40d82-36b6-4b25-879e-93b3fcefe72d" (UID: "6ef40d82-36b6-4b25-879e-93b3fcefe72d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.043328 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef40d82-36b6-4b25-879e-93b3fcefe72d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.045366 4878 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef40d82-36b6-4b25-879e-93b3fcefe72d-logs\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.078423 4878 scope.go:117] "RemoveContainer" containerID="265f25a3abea0aa99e20c89e1f13482b6c630d931bb8cc97492b162dbea2b849" Dec 02 18:40:32 crc kubenswrapper[4878]: E1202 18:40:32.088030 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"265f25a3abea0aa99e20c89e1f13482b6c630d931bb8cc97492b162dbea2b849\": container with ID starting with 265f25a3abea0aa99e20c89e1f13482b6c630d931bb8cc97492b162dbea2b849 not found: ID does not exist" containerID="265f25a3abea0aa99e20c89e1f13482b6c630d931bb8cc97492b162dbea2b849" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.088104 4878 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"265f25a3abea0aa99e20c89e1f13482b6c630d931bb8cc97492b162dbea2b849"} err="failed to get container status \"265f25a3abea0aa99e20c89e1f13482b6c630d931bb8cc97492b162dbea2b849\": rpc error: code = NotFound desc = could not find container \"265f25a3abea0aa99e20c89e1f13482b6c630d931bb8cc97492b162dbea2b849\": container with ID starting with 265f25a3abea0aa99e20c89e1f13482b6c630d931bb8cc97492b162dbea2b849 not found: ID does not exist" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.088136 4878 scope.go:117] "RemoveContainer" containerID="33572526f3ef15439e62cfe7c335b28c1e4cf0eb4faf0128208ba2f5659a7862" Dec 02 18:40:32 crc kubenswrapper[4878]: E1202 18:40:32.092675 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33572526f3ef15439e62cfe7c335b28c1e4cf0eb4faf0128208ba2f5659a7862\": container with ID starting with 33572526f3ef15439e62cfe7c335b28c1e4cf0eb4faf0128208ba2f5659a7862 not found: ID does not exist" containerID="33572526f3ef15439e62cfe7c335b28c1e4cf0eb4faf0128208ba2f5659a7862" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.092739 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33572526f3ef15439e62cfe7c335b28c1e4cf0eb4faf0128208ba2f5659a7862"} err="failed to get container status \"33572526f3ef15439e62cfe7c335b28c1e4cf0eb4faf0128208ba2f5659a7862\": rpc error: code = NotFound desc = could not find container \"33572526f3ef15439e62cfe7c335b28c1e4cf0eb4faf0128208ba2f5659a7862\": container with ID starting with 33572526f3ef15439e62cfe7c335b28c1e4cf0eb4faf0128208ba2f5659a7862 not found: ID does not exist" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.092772 4878 scope.go:117] "RemoveContainer" containerID="265f25a3abea0aa99e20c89e1f13482b6c630d931bb8cc97492b162dbea2b849" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 
18:40:32.093400 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"265f25a3abea0aa99e20c89e1f13482b6c630d931bb8cc97492b162dbea2b849"} err="failed to get container status \"265f25a3abea0aa99e20c89e1f13482b6c630d931bb8cc97492b162dbea2b849\": rpc error: code = NotFound desc = could not find container \"265f25a3abea0aa99e20c89e1f13482b6c630d931bb8cc97492b162dbea2b849\": container with ID starting with 265f25a3abea0aa99e20c89e1f13482b6c630d931bb8cc97492b162dbea2b849 not found: ID does not exist" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.093456 4878 scope.go:117] "RemoveContainer" containerID="33572526f3ef15439e62cfe7c335b28c1e4cf0eb4faf0128208ba2f5659a7862" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.094001 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33572526f3ef15439e62cfe7c335b28c1e4cf0eb4faf0128208ba2f5659a7862"} err="failed to get container status \"33572526f3ef15439e62cfe7c335b28c1e4cf0eb4faf0128208ba2f5659a7862\": rpc error: code = NotFound desc = could not find container \"33572526f3ef15439e62cfe7c335b28c1e4cf0eb4faf0128208ba2f5659a7862\": container with ID starting with 33572526f3ef15439e62cfe7c335b28c1e4cf0eb4faf0128208ba2f5659a7862 not found: ID does not exist" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.140822 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef40d82-36b6-4b25-879e-93b3fcefe72d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6ef40d82-36b6-4b25-879e-93b3fcefe72d" (UID: "6ef40d82-36b6-4b25-879e-93b3fcefe72d"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.167869 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef40d82-36b6-4b25-879e-93b3fcefe72d-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.167905 4878 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef40d82-36b6-4b25-879e-93b3fcefe72d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.206594 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.228190 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.247375 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 18:40:32 crc kubenswrapper[4878]: E1202 18:40:32.248103 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1569008-ec65-4f71-bde2-2ee2ea8c2e7a" containerName="nova-manage" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.248120 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1569008-ec65-4f71-bde2-2ee2ea8c2e7a" containerName="nova-manage" Dec 02 18:40:32 crc kubenswrapper[4878]: E1202 18:40:32.248165 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef40d82-36b6-4b25-879e-93b3fcefe72d" containerName="nova-metadata-log" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.248173 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef40d82-36b6-4b25-879e-93b3fcefe72d" containerName="nova-metadata-log" Dec 02 18:40:32 crc kubenswrapper[4878]: E1202 18:40:32.248181 4878 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6ef40d82-36b6-4b25-879e-93b3fcefe72d" containerName="nova-metadata-metadata" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.248191 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef40d82-36b6-4b25-879e-93b3fcefe72d" containerName="nova-metadata-metadata" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.248518 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1569008-ec65-4f71-bde2-2ee2ea8c2e7a" containerName="nova-manage" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.248559 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef40d82-36b6-4b25-879e-93b3fcefe72d" containerName="nova-metadata-log" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.248574 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef40d82-36b6-4b25-879e-93b3fcefe72d" containerName="nova-metadata-metadata" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.251008 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.255783 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.256810 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.273135 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mbh4\" (UniqueName: \"kubernetes.io/projected/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-kube-api-access-5mbh4\") pod \"nova-metadata-0\" (UID: \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\") " pod="openstack/nova-metadata-0" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.273334 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-config-data\") pod \"nova-metadata-0\" (UID: \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\") " pod="openstack/nova-metadata-0" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.273396 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\") " pod="openstack/nova-metadata-0" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.273420 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\") " pod="openstack/nova-metadata-0" Dec 02 18:40:32 crc kubenswrapper[4878]: 
I1202 18:40:32.273513 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-logs\") pod \"nova-metadata-0\" (UID: \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\") " pod="openstack/nova-metadata-0" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.279915 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.376989 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mbh4\" (UniqueName: \"kubernetes.io/projected/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-kube-api-access-5mbh4\") pod \"nova-metadata-0\" (UID: \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\") " pod="openstack/nova-metadata-0" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.377222 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-config-data\") pod \"nova-metadata-0\" (UID: \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\") " pod="openstack/nova-metadata-0" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.377310 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\") " pod="openstack/nova-metadata-0" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.377339 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\") " pod="openstack/nova-metadata-0" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 
18:40:32.377428 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-logs\") pod \"nova-metadata-0\" (UID: \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\") " pod="openstack/nova-metadata-0" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.378249 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-logs\") pod \"nova-metadata-0\" (UID: \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\") " pod="openstack/nova-metadata-0" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.382463 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-config-data\") pod \"nova-metadata-0\" (UID: \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\") " pod="openstack/nova-metadata-0" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.382920 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\") " pod="openstack/nova-metadata-0" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.383961 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\") " pod="openstack/nova-metadata-0" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.395021 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mbh4\" (UniqueName: \"kubernetes.io/projected/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-kube-api-access-5mbh4\") pod \"nova-metadata-0\" (UID: 
\"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\") " pod="openstack/nova-metadata-0" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.600367 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 18:40:32 crc kubenswrapper[4878]: I1202 18:40:32.962545 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ef40d82-36b6-4b25-879e-93b3fcefe72d" path="/var/lib/kubelet/pods/6ef40d82-36b6-4b25-879e-93b3fcefe72d/volumes" Dec 02 18:40:33 crc kubenswrapper[4878]: W1202 18:40:33.175643 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbbde98c_d7a1_405c_ab24_7e2d8fb5effd.slice/crio-100243abe35d0ceef9a949b1c8906f9c814323ad7a1b6679d3f6b9d1717b8d54 WatchSource:0}: Error finding container 100243abe35d0ceef9a949b1c8906f9c814323ad7a1b6679d3f6b9d1717b8d54: Status 404 returned error can't find the container with id 100243abe35d0ceef9a949b1c8906f9c814323ad7a1b6679d3f6b9d1717b8d54 Dec 02 18:40:33 crc kubenswrapper[4878]: I1202 18:40:33.180269 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 18:40:33 crc kubenswrapper[4878]: I1202 18:40:33.922702 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd","Type":"ContainerStarted","Data":"15ec9813594ccb589e47e88428083dd7f55bc204c8b67348edf9736bd1a00151"} Dec 02 18:40:33 crc kubenswrapper[4878]: I1202 18:40:33.923337 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd","Type":"ContainerStarted","Data":"14f7cece6b5697d747ca8cf15ee1cb65a6386b9d78e0beecd6e48a6f7138bb4c"} Dec 02 18:40:33 crc kubenswrapper[4878]: I1202 18:40:33.923348 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd","Type":"ContainerStarted","Data":"100243abe35d0ceef9a949b1c8906f9c814323ad7a1b6679d3f6b9d1717b8d54"} Dec 02 18:40:33 crc kubenswrapper[4878]: I1202 18:40:33.944364 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.944344375 podStartE2EDuration="1.944344375s" podCreationTimestamp="2025-12-02 18:40:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:40:33.942191178 +0000 UTC m=+1543.631810059" watchObservedRunningTime="2025-12-02 18:40:33.944344375 +0000 UTC m=+1543.633963256" Dec 02 18:40:34 crc kubenswrapper[4878]: I1202 18:40:34.942775 4878 generic.go:334] "Generic (PLEG): container finished" podID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" containerID="7ece543cc46d2bcf68613e78d4d588c5caf1eeeb94490b86729156f34f24c52f" exitCode=0 Dec 02 18:40:34 crc kubenswrapper[4878]: I1202 18:40:34.950594 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19","Type":"ContainerDied","Data":"7ece543cc46d2bcf68613e78d4d588c5caf1eeeb94490b86729156f34f24c52f"} Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.077978 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.097791 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-scripts\") pod \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.098334 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-combined-ca-bundle\") pod \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.098382 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-sg-core-conf-yaml\") pod \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.098492 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr2kk\" (UniqueName: \"kubernetes.io/projected/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-kube-api-access-xr2kk\") pod \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.098570 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-log-httpd\") pod \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.098806 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-config-data\") pod \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.098933 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-run-httpd\") pod \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\" (UID: \"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19\") " Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.101491 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" (UID: "84786ceb-a9b8-46c5-9b7a-c7a3abd18b19"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.105216 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-scripts" (OuterVolumeSpecName: "scripts") pod "84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" (UID: "84786ceb-a9b8-46c5-9b7a-c7a3abd18b19"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.104751 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" (UID: "84786ceb-a9b8-46c5-9b7a-c7a3abd18b19"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.106612 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-kube-api-access-xr2kk" (OuterVolumeSpecName: "kube-api-access-xr2kk") pod "84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" (UID: "84786ceb-a9b8-46c5-9b7a-c7a3abd18b19"). InnerVolumeSpecName "kube-api-access-xr2kk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.167527 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" (UID: "84786ceb-a9b8-46c5-9b7a-c7a3abd18b19"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.202652 4878 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.202687 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.202697 4878 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.202708 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr2kk\" (UniqueName: \"kubernetes.io/projected/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-kube-api-access-xr2kk\") on 
node \"crc\" DevicePath \"\"" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.202717 4878 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.236943 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" (UID: "84786ceb-a9b8-46c5-9b7a-c7a3abd18b19"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.255491 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-config-data" (OuterVolumeSpecName: "config-data") pod "84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" (UID: "84786ceb-a9b8-46c5-9b7a-c7a3abd18b19"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.304324 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.304367 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:35 crc kubenswrapper[4878]: E1202 18:40:35.353489 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec4e50f4367ee2a9c5db68738f890879716269b99487ceb25acedd68ca5c4e8a is running failed: container process not found" containerID="ec4e50f4367ee2a9c5db68738f890879716269b99487ceb25acedd68ca5c4e8a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 18:40:35 crc kubenswrapper[4878]: E1202 18:40:35.385625 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec4e50f4367ee2a9c5db68738f890879716269b99487ceb25acedd68ca5c4e8a is running failed: container process not found" containerID="ec4e50f4367ee2a9c5db68738f890879716269b99487ceb25acedd68ca5c4e8a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 18:40:35 crc kubenswrapper[4878]: E1202 18:40:35.386117 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec4e50f4367ee2a9c5db68738f890879716269b99487ceb25acedd68ca5c4e8a is running failed: container process not found" containerID="ec4e50f4367ee2a9c5db68738f890879716269b99487ceb25acedd68ca5c4e8a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 18:40:35 crc 
kubenswrapper[4878]: E1202 18:40:35.386177 4878 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec4e50f4367ee2a9c5db68738f890879716269b99487ceb25acedd68ca5c4e8a is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="cbbf6880-1398-4476-b0d0-d340f7231645" containerName="nova-scheduler-scheduler" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.658808 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.714871 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qgnf\" (UniqueName: \"kubernetes.io/projected/cbbf6880-1398-4476-b0d0-d340f7231645-kube-api-access-2qgnf\") pod \"cbbf6880-1398-4476-b0d0-d340f7231645\" (UID: \"cbbf6880-1398-4476-b0d0-d340f7231645\") " Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.715006 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbbf6880-1398-4476-b0d0-d340f7231645-combined-ca-bundle\") pod \"cbbf6880-1398-4476-b0d0-d340f7231645\" (UID: \"cbbf6880-1398-4476-b0d0-d340f7231645\") " Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.715049 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbbf6880-1398-4476-b0d0-d340f7231645-config-data\") pod \"cbbf6880-1398-4476-b0d0-d340f7231645\" (UID: \"cbbf6880-1398-4476-b0d0-d340f7231645\") " Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.733322 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbbf6880-1398-4476-b0d0-d340f7231645-kube-api-access-2qgnf" (OuterVolumeSpecName: "kube-api-access-2qgnf") pod "cbbf6880-1398-4476-b0d0-d340f7231645" (UID: 
"cbbf6880-1398-4476-b0d0-d340f7231645"). InnerVolumeSpecName "kube-api-access-2qgnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.779643 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbbf6880-1398-4476-b0d0-d340f7231645-config-data" (OuterVolumeSpecName: "config-data") pod "cbbf6880-1398-4476-b0d0-d340f7231645" (UID: "cbbf6880-1398-4476-b0d0-d340f7231645"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.789275 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbbf6880-1398-4476-b0d0-d340f7231645-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbbf6880-1398-4476-b0d0-d340f7231645" (UID: "cbbf6880-1398-4476-b0d0-d340f7231645"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.817855 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbbf6880-1398-4476-b0d0-d340f7231645-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.817904 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbbf6880-1398-4476-b0d0-d340f7231645-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.817915 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qgnf\" (UniqueName: \"kubernetes.io/projected/cbbf6880-1398-4476-b0d0-d340f7231645-kube-api-access-2qgnf\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.956949 4878 generic.go:334] "Generic (PLEG): container finished" podID="cbbf6880-1398-4476-b0d0-d340f7231645" 
containerID="ec4e50f4367ee2a9c5db68738f890879716269b99487ceb25acedd68ca5c4e8a" exitCode=0 Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.957022 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cbbf6880-1398-4476-b0d0-d340f7231645","Type":"ContainerDied","Data":"ec4e50f4367ee2a9c5db68738f890879716269b99487ceb25acedd68ca5c4e8a"} Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.957056 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cbbf6880-1398-4476-b0d0-d340f7231645","Type":"ContainerDied","Data":"ac49eb0a234473802caf6e51b08afe97e4d5ed5661330fb91096dab1e57fddc5"} Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.957074 4878 scope.go:117] "RemoveContainer" containerID="ec4e50f4367ee2a9c5db68738f890879716269b99487ceb25acedd68ca5c4e8a" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.957271 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.964514 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84786ceb-a9b8-46c5-9b7a-c7a3abd18b19","Type":"ContainerDied","Data":"adcb4e3fe946a97c2bf9ae3545126a5377a091c4ac71d379b38518915dd62001"} Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.964621 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.992399 4878 scope.go:117] "RemoveContainer" containerID="ec4e50f4367ee2a9c5db68738f890879716269b99487ceb25acedd68ca5c4e8a" Dec 02 18:40:35 crc kubenswrapper[4878]: E1202 18:40:35.995028 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec4e50f4367ee2a9c5db68738f890879716269b99487ceb25acedd68ca5c4e8a\": container with ID starting with ec4e50f4367ee2a9c5db68738f890879716269b99487ceb25acedd68ca5c4e8a not found: ID does not exist" containerID="ec4e50f4367ee2a9c5db68738f890879716269b99487ceb25acedd68ca5c4e8a" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.995069 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec4e50f4367ee2a9c5db68738f890879716269b99487ceb25acedd68ca5c4e8a"} err="failed to get container status \"ec4e50f4367ee2a9c5db68738f890879716269b99487ceb25acedd68ca5c4e8a\": rpc error: code = NotFound desc = could not find container \"ec4e50f4367ee2a9c5db68738f890879716269b99487ceb25acedd68ca5c4e8a\": container with ID starting with ec4e50f4367ee2a9c5db68738f890879716269b99487ceb25acedd68ca5c4e8a not found: ID does not exist" Dec 02 18:40:35 crc kubenswrapper[4878]: I1202 18:40:35.995097 4878 scope.go:117] "RemoveContainer" containerID="8bfc16f4aa576478f10a317116d84307e1967c9da49cc336ebdc615fdf9e7a74" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.038139 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.069870 4878 scope.go:117] "RemoveContainer" containerID="85b18c21ba35742b2b11f4209d43400b0de25e8ac1ece7c40ef08faf02430cfe" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.070204 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 
18:40:36.099313 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.122395 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.134170 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 18:40:36 crc kubenswrapper[4878]: E1202 18:40:36.135617 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" containerName="sg-core" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.135638 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" containerName="sg-core" Dec 02 18:40:36 crc kubenswrapper[4878]: E1202 18:40:36.135709 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" containerName="ceilometer-notification-agent" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.135719 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" containerName="ceilometer-notification-agent" Dec 02 18:40:36 crc kubenswrapper[4878]: E1202 18:40:36.135792 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" containerName="proxy-httpd" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.135800 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" containerName="proxy-httpd" Dec 02 18:40:36 crc kubenswrapper[4878]: E1202 18:40:36.135812 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbbf6880-1398-4476-b0d0-d340f7231645" containerName="nova-scheduler-scheduler" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.135819 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbbf6880-1398-4476-b0d0-d340f7231645" 
containerName="nova-scheduler-scheduler" Dec 02 18:40:36 crc kubenswrapper[4878]: E1202 18:40:36.135867 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" containerName="ceilometer-central-agent" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.135873 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" containerName="ceilometer-central-agent" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.136262 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbbf6880-1398-4476-b0d0-d340f7231645" containerName="nova-scheduler-scheduler" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.136346 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" containerName="ceilometer-central-agent" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.136420 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" containerName="ceilometer-notification-agent" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.136438 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" containerName="proxy-httpd" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.136497 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" containerName="sg-core" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.138101 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.142143 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872e7309-ca28-4a96-98a8-359de3dfc613-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"872e7309-ca28-4a96-98a8-359de3dfc613\") " pod="openstack/nova-scheduler-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.142800 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g96tp\" (UniqueName: \"kubernetes.io/projected/872e7309-ca28-4a96-98a8-359de3dfc613-kube-api-access-g96tp\") pod \"nova-scheduler-0\" (UID: \"872e7309-ca28-4a96-98a8-359de3dfc613\") " pod="openstack/nova-scheduler-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.143030 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872e7309-ca28-4a96-98a8-359de3dfc613-config-data\") pod \"nova-scheduler-0\" (UID: \"872e7309-ca28-4a96-98a8-359de3dfc613\") " pod="openstack/nova-scheduler-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.145681 4878 scope.go:117] "RemoveContainer" containerID="2a6593d91b422dc79f67891cd29f8ded1580ae110ad3f86841d09b8b168b0fd8" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.160284 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.203371 4878 scope.go:117] "RemoveContainer" containerID="7ece543cc46d2bcf68613e78d4d588c5caf1eeeb94490b86729156f34f24c52f" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.211412 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.227313 4878 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.230918 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.237050 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.237374 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.237875 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.245296 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g96tp\" (UniqueName: \"kubernetes.io/projected/872e7309-ca28-4a96-98a8-359de3dfc613-kube-api-access-g96tp\") pod \"nova-scheduler-0\" (UID: \"872e7309-ca28-4a96-98a8-359de3dfc613\") " pod="openstack/nova-scheduler-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.245341 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b2748089-13af-4695-a743-056baad129c3-run-httpd\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.245373 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncplz\" (UniqueName: \"kubernetes.io/projected/b2748089-13af-4695-a743-056baad129c3-kube-api-access-ncplz\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.245503 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-config-data\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.245529 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.245555 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872e7309-ca28-4a96-98a8-359de3dfc613-config-data\") pod \"nova-scheduler-0\" (UID: \"872e7309-ca28-4a96-98a8-359de3dfc613\") " pod="openstack/nova-scheduler-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.245593 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872e7309-ca28-4a96-98a8-359de3dfc613-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"872e7309-ca28-4a96-98a8-359de3dfc613\") " pod="openstack/nova-scheduler-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.245621 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b2748089-13af-4695-a743-056baad129c3-log-httpd\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.245762 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.245871 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-scripts\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.246452 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.253038 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872e7309-ca28-4a96-98a8-359de3dfc613-config-data\") pod \"nova-scheduler-0\" (UID: \"872e7309-ca28-4a96-98a8-359de3dfc613\") " pod="openstack/nova-scheduler-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.254655 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872e7309-ca28-4a96-98a8-359de3dfc613-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"872e7309-ca28-4a96-98a8-359de3dfc613\") " pod="openstack/nova-scheduler-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.263030 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.265207 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.301892 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-g96tp\" (UniqueName: \"kubernetes.io/projected/872e7309-ca28-4a96-98a8-359de3dfc613-kube-api-access-g96tp\") pod \"nova-scheduler-0\" (UID: \"872e7309-ca28-4a96-98a8-359de3dfc613\") " pod="openstack/nova-scheduler-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.348503 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.348777 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b2748089-13af-4695-a743-056baad129c3-run-httpd\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.348806 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncplz\" (UniqueName: \"kubernetes.io/projected/b2748089-13af-4695-a743-056baad129c3-kube-api-access-ncplz\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.348846 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-config-data\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.348870 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " 
pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.348918 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b2748089-13af-4695-a743-056baad129c3-log-httpd\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.348941 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.348973 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-scripts\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.351266 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b2748089-13af-4695-a743-056baad129c3-log-httpd\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.351904 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b2748089-13af-4695-a743-056baad129c3-run-httpd\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.354453 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-scripts\") pod 
\"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.354638 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.354913 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.355798 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-config-data\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.357434 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.374317 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncplz\" (UniqueName: \"kubernetes.io/projected/b2748089-13af-4695-a743-056baad129c3-kube-api-access-ncplz\") pod \"ceilometer-0\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.485677 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.572005 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.980807 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84786ceb-a9b8-46c5-9b7a-c7a3abd18b19" path="/var/lib/kubelet/pods/84786ceb-a9b8-46c5-9b7a-c7a3abd18b19/volumes" Dec 02 18:40:36 crc kubenswrapper[4878]: I1202 18:40:36.983291 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbbf6880-1398-4476-b0d0-d340f7231645" path="/var/lib/kubelet/pods/cbbf6880-1398-4476-b0d0-d340f7231645/volumes" Dec 02 18:40:37 crc kubenswrapper[4878]: W1202 18:40:37.036534 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod872e7309_ca28_4a96_98a8_359de3dfc613.slice/crio-f87455a9f32a99cbc4920b5e41f7e83b4793e7f8c2bb0763af732f5d46908c78 WatchSource:0}: Error finding container f87455a9f32a99cbc4920b5e41f7e83b4793e7f8c2bb0763af732f5d46908c78: Status 404 returned error can't find the container with id f87455a9f32a99cbc4920b5e41f7e83b4793e7f8c2bb0763af732f5d46908c78 Dec 02 18:40:37 crc kubenswrapper[4878]: I1202 18:40:37.055399 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 18:40:37 crc kubenswrapper[4878]: I1202 18:40:37.169930 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:40:37 crc kubenswrapper[4878]: W1202 18:40:37.175617 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2748089_13af_4695_a743_056baad129c3.slice/crio-bbe59899da51e745502cb0e10f7625d8cc3fc8dab8ea2fe89e925e1631e4af02 WatchSource:0}: Error finding container bbe59899da51e745502cb0e10f7625d8cc3fc8dab8ea2fe89e925e1631e4af02: Status 404 
returned error can't find the container with id bbe59899da51e745502cb0e10f7625d8cc3fc8dab8ea2fe89e925e1631e4af02 Dec 02 18:40:37 crc kubenswrapper[4878]: I1202 18:40:37.601065 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 18:40:37 crc kubenswrapper[4878]: I1202 18:40:37.602390 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 18:40:38 crc kubenswrapper[4878]: I1202 18:40:38.035761 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"872e7309-ca28-4a96-98a8-359de3dfc613","Type":"ContainerStarted","Data":"3840ddfd428531d1f9df23ebf12a1d9f43f60bbd95fd20f514a42bbac1ea200f"} Dec 02 18:40:38 crc kubenswrapper[4878]: I1202 18:40:38.035824 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"872e7309-ca28-4a96-98a8-359de3dfc613","Type":"ContainerStarted","Data":"f87455a9f32a99cbc4920b5e41f7e83b4793e7f8c2bb0763af732f5d46908c78"} Dec 02 18:40:38 crc kubenswrapper[4878]: I1202 18:40:38.038412 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b2748089-13af-4695-a743-056baad129c3","Type":"ContainerStarted","Data":"bbe59899da51e745502cb0e10f7625d8cc3fc8dab8ea2fe89e925e1631e4af02"} Dec 02 18:40:38 crc kubenswrapper[4878]: I1202 18:40:38.060212 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.060165967 podStartE2EDuration="2.060165967s" podCreationTimestamp="2025-12-02 18:40:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:40:38.057665669 +0000 UTC m=+1547.747284580" watchObservedRunningTime="2025-12-02 18:40:38.060165967 +0000 UTC m=+1547.749784858" Dec 02 18:40:38 crc kubenswrapper[4878]: I1202 18:40:38.861105 4878 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 18:40:38 crc kubenswrapper[4878]: I1202 18:40:38.930691 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/369791de-d9df-4032-a66f-859fae1cbb28-config-data\") pod \"369791de-d9df-4032-a66f-859fae1cbb28\" (UID: \"369791de-d9df-4032-a66f-859fae1cbb28\") " Dec 02 18:40:38 crc kubenswrapper[4878]: I1202 18:40:38.931145 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/369791de-d9df-4032-a66f-859fae1cbb28-combined-ca-bundle\") pod \"369791de-d9df-4032-a66f-859fae1cbb28\" (UID: \"369791de-d9df-4032-a66f-859fae1cbb28\") " Dec 02 18:40:38 crc kubenswrapper[4878]: I1202 18:40:38.931368 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/369791de-d9df-4032-a66f-859fae1cbb28-logs\") pod \"369791de-d9df-4032-a66f-859fae1cbb28\" (UID: \"369791de-d9df-4032-a66f-859fae1cbb28\") " Dec 02 18:40:38 crc kubenswrapper[4878]: I1202 18:40:38.931449 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2k7v\" (UniqueName: \"kubernetes.io/projected/369791de-d9df-4032-a66f-859fae1cbb28-kube-api-access-n2k7v\") pod \"369791de-d9df-4032-a66f-859fae1cbb28\" (UID: \"369791de-d9df-4032-a66f-859fae1cbb28\") " Dec 02 18:40:38 crc kubenswrapper[4878]: I1202 18:40:38.933687 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/369791de-d9df-4032-a66f-859fae1cbb28-logs" (OuterVolumeSpecName: "logs") pod "369791de-d9df-4032-a66f-859fae1cbb28" (UID: "369791de-d9df-4032-a66f-859fae1cbb28"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:40:38 crc kubenswrapper[4878]: I1202 18:40:38.944387 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/369791de-d9df-4032-a66f-859fae1cbb28-kube-api-access-n2k7v" (OuterVolumeSpecName: "kube-api-access-n2k7v") pod "369791de-d9df-4032-a66f-859fae1cbb28" (UID: "369791de-d9df-4032-a66f-859fae1cbb28"). InnerVolumeSpecName "kube-api-access-n2k7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:40:38 crc kubenswrapper[4878]: I1202 18:40:38.981130 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/369791de-d9df-4032-a66f-859fae1cbb28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "369791de-d9df-4032-a66f-859fae1cbb28" (UID: "369791de-d9df-4032-a66f-859fae1cbb28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.019440 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/369791de-d9df-4032-a66f-859fae1cbb28-config-data" (OuterVolumeSpecName: "config-data") pod "369791de-d9df-4032-a66f-859fae1cbb28" (UID: "369791de-d9df-4032-a66f-859fae1cbb28"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.035170 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/369791de-d9df-4032-a66f-859fae1cbb28-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.035205 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/369791de-d9df-4032-a66f-859fae1cbb28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.035219 4878 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/369791de-d9df-4032-a66f-859fae1cbb28-logs\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.035228 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2k7v\" (UniqueName: \"kubernetes.io/projected/369791de-d9df-4032-a66f-859fae1cbb28-kube-api-access-n2k7v\") on node \"crc\" DevicePath \"\"" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.057577 4878 generic.go:334] "Generic (PLEG): container finished" podID="369791de-d9df-4032-a66f-859fae1cbb28" containerID="4ac5d5a156e7bdc249352dcfd68f7e5e258384342bf737889657029e25fabade" exitCode=0 Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.058478 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.109947 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"369791de-d9df-4032-a66f-859fae1cbb28","Type":"ContainerDied","Data":"4ac5d5a156e7bdc249352dcfd68f7e5e258384342bf737889657029e25fabade"} Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.110008 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"369791de-d9df-4032-a66f-859fae1cbb28","Type":"ContainerDied","Data":"b657bca485decce35adb659e949a388b4083085ff02b140d573ea570ecc23648"} Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.110028 4878 scope.go:117] "RemoveContainer" containerID="4ac5d5a156e7bdc249352dcfd68f7e5e258384342bf737889657029e25fabade" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.158575 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.168496 4878 scope.go:117] "RemoveContainer" containerID="44ec18be225657ae3d2140416ba9735e96c96130414f8e9b43343a167f351fe1" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.187668 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.203453 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 18:40:39 crc kubenswrapper[4878]: E1202 18:40:39.204171 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="369791de-d9df-4032-a66f-859fae1cbb28" containerName="nova-api-api" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.204184 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="369791de-d9df-4032-a66f-859fae1cbb28" containerName="nova-api-api" Dec 02 18:40:39 crc kubenswrapper[4878]: E1202 18:40:39.204212 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="369791de-d9df-4032-a66f-859fae1cbb28" 
containerName="nova-api-log" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.204218 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="369791de-d9df-4032-a66f-859fae1cbb28" containerName="nova-api-log" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.204481 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="369791de-d9df-4032-a66f-859fae1cbb28" containerName="nova-api-log" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.204502 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="369791de-d9df-4032-a66f-859fae1cbb28" containerName="nova-api-api" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.205921 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.214128 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.241176 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.241734 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0dc4ac14-dee9-4363-9540-ba11ad022b19-logs\") pod \"nova-api-0\" (UID: \"0dc4ac14-dee9-4363-9540-ba11ad022b19\") " pod="openstack/nova-api-0" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.241797 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2tzj\" (UniqueName: \"kubernetes.io/projected/0dc4ac14-dee9-4363-9540-ba11ad022b19-kube-api-access-l2tzj\") pod \"nova-api-0\" (UID: \"0dc4ac14-dee9-4363-9540-ba11ad022b19\") " pod="openstack/nova-api-0" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.241827 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/0dc4ac14-dee9-4363-9540-ba11ad022b19-config-data\") pod \"nova-api-0\" (UID: \"0dc4ac14-dee9-4363-9540-ba11ad022b19\") " pod="openstack/nova-api-0" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.241904 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dc4ac14-dee9-4363-9540-ba11ad022b19-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0dc4ac14-dee9-4363-9540-ba11ad022b19\") " pod="openstack/nova-api-0" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.315474 4878 scope.go:117] "RemoveContainer" containerID="4ac5d5a156e7bdc249352dcfd68f7e5e258384342bf737889657029e25fabade" Dec 02 18:40:39 crc kubenswrapper[4878]: E1202 18:40:39.315998 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ac5d5a156e7bdc249352dcfd68f7e5e258384342bf737889657029e25fabade\": container with ID starting with 4ac5d5a156e7bdc249352dcfd68f7e5e258384342bf737889657029e25fabade not found: ID does not exist" containerID="4ac5d5a156e7bdc249352dcfd68f7e5e258384342bf737889657029e25fabade" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.316052 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ac5d5a156e7bdc249352dcfd68f7e5e258384342bf737889657029e25fabade"} err="failed to get container status \"4ac5d5a156e7bdc249352dcfd68f7e5e258384342bf737889657029e25fabade\": rpc error: code = NotFound desc = could not find container \"4ac5d5a156e7bdc249352dcfd68f7e5e258384342bf737889657029e25fabade\": container with ID starting with 4ac5d5a156e7bdc249352dcfd68f7e5e258384342bf737889657029e25fabade not found: ID does not exist" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.316095 4878 scope.go:117] "RemoveContainer" containerID="44ec18be225657ae3d2140416ba9735e96c96130414f8e9b43343a167f351fe1" Dec 02 
18:40:39 crc kubenswrapper[4878]: E1202 18:40:39.316826 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44ec18be225657ae3d2140416ba9735e96c96130414f8e9b43343a167f351fe1\": container with ID starting with 44ec18be225657ae3d2140416ba9735e96c96130414f8e9b43343a167f351fe1 not found: ID does not exist" containerID="44ec18be225657ae3d2140416ba9735e96c96130414f8e9b43343a167f351fe1" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.316862 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44ec18be225657ae3d2140416ba9735e96c96130414f8e9b43343a167f351fe1"} err="failed to get container status \"44ec18be225657ae3d2140416ba9735e96c96130414f8e9b43343a167f351fe1\": rpc error: code = NotFound desc = could not find container \"44ec18be225657ae3d2140416ba9735e96c96130414f8e9b43343a167f351fe1\": container with ID starting with 44ec18be225657ae3d2140416ba9735e96c96130414f8e9b43343a167f351fe1 not found: ID does not exist" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.345362 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0dc4ac14-dee9-4363-9540-ba11ad022b19-logs\") pod \"nova-api-0\" (UID: \"0dc4ac14-dee9-4363-9540-ba11ad022b19\") " pod="openstack/nova-api-0" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.345671 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2tzj\" (UniqueName: \"kubernetes.io/projected/0dc4ac14-dee9-4363-9540-ba11ad022b19-kube-api-access-l2tzj\") pod \"nova-api-0\" (UID: \"0dc4ac14-dee9-4363-9540-ba11ad022b19\") " pod="openstack/nova-api-0" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.345767 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dc4ac14-dee9-4363-9540-ba11ad022b19-config-data\") 
pod \"nova-api-0\" (UID: \"0dc4ac14-dee9-4363-9540-ba11ad022b19\") " pod="openstack/nova-api-0" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.345945 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dc4ac14-dee9-4363-9540-ba11ad022b19-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0dc4ac14-dee9-4363-9540-ba11ad022b19\") " pod="openstack/nova-api-0" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.347956 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0dc4ac14-dee9-4363-9540-ba11ad022b19-logs\") pod \"nova-api-0\" (UID: \"0dc4ac14-dee9-4363-9540-ba11ad022b19\") " pod="openstack/nova-api-0" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.353456 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dc4ac14-dee9-4363-9540-ba11ad022b19-config-data\") pod \"nova-api-0\" (UID: \"0dc4ac14-dee9-4363-9540-ba11ad022b19\") " pod="openstack/nova-api-0" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.354214 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dc4ac14-dee9-4363-9540-ba11ad022b19-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0dc4ac14-dee9-4363-9540-ba11ad022b19\") " pod="openstack/nova-api-0" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.370182 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2tzj\" (UniqueName: \"kubernetes.io/projected/0dc4ac14-dee9-4363-9540-ba11ad022b19-kube-api-access-l2tzj\") pod \"nova-api-0\" (UID: \"0dc4ac14-dee9-4363-9540-ba11ad022b19\") " pod="openstack/nova-api-0" Dec 02 18:40:39 crc kubenswrapper[4878]: I1202 18:40:39.577476 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 18:40:40 crc kubenswrapper[4878]: I1202 18:40:40.088948 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 02 18:40:40 crc kubenswrapper[4878]: I1202 18:40:40.255691 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 18:40:40 crc kubenswrapper[4878]: I1202 18:40:40.956905 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="369791de-d9df-4032-a66f-859fae1cbb28" path="/var/lib/kubelet/pods/369791de-d9df-4032-a66f-859fae1cbb28/volumes" Dec 02 18:40:41 crc kubenswrapper[4878]: I1202 18:40:41.103175 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b2748089-13af-4695-a743-056baad129c3","Type":"ContainerStarted","Data":"22f7d295543d6e0535c2f2063b08f0ac6ed501fcafa2307ce39899f08633528f"} Dec 02 18:40:41 crc kubenswrapper[4878]: I1202 18:40:41.103251 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b2748089-13af-4695-a743-056baad129c3","Type":"ContainerStarted","Data":"549a0e87697b7c493198e8b8157e2f4096500e84c1ab424c8670767811a74329"} Dec 02 18:40:41 crc kubenswrapper[4878]: I1202 18:40:41.106122 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0dc4ac14-dee9-4363-9540-ba11ad022b19","Type":"ContainerStarted","Data":"e6c0467562abc55de7c1d96ae4d04d1a6137361d8328cbe53c9478a124162c9f"} Dec 02 18:40:41 crc kubenswrapper[4878]: I1202 18:40:41.106200 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0dc4ac14-dee9-4363-9540-ba11ad022b19","Type":"ContainerStarted","Data":"ce706fa53febb974dc0c654c2da2ce00b1d5631385412ab80e91bdec050b60e1"} Dec 02 18:40:41 crc kubenswrapper[4878]: I1202 18:40:41.106210 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"0dc4ac14-dee9-4363-9540-ba11ad022b19","Type":"ContainerStarted","Data":"74e7a7ad1cff913aef39331f7d6588254f39f71a75e58566e27d8a9095215bff"}
Dec 02 18:40:41 crc kubenswrapper[4878]: I1202 18:40:41.131460 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.131429806 podStartE2EDuration="2.131429806s" podCreationTimestamp="2025-12-02 18:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:40:41.124610502 +0000 UTC m=+1550.814229383" watchObservedRunningTime="2025-12-02 18:40:41.131429806 +0000 UTC m=+1550.821048687"
Dec 02 18:40:41 crc kubenswrapper[4878]: I1202 18:40:41.486620 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Dec 02 18:40:42 crc kubenswrapper[4878]: I1202 18:40:42.123189 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b2748089-13af-4695-a743-056baad129c3","Type":"ContainerStarted","Data":"e51237e8f20d424581a377c02e0ec2ee702c5b0d08c6eed4bbd947de9241c9c2"}
Dec 02 18:40:42 crc kubenswrapper[4878]: I1202 18:40:42.601422 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 02 18:40:42 crc kubenswrapper[4878]: I1202 18:40:42.601686 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 02 18:40:43 crc kubenswrapper[4878]: I1202 18:40:43.619454 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fbbde98c-d7a1-405c-ab24-7e2d8fb5effd" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.249:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 02 18:40:43 crc kubenswrapper[4878]: I1202 18:40:43.619530 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fbbde98c-d7a1-405c-ab24-7e2d8fb5effd" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.249:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 02 18:40:44 crc kubenswrapper[4878]: I1202 18:40:44.152572 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b2748089-13af-4695-a743-056baad129c3","Type":"ContainerStarted","Data":"a388a568068aeacba95616e5d55e202c6f5b48e4357b17b7ca539079cd75b10c"}
Dec 02 18:40:44 crc kubenswrapper[4878]: I1202 18:40:44.153003 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 02 18:40:44 crc kubenswrapper[4878]: I1202 18:40:44.195495 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.3744330639999998 podStartE2EDuration="8.195468879s" podCreationTimestamp="2025-12-02 18:40:36 +0000 UTC" firstStartedPulling="2025-12-02 18:40:37.178820934 +0000 UTC m=+1546.868439815" lastFinishedPulling="2025-12-02 18:40:42.999856749 +0000 UTC m=+1552.689475630" observedRunningTime="2025-12-02 18:40:44.178482027 +0000 UTC m=+1553.868100908" watchObservedRunningTime="2025-12-02 18:40:44.195468879 +0000 UTC m=+1553.885087770"
Dec 02 18:40:45 crc kubenswrapper[4878]: I1202 18:40:45.971992 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bcvm5"]
Dec 02 18:40:45 crc kubenswrapper[4878]: I1202 18:40:45.978626 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bcvm5"
Dec 02 18:40:46 crc kubenswrapper[4878]: I1202 18:40:45.998066 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bcvm5"]
Dec 02 18:40:46 crc kubenswrapper[4878]: I1202 18:40:46.073688 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2h7f\" (UniqueName: \"kubernetes.io/projected/098668bc-e972-4c13-acfd-adfbed1f28c6-kube-api-access-q2h7f\") pod \"community-operators-bcvm5\" (UID: \"098668bc-e972-4c13-acfd-adfbed1f28c6\") " pod="openshift-marketplace/community-operators-bcvm5"
Dec 02 18:40:46 crc kubenswrapper[4878]: I1202 18:40:46.074135 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/098668bc-e972-4c13-acfd-adfbed1f28c6-utilities\") pod \"community-operators-bcvm5\" (UID: \"098668bc-e972-4c13-acfd-adfbed1f28c6\") " pod="openshift-marketplace/community-operators-bcvm5"
Dec 02 18:40:46 crc kubenswrapper[4878]: I1202 18:40:46.074854 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/098668bc-e972-4c13-acfd-adfbed1f28c6-catalog-content\") pod \"community-operators-bcvm5\" (UID: \"098668bc-e972-4c13-acfd-adfbed1f28c6\") " pod="openshift-marketplace/community-operators-bcvm5"
Dec 02 18:40:46 crc kubenswrapper[4878]: I1202 18:40:46.180637 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/098668bc-e972-4c13-acfd-adfbed1f28c6-utilities\") pod \"community-operators-bcvm5\" (UID: \"098668bc-e972-4c13-acfd-adfbed1f28c6\") " pod="openshift-marketplace/community-operators-bcvm5"
Dec 02 18:40:46 crc kubenswrapper[4878]: I1202 18:40:46.180928 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/098668bc-e972-4c13-acfd-adfbed1f28c6-catalog-content\") pod \"community-operators-bcvm5\" (UID: \"098668bc-e972-4c13-acfd-adfbed1f28c6\") " pod="openshift-marketplace/community-operators-bcvm5"
Dec 02 18:40:46 crc kubenswrapper[4878]: I1202 18:40:46.181030 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2h7f\" (UniqueName: \"kubernetes.io/projected/098668bc-e972-4c13-acfd-adfbed1f28c6-kube-api-access-q2h7f\") pod \"community-operators-bcvm5\" (UID: \"098668bc-e972-4c13-acfd-adfbed1f28c6\") " pod="openshift-marketplace/community-operators-bcvm5"
Dec 02 18:40:46 crc kubenswrapper[4878]: I1202 18:40:46.182160 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/098668bc-e972-4c13-acfd-adfbed1f28c6-utilities\") pod \"community-operators-bcvm5\" (UID: \"098668bc-e972-4c13-acfd-adfbed1f28c6\") " pod="openshift-marketplace/community-operators-bcvm5"
Dec 02 18:40:46 crc kubenswrapper[4878]: I1202 18:40:46.182744 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/098668bc-e972-4c13-acfd-adfbed1f28c6-catalog-content\") pod \"community-operators-bcvm5\" (UID: \"098668bc-e972-4c13-acfd-adfbed1f28c6\") " pod="openshift-marketplace/community-operators-bcvm5"
Dec 02 18:40:46 crc kubenswrapper[4878]: I1202 18:40:46.215990 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2h7f\" (UniqueName: \"kubernetes.io/projected/098668bc-e972-4c13-acfd-adfbed1f28c6-kube-api-access-q2h7f\") pod \"community-operators-bcvm5\" (UID: \"098668bc-e972-4c13-acfd-adfbed1f28c6\") " pod="openshift-marketplace/community-operators-bcvm5"
Dec 02 18:40:46 crc kubenswrapper[4878]: I1202 18:40:46.315331 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bcvm5"
Dec 02 18:40:46 crc kubenswrapper[4878]: I1202 18:40:46.487046 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Dec 02 18:40:46 crc kubenswrapper[4878]: I1202 18:40:46.560744 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Dec 02 18:40:47 crc kubenswrapper[4878]: I1202 18:40:47.045410 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bcvm5"]
Dec 02 18:40:47 crc kubenswrapper[4878]: W1202 18:40:47.057512 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod098668bc_e972_4c13_acfd_adfbed1f28c6.slice/crio-676398bbb11bd300b370823cc56e3459f964bbb958baab2745a3fc811a38cc74 WatchSource:0}: Error finding container 676398bbb11bd300b370823cc56e3459f964bbb958baab2745a3fc811a38cc74: Status 404 returned error can't find the container with id 676398bbb11bd300b370823cc56e3459f964bbb958baab2745a3fc811a38cc74
Dec 02 18:40:47 crc kubenswrapper[4878]: I1202 18:40:47.209734 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcvm5" event={"ID":"098668bc-e972-4c13-acfd-adfbed1f28c6","Type":"ContainerStarted","Data":"676398bbb11bd300b370823cc56e3459f964bbb958baab2745a3fc811a38cc74"}
Dec 02 18:40:47 crc kubenswrapper[4878]: I1202 18:40:47.254874 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Dec 02 18:40:48 crc kubenswrapper[4878]: I1202 18:40:48.222755 4878 generic.go:334] "Generic (PLEG): container finished" podID="098668bc-e972-4c13-acfd-adfbed1f28c6" containerID="1f96122b407ddb8a960005ba520f13f5af29709105bd1cb5e1b1f34d0a88f6d2" exitCode=0
Dec 02 18:40:48 crc kubenswrapper[4878]: I1202 18:40:48.222853 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcvm5" event={"ID":"098668bc-e972-4c13-acfd-adfbed1f28c6","Type":"ContainerDied","Data":"1f96122b407ddb8a960005ba520f13f5af29709105bd1cb5e1b1f34d0a88f6d2"}
Dec 02 18:40:49 crc kubenswrapper[4878]: I1202 18:40:49.578533 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 02 18:40:49 crc kubenswrapper[4878]: I1202 18:40:49.578888 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 02 18:40:50 crc kubenswrapper[4878]: I1202 18:40:50.277179 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcvm5" event={"ID":"098668bc-e972-4c13-acfd-adfbed1f28c6","Type":"ContainerStarted","Data":"b7dc6cf97c62c0c4d6d9e81229b40bf476843e5f1830d9e4428ddf784f162ab6"}
Dec 02 18:40:50 crc kubenswrapper[4878]: I1202 18:40:50.663157 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0dc4ac14-dee9-4363-9540-ba11ad022b19" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.252:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 02 18:40:50 crc kubenswrapper[4878]: I1202 18:40:50.663284 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0dc4ac14-dee9-4363-9540-ba11ad022b19" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.252:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 02 18:40:51 crc kubenswrapper[4878]: I1202 18:40:51.305985 4878 generic.go:334] "Generic (PLEG): container finished" podID="da60f101-6061-4a42-9333-0c06d5f0e9b1" containerID="3ba806db5a3e223d7cede3e77dd56a6bcc72d2c6a1ddec1bf3736efb50760a00" exitCode=137
Dec 02 18:40:51 crc kubenswrapper[4878]: I1202 18:40:51.306078 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"da60f101-6061-4a42-9333-0c06d5f0e9b1","Type":"ContainerDied","Data":"3ba806db5a3e223d7cede3e77dd56a6bcc72d2c6a1ddec1bf3736efb50760a00"}
Dec 02 18:40:51 crc kubenswrapper[4878]: I1202 18:40:51.310219 4878 generic.go:334] "Generic (PLEG): container finished" podID="098668bc-e972-4c13-acfd-adfbed1f28c6" containerID="b7dc6cf97c62c0c4d6d9e81229b40bf476843e5f1830d9e4428ddf784f162ab6" exitCode=0
Dec 02 18:40:51 crc kubenswrapper[4878]: I1202 18:40:51.310304 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcvm5" event={"ID":"098668bc-e972-4c13-acfd-adfbed1f28c6","Type":"ContainerDied","Data":"b7dc6cf97c62c0c4d6d9e81229b40bf476843e5f1830d9e4428ddf784f162ab6"}
Dec 02 18:40:51 crc kubenswrapper[4878]: I1202 18:40:51.769538 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 02 18:40:51 crc kubenswrapper[4878]: I1202 18:40:51.823741 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da60f101-6061-4a42-9333-0c06d5f0e9b1-config-data\") pod \"da60f101-6061-4a42-9333-0c06d5f0e9b1\" (UID: \"da60f101-6061-4a42-9333-0c06d5f0e9b1\") "
Dec 02 18:40:51 crc kubenswrapper[4878]: I1202 18:40:51.824378 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da60f101-6061-4a42-9333-0c06d5f0e9b1-combined-ca-bundle\") pod \"da60f101-6061-4a42-9333-0c06d5f0e9b1\" (UID: \"da60f101-6061-4a42-9333-0c06d5f0e9b1\") "
Dec 02 18:40:51 crc kubenswrapper[4878]: I1202 18:40:51.825210 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfzf7\" (UniqueName: \"kubernetes.io/projected/da60f101-6061-4a42-9333-0c06d5f0e9b1-kube-api-access-nfzf7\") pod \"da60f101-6061-4a42-9333-0c06d5f0e9b1\" (UID: \"da60f101-6061-4a42-9333-0c06d5f0e9b1\") "
Dec 02 18:40:51 crc kubenswrapper[4878]: I1202 18:40:51.835518 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da60f101-6061-4a42-9333-0c06d5f0e9b1-kube-api-access-nfzf7" (OuterVolumeSpecName: "kube-api-access-nfzf7") pod "da60f101-6061-4a42-9333-0c06d5f0e9b1" (UID: "da60f101-6061-4a42-9333-0c06d5f0e9b1"). InnerVolumeSpecName "kube-api-access-nfzf7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:40:51 crc kubenswrapper[4878]: I1202 18:40:51.863686 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da60f101-6061-4a42-9333-0c06d5f0e9b1-config-data" (OuterVolumeSpecName: "config-data") pod "da60f101-6061-4a42-9333-0c06d5f0e9b1" (UID: "da60f101-6061-4a42-9333-0c06d5f0e9b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:40:51 crc kubenswrapper[4878]: I1202 18:40:51.864342 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da60f101-6061-4a42-9333-0c06d5f0e9b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da60f101-6061-4a42-9333-0c06d5f0e9b1" (UID: "da60f101-6061-4a42-9333-0c06d5f0e9b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:40:51 crc kubenswrapper[4878]: I1202 18:40:51.929337 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da60f101-6061-4a42-9333-0c06d5f0e9b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 18:40:51 crc kubenswrapper[4878]: I1202 18:40:51.929379 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfzf7\" (UniqueName: \"kubernetes.io/projected/da60f101-6061-4a42-9333-0c06d5f0e9b1-kube-api-access-nfzf7\") on node \"crc\" DevicePath \"\""
Dec 02 18:40:51 crc kubenswrapper[4878]: I1202 18:40:51.929394 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da60f101-6061-4a42-9333-0c06d5f0e9b1-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.325994 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcvm5" event={"ID":"098668bc-e972-4c13-acfd-adfbed1f28c6","Type":"ContainerStarted","Data":"ff240b95abc00477dbe4b9d567227848be2ad1986ebab3f26a021e842fa10aed"}
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.328983 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"da60f101-6061-4a42-9333-0c06d5f0e9b1","Type":"ContainerDied","Data":"7f5c90cdc211acae02247089c4afd439a65f3b37ea0d00d9ca5a1e7df4971dca"}
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.329039 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.329041 4878 scope.go:117] "RemoveContainer" containerID="3ba806db5a3e223d7cede3e77dd56a6bcc72d2c6a1ddec1bf3736efb50760a00"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.380754 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bcvm5" podStartSLOduration=3.913323513 podStartE2EDuration="7.380729458s" podCreationTimestamp="2025-12-02 18:40:45 +0000 UTC" firstStartedPulling="2025-12-02 18:40:48.248618631 +0000 UTC m=+1557.938237512" lastFinishedPulling="2025-12-02 18:40:51.716024576 +0000 UTC m=+1561.405643457" observedRunningTime="2025-12-02 18:40:52.350846467 +0000 UTC m=+1562.040465348" watchObservedRunningTime="2025-12-02 18:40:52.380729458 +0000 UTC m=+1562.070348339"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.390997 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.412323 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.440648 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 02 18:40:52 crc kubenswrapper[4878]: E1202 18:40:52.441220 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da60f101-6061-4a42-9333-0c06d5f0e9b1" containerName="nova-cell1-novncproxy-novncproxy"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.441432 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="da60f101-6061-4a42-9333-0c06d5f0e9b1" containerName="nova-cell1-novncproxy-novncproxy"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.441726 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="da60f101-6061-4a42-9333-0c06d5f0e9b1" containerName="nova-cell1-novncproxy-novncproxy"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.442646 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.447665 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.447690 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.451071 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.459454 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.546282 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f3d94-a64e-4661-8a2d-30b33a682633-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"855f3d94-a64e-4661-8a2d-30b33a682633\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.546379 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855f3d94-a64e-4661-8a2d-30b33a682633-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"855f3d94-a64e-4661-8a2d-30b33a682633\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.546422 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74dkb\" (UniqueName: \"kubernetes.io/projected/855f3d94-a64e-4661-8a2d-30b33a682633-kube-api-access-74dkb\") pod \"nova-cell1-novncproxy-0\" (UID: \"855f3d94-a64e-4661-8a2d-30b33a682633\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.546453 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f3d94-a64e-4661-8a2d-30b33a682633-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"855f3d94-a64e-4661-8a2d-30b33a682633\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.546717 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855f3d94-a64e-4661-8a2d-30b33a682633-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"855f3d94-a64e-4661-8a2d-30b33a682633\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.611412 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.611635 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.618180 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.620600 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.650947 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f3d94-a64e-4661-8a2d-30b33a682633-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"855f3d94-a64e-4661-8a2d-30b33a682633\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.651091 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855f3d94-a64e-4661-8a2d-30b33a682633-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"855f3d94-a64e-4661-8a2d-30b33a682633\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.652158 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74dkb\" (UniqueName: \"kubernetes.io/projected/855f3d94-a64e-4661-8a2d-30b33a682633-kube-api-access-74dkb\") pod \"nova-cell1-novncproxy-0\" (UID: \"855f3d94-a64e-4661-8a2d-30b33a682633\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.652255 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f3d94-a64e-4661-8a2d-30b33a682633-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"855f3d94-a64e-4661-8a2d-30b33a682633\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.652440 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855f3d94-a64e-4661-8a2d-30b33a682633-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"855f3d94-a64e-4661-8a2d-30b33a682633\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.656475 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f3d94-a64e-4661-8a2d-30b33a682633-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"855f3d94-a64e-4661-8a2d-30b33a682633\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.656997 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855f3d94-a64e-4661-8a2d-30b33a682633-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"855f3d94-a64e-4661-8a2d-30b33a682633\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.657781 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855f3d94-a64e-4661-8a2d-30b33a682633-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"855f3d94-a64e-4661-8a2d-30b33a682633\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.673373 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f3d94-a64e-4661-8a2d-30b33a682633-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"855f3d94-a64e-4661-8a2d-30b33a682633\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.691939 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74dkb\" (UniqueName: \"kubernetes.io/projected/855f3d94-a64e-4661-8a2d-30b33a682633-kube-api-access-74dkb\") pod \"nova-cell1-novncproxy-0\" (UID: \"855f3d94-a64e-4661-8a2d-30b33a682633\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.775017 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 02 18:40:52 crc kubenswrapper[4878]: I1202 18:40:52.961014 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da60f101-6061-4a42-9333-0c06d5f0e9b1" path="/var/lib/kubelet/pods/da60f101-6061-4a42-9333-0c06d5f0e9b1/volumes"
Dec 02 18:40:53 crc kubenswrapper[4878]: I1202 18:40:53.349383 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 02 18:40:54 crc kubenswrapper[4878]: I1202 18:40:54.368345 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"855f3d94-a64e-4661-8a2d-30b33a682633","Type":"ContainerStarted","Data":"c74a67290d81a1d3aa03154cf1be8353f6454cb03809ba1236da4f4976f4e4f4"}
Dec 02 18:40:54 crc kubenswrapper[4878]: I1202 18:40:54.369005 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"855f3d94-a64e-4661-8a2d-30b33a682633","Type":"ContainerStarted","Data":"0c735e1450efda6f916bbaacece65fdcd97901ff04d505f16f855fe3232376d8"}
Dec 02 18:40:54 crc kubenswrapper[4878]: I1202 18:40:54.401657 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.401637179 podStartE2EDuration="2.401637179s" podCreationTimestamp="2025-12-02 18:40:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:40:54.396743558 +0000 UTC m=+1564.086362459" watchObservedRunningTime="2025-12-02 18:40:54.401637179 +0000 UTC m=+1564.091256060"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.014800 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.142980 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03c74212-8bb7-45e3-8110-cf65a7288caf-config-data\") pod \"03c74212-8bb7-45e3-8110-cf65a7288caf\" (UID: \"03c74212-8bb7-45e3-8110-cf65a7288caf\") "
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.143058 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpfgb\" (UniqueName: \"kubernetes.io/projected/03c74212-8bb7-45e3-8110-cf65a7288caf-kube-api-access-kpfgb\") pod \"03c74212-8bb7-45e3-8110-cf65a7288caf\" (UID: \"03c74212-8bb7-45e3-8110-cf65a7288caf\") "
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.143390 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03c74212-8bb7-45e3-8110-cf65a7288caf-scripts\") pod \"03c74212-8bb7-45e3-8110-cf65a7288caf\" (UID: \"03c74212-8bb7-45e3-8110-cf65a7288caf\") "
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.143451 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c74212-8bb7-45e3-8110-cf65a7288caf-combined-ca-bundle\") pod \"03c74212-8bb7-45e3-8110-cf65a7288caf\" (UID: \"03c74212-8bb7-45e3-8110-cf65a7288caf\") "
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.155685 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03c74212-8bb7-45e3-8110-cf65a7288caf-kube-api-access-kpfgb" (OuterVolumeSpecName: "kube-api-access-kpfgb") pod "03c74212-8bb7-45e3-8110-cf65a7288caf" (UID: "03c74212-8bb7-45e3-8110-cf65a7288caf"). InnerVolumeSpecName "kube-api-access-kpfgb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.156447 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03c74212-8bb7-45e3-8110-cf65a7288caf-scripts" (OuterVolumeSpecName: "scripts") pod "03c74212-8bb7-45e3-8110-cf65a7288caf" (UID: "03c74212-8bb7-45e3-8110-cf65a7288caf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.248680 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03c74212-8bb7-45e3-8110-cf65a7288caf-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.248710 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpfgb\" (UniqueName: \"kubernetes.io/projected/03c74212-8bb7-45e3-8110-cf65a7288caf-kube-api-access-kpfgb\") on node \"crc\" DevicePath \"\""
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.313678 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03c74212-8bb7-45e3-8110-cf65a7288caf-config-data" (OuterVolumeSpecName: "config-data") pod "03c74212-8bb7-45e3-8110-cf65a7288caf" (UID: "03c74212-8bb7-45e3-8110-cf65a7288caf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.315928 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bcvm5"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.315985 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bcvm5"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.322472 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03c74212-8bb7-45e3-8110-cf65a7288caf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03c74212-8bb7-45e3-8110-cf65a7288caf" (UID: "03c74212-8bb7-45e3-8110-cf65a7288caf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.351953 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c74212-8bb7-45e3-8110-cf65a7288caf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.351990 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03c74212-8bb7-45e3-8110-cf65a7288caf-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.379079 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bcvm5"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.403903 4878 generic.go:334] "Generic (PLEG): container finished" podID="03c74212-8bb7-45e3-8110-cf65a7288caf" containerID="615060e1af2dfe7fbd02b0d8cce6b59ed021604872494c07d588aefe025b7b19" exitCode=137
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.404857 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.405220 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"03c74212-8bb7-45e3-8110-cf65a7288caf","Type":"ContainerDied","Data":"615060e1af2dfe7fbd02b0d8cce6b59ed021604872494c07d588aefe025b7b19"}
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.405360 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"03c74212-8bb7-45e3-8110-cf65a7288caf","Type":"ContainerDied","Data":"4bf935943d30c1cd0f803f72341ef44c92717eec2fa8c25c11008a2fc171bab3"}
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.405385 4878 scope.go:117] "RemoveContainer" containerID="615060e1af2dfe7fbd02b0d8cce6b59ed021604872494c07d588aefe025b7b19"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.445346 4878 scope.go:117] "RemoveContainer" containerID="588b83fede1aace8241b9039de9115d1211dd4865a8071f52c33d39b53355a88"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.455762 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"]
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.476327 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"]
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.488798 4878 scope.go:117] "RemoveContainer" containerID="53dafd16d64aa4851c8a415495e8215ec1788f1e234a098ddcf8b1f2e59a0e98"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.498676 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"]
Dec 02 18:40:56 crc kubenswrapper[4878]: E1202 18:40:56.499340 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03c74212-8bb7-45e3-8110-cf65a7288caf" containerName="aodh-evaluator"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.499359 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="03c74212-8bb7-45e3-8110-cf65a7288caf" containerName="aodh-evaluator"
Dec 02 18:40:56 crc kubenswrapper[4878]: E1202 18:40:56.499386 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03c74212-8bb7-45e3-8110-cf65a7288caf" containerName="aodh-api"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.499392 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="03c74212-8bb7-45e3-8110-cf65a7288caf" containerName="aodh-api"
Dec 02 18:40:56 crc kubenswrapper[4878]: E1202 18:40:56.499429 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03c74212-8bb7-45e3-8110-cf65a7288caf" containerName="aodh-listener"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.499436 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="03c74212-8bb7-45e3-8110-cf65a7288caf" containerName="aodh-listener"
Dec 02 18:40:56 crc kubenswrapper[4878]: E1202 18:40:56.499462 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03c74212-8bb7-45e3-8110-cf65a7288caf" containerName="aodh-notifier"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.499470 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="03c74212-8bb7-45e3-8110-cf65a7288caf" containerName="aodh-notifier"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.499697 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="03c74212-8bb7-45e3-8110-cf65a7288caf" containerName="aodh-notifier"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.499719 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="03c74212-8bb7-45e3-8110-cf65a7288caf" containerName="aodh-api"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.499746 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="03c74212-8bb7-45e3-8110-cf65a7288caf" containerName="aodh-listener"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.499757 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="03c74212-8bb7-45e3-8110-cf65a7288caf" containerName="aodh-evaluator"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.503937 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.509628 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.509867 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.510085 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.510329 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-f79q6"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.510496 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.532996 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.562731 4878 scope.go:117] "RemoveContainer" containerID="93e46e345d332d1966e0deedca63a1b6e3abdade5eea210de41d63721339a601"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.662669 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-internal-tls-certs\") pod \"aodh-0\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " pod="openstack/aodh-0"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.662864 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " pod="openstack/aodh-0"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.662955 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-public-tls-certs\") pod \"aodh-0\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " pod="openstack/aodh-0"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.663072 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-config-data\") pod \"aodh-0\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " pod="openstack/aodh-0"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.664823 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-scripts\") pod \"aodh-0\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " pod="openstack/aodh-0"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.665105 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfbxk\" (UniqueName: \"kubernetes.io/projected/ea2cae47-1c1a-408f-b391-3641eae02402-kube-api-access-xfbxk\") pod \"aodh-0\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " pod="openstack/aodh-0"
Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.669571 4878 scope.go:117] "RemoveContainer" containerID="615060e1af2dfe7fbd02b0d8cce6b59ed021604872494c07d588aefe025b7b19"
Dec 02 18:40:56 crc kubenswrapper[4878]: E1202 18:40:56.670300 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"615060e1af2dfe7fbd02b0d8cce6b59ed021604872494c07d588aefe025b7b19\": container with ID starting with 
615060e1af2dfe7fbd02b0d8cce6b59ed021604872494c07d588aefe025b7b19 not found: ID does not exist" containerID="615060e1af2dfe7fbd02b0d8cce6b59ed021604872494c07d588aefe025b7b19" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.670332 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"615060e1af2dfe7fbd02b0d8cce6b59ed021604872494c07d588aefe025b7b19"} err="failed to get container status \"615060e1af2dfe7fbd02b0d8cce6b59ed021604872494c07d588aefe025b7b19\": rpc error: code = NotFound desc = could not find container \"615060e1af2dfe7fbd02b0d8cce6b59ed021604872494c07d588aefe025b7b19\": container with ID starting with 615060e1af2dfe7fbd02b0d8cce6b59ed021604872494c07d588aefe025b7b19 not found: ID does not exist" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.670355 4878 scope.go:117] "RemoveContainer" containerID="588b83fede1aace8241b9039de9115d1211dd4865a8071f52c33d39b53355a88" Dec 02 18:40:56 crc kubenswrapper[4878]: E1202 18:40:56.670741 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"588b83fede1aace8241b9039de9115d1211dd4865a8071f52c33d39b53355a88\": container with ID starting with 588b83fede1aace8241b9039de9115d1211dd4865a8071f52c33d39b53355a88 not found: ID does not exist" containerID="588b83fede1aace8241b9039de9115d1211dd4865a8071f52c33d39b53355a88" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.670769 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"588b83fede1aace8241b9039de9115d1211dd4865a8071f52c33d39b53355a88"} err="failed to get container status \"588b83fede1aace8241b9039de9115d1211dd4865a8071f52c33d39b53355a88\": rpc error: code = NotFound desc = could not find container \"588b83fede1aace8241b9039de9115d1211dd4865a8071f52c33d39b53355a88\": container with ID starting with 588b83fede1aace8241b9039de9115d1211dd4865a8071f52c33d39b53355a88 not found: ID does not 
exist" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.670786 4878 scope.go:117] "RemoveContainer" containerID="53dafd16d64aa4851c8a415495e8215ec1788f1e234a098ddcf8b1f2e59a0e98" Dec 02 18:40:56 crc kubenswrapper[4878]: E1202 18:40:56.671110 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53dafd16d64aa4851c8a415495e8215ec1788f1e234a098ddcf8b1f2e59a0e98\": container with ID starting with 53dafd16d64aa4851c8a415495e8215ec1788f1e234a098ddcf8b1f2e59a0e98 not found: ID does not exist" containerID="53dafd16d64aa4851c8a415495e8215ec1788f1e234a098ddcf8b1f2e59a0e98" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.671160 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53dafd16d64aa4851c8a415495e8215ec1788f1e234a098ddcf8b1f2e59a0e98"} err="failed to get container status \"53dafd16d64aa4851c8a415495e8215ec1788f1e234a098ddcf8b1f2e59a0e98\": rpc error: code = NotFound desc = could not find container \"53dafd16d64aa4851c8a415495e8215ec1788f1e234a098ddcf8b1f2e59a0e98\": container with ID starting with 53dafd16d64aa4851c8a415495e8215ec1788f1e234a098ddcf8b1f2e59a0e98 not found: ID does not exist" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.671189 4878 scope.go:117] "RemoveContainer" containerID="93e46e345d332d1966e0deedca63a1b6e3abdade5eea210de41d63721339a601" Dec 02 18:40:56 crc kubenswrapper[4878]: E1202 18:40:56.671685 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93e46e345d332d1966e0deedca63a1b6e3abdade5eea210de41d63721339a601\": container with ID starting with 93e46e345d332d1966e0deedca63a1b6e3abdade5eea210de41d63721339a601 not found: ID does not exist" containerID="93e46e345d332d1966e0deedca63a1b6e3abdade5eea210de41d63721339a601" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.671717 4878 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93e46e345d332d1966e0deedca63a1b6e3abdade5eea210de41d63721339a601"} err="failed to get container status \"93e46e345d332d1966e0deedca63a1b6e3abdade5eea210de41d63721339a601\": rpc error: code = NotFound desc = could not find container \"93e46e345d332d1966e0deedca63a1b6e3abdade5eea210de41d63721339a601\": container with ID starting with 93e46e345d332d1966e0deedca63a1b6e3abdade5eea210de41d63721339a601 not found: ID does not exist" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.768142 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-scripts\") pod \"aodh-0\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " pod="openstack/aodh-0" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.768293 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfbxk\" (UniqueName: \"kubernetes.io/projected/ea2cae47-1c1a-408f-b391-3641eae02402-kube-api-access-xfbxk\") pod \"aodh-0\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " pod="openstack/aodh-0" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.768373 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-internal-tls-certs\") pod \"aodh-0\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " pod="openstack/aodh-0" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.768465 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " pod="openstack/aodh-0" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.768527 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-public-tls-certs\") pod \"aodh-0\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " pod="openstack/aodh-0" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.768624 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-config-data\") pod \"aodh-0\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " pod="openstack/aodh-0" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.776388 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-scripts\") pod \"aodh-0\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " pod="openstack/aodh-0" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.777038 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " pod="openstack/aodh-0" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.778486 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-public-tls-certs\") pod \"aodh-0\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " pod="openstack/aodh-0" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.778941 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-internal-tls-certs\") pod \"aodh-0\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " pod="openstack/aodh-0" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.780102 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-config-data\") pod \"aodh-0\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " pod="openstack/aodh-0" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.794469 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfbxk\" (UniqueName: \"kubernetes.io/projected/ea2cae47-1c1a-408f-b391-3641eae02402-kube-api-access-xfbxk\") pod \"aodh-0\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " pod="openstack/aodh-0" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.914766 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 02 18:40:56 crc kubenswrapper[4878]: I1202 18:40:56.954182 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03c74212-8bb7-45e3-8110-cf65a7288caf" path="/var/lib/kubelet/pods/03c74212-8bb7-45e3-8110-cf65a7288caf/volumes" Dec 02 18:40:57 crc kubenswrapper[4878]: W1202 18:40:57.503527 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea2cae47_1c1a_408f_b391_3641eae02402.slice/crio-15a37a81b0defc0ef48fb47014118e14b5981d42a04b9c61610c825ffcd9d221 WatchSource:0}: Error finding container 15a37a81b0defc0ef48fb47014118e14b5981d42a04b9c61610c825ffcd9d221: Status 404 returned error can't find the container with id 15a37a81b0defc0ef48fb47014118e14b5981d42a04b9c61610c825ffcd9d221 Dec 02 18:40:57 crc kubenswrapper[4878]: I1202 18:40:57.508069 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 02 18:40:57 crc kubenswrapper[4878]: I1202 18:40:57.776228 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 02 18:40:58 crc kubenswrapper[4878]: I1202 18:40:58.435777 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/aodh-0" event={"ID":"ea2cae47-1c1a-408f-b391-3641eae02402","Type":"ContainerStarted","Data":"eb3c639a21dbe6bd8568cabf49415e365cb5dbcd4fbbbc27cb68186ba16f1fc3"} Dec 02 18:40:58 crc kubenswrapper[4878]: I1202 18:40:58.436459 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ea2cae47-1c1a-408f-b391-3641eae02402","Type":"ContainerStarted","Data":"15a37a81b0defc0ef48fb47014118e14b5981d42a04b9c61610c825ffcd9d221"} Dec 02 18:40:59 crc kubenswrapper[4878]: I1202 18:40:59.452151 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ea2cae47-1c1a-408f-b391-3641eae02402","Type":"ContainerStarted","Data":"2173816ea965e2df9673508cf3b7318be20d81bcfaacff2d38b83bb175088dcf"} Dec 02 18:40:59 crc kubenswrapper[4878]: I1202 18:40:59.582012 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 18:40:59 crc kubenswrapper[4878]: I1202 18:40:59.582461 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 18:40:59 crc kubenswrapper[4878]: I1202 18:40:59.582739 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 18:40:59 crc kubenswrapper[4878]: I1202 18:40:59.582794 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 18:40:59 crc kubenswrapper[4878]: I1202 18:40:59.586418 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 18:40:59 crc kubenswrapper[4878]: I1202 18:40:59.589384 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 18:40:59 crc kubenswrapper[4878]: I1202 18:40:59.840741 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8f58d9b47-c9bc9"] Dec 02 18:40:59 crc kubenswrapper[4878]: I1202 18:40:59.846496 4878 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:40:59 crc kubenswrapper[4878]: I1202 18:40:59.903959 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8f58d9b47-c9bc9"] Dec 02 18:40:59 crc kubenswrapper[4878]: I1202 18:40:59.913740 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-ovsdbserver-sb\") pod \"dnsmasq-dns-8f58d9b47-c9bc9\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") " pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:40:59 crc kubenswrapper[4878]: I1202 18:40:59.913786 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-ovsdbserver-nb\") pod \"dnsmasq-dns-8f58d9b47-c9bc9\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") " pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:40:59 crc kubenswrapper[4878]: I1202 18:40:59.913885 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-dns-svc\") pod \"dnsmasq-dns-8f58d9b47-c9bc9\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") " pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:40:59 crc kubenswrapper[4878]: I1202 18:40:59.914430 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8p4g\" (UniqueName: \"kubernetes.io/projected/45e92901-83f8-42f0-8bfb-ff6cb1805d81-kube-api-access-p8p4g\") pod \"dnsmasq-dns-8f58d9b47-c9bc9\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") " pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:40:59 crc kubenswrapper[4878]: I1202 18:40:59.914591 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-config\") pod \"dnsmasq-dns-8f58d9b47-c9bc9\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") " pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:40:59 crc kubenswrapper[4878]: I1202 18:40:59.914613 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-dns-swift-storage-0\") pod \"dnsmasq-dns-8f58d9b47-c9bc9\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") " pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:41:00 crc kubenswrapper[4878]: I1202 18:41:00.017780 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8p4g\" (UniqueName: \"kubernetes.io/projected/45e92901-83f8-42f0-8bfb-ff6cb1805d81-kube-api-access-p8p4g\") pod \"dnsmasq-dns-8f58d9b47-c9bc9\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") " pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:41:00 crc kubenswrapper[4878]: I1202 18:41:00.018186 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-config\") pod \"dnsmasq-dns-8f58d9b47-c9bc9\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") " pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:41:00 crc kubenswrapper[4878]: I1202 18:41:00.018226 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-dns-swift-storage-0\") pod \"dnsmasq-dns-8f58d9b47-c9bc9\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") " pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:41:00 crc kubenswrapper[4878]: I1202 18:41:00.018276 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-ovsdbserver-sb\") pod \"dnsmasq-dns-8f58d9b47-c9bc9\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") " pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:41:00 crc kubenswrapper[4878]: I1202 18:41:00.018301 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-ovsdbserver-nb\") pod \"dnsmasq-dns-8f58d9b47-c9bc9\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") " pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:41:00 crc kubenswrapper[4878]: I1202 18:41:00.018399 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-dns-svc\") pod \"dnsmasq-dns-8f58d9b47-c9bc9\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") " pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:41:00 crc kubenswrapper[4878]: I1202 18:41:00.019750 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-ovsdbserver-sb\") pod \"dnsmasq-dns-8f58d9b47-c9bc9\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") " pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:41:00 crc kubenswrapper[4878]: I1202 18:41:00.020815 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-ovsdbserver-nb\") pod \"dnsmasq-dns-8f58d9b47-c9bc9\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") " pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:41:00 crc kubenswrapper[4878]: I1202 18:41:00.021808 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-config\") pod \"dnsmasq-dns-8f58d9b47-c9bc9\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") " pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:41:00 crc kubenswrapper[4878]: I1202 18:41:00.021829 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-dns-svc\") pod \"dnsmasq-dns-8f58d9b47-c9bc9\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") " pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:41:00 crc kubenswrapper[4878]: I1202 18:41:00.021878 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-dns-swift-storage-0\") pod \"dnsmasq-dns-8f58d9b47-c9bc9\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") " pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:41:00 crc kubenswrapper[4878]: I1202 18:41:00.060114 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8p4g\" (UniqueName: \"kubernetes.io/projected/45e92901-83f8-42f0-8bfb-ff6cb1805d81-kube-api-access-p8p4g\") pod \"dnsmasq-dns-8f58d9b47-c9bc9\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") " pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:41:00 crc kubenswrapper[4878]: I1202 18:41:00.208734 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:41:00 crc kubenswrapper[4878]: I1202 18:41:00.494065 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ea2cae47-1c1a-408f-b391-3641eae02402","Type":"ContainerStarted","Data":"b56f8908b788f3b4107f06976ae35147a9be3aa0d598a261f4a3013513d7fc18"} Dec 02 18:41:00 crc kubenswrapper[4878]: I1202 18:41:00.892912 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8f58d9b47-c9bc9"] Dec 02 18:41:00 crc kubenswrapper[4878]: W1202 18:41:00.908396 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45e92901_83f8_42f0_8bfb_ff6cb1805d81.slice/crio-185293ca4651006b6d60c10e741a8b61408f2d06769b0842dfd708d90c8f26de WatchSource:0}: Error finding container 185293ca4651006b6d60c10e741a8b61408f2d06769b0842dfd708d90c8f26de: Status 404 returned error can't find the container with id 185293ca4651006b6d60c10e741a8b61408f2d06769b0842dfd708d90c8f26de Dec 02 18:41:01 crc kubenswrapper[4878]: I1202 18:41:01.523107 4878 generic.go:334] "Generic (PLEG): container finished" podID="45e92901-83f8-42f0-8bfb-ff6cb1805d81" containerID="e9e5d29b70b7d19a58bd876853389224d3d6060d20419ea7835d1863e09d658d" exitCode=0 Dec 02 18:41:01 crc kubenswrapper[4878]: I1202 18:41:01.523376 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" event={"ID":"45e92901-83f8-42f0-8bfb-ff6cb1805d81","Type":"ContainerDied","Data":"e9e5d29b70b7d19a58bd876853389224d3d6060d20419ea7835d1863e09d658d"} Dec 02 18:41:01 crc kubenswrapper[4878]: I1202 18:41:01.523613 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" event={"ID":"45e92901-83f8-42f0-8bfb-ff6cb1805d81","Type":"ContainerStarted","Data":"185293ca4651006b6d60c10e741a8b61408f2d06769b0842dfd708d90c8f26de"} Dec 02 18:41:01 crc kubenswrapper[4878]: I1202 
18:41:01.561022 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ea2cae47-1c1a-408f-b391-3641eae02402","Type":"ContainerStarted","Data":"b8114f01b1b2e1e08f3a2b037fb46c96a55a24484d3bdd39991c68eef518f8db"} Dec 02 18:41:01 crc kubenswrapper[4878]: I1202 18:41:01.605952 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.710983664 podStartE2EDuration="5.605932486s" podCreationTimestamp="2025-12-02 18:40:56 +0000 UTC" firstStartedPulling="2025-12-02 18:40:57.506622237 +0000 UTC m=+1567.196241108" lastFinishedPulling="2025-12-02 18:41:00.401571049 +0000 UTC m=+1570.091189930" observedRunningTime="2025-12-02 18:41:01.595036931 +0000 UTC m=+1571.284655802" watchObservedRunningTime="2025-12-02 18:41:01.605932486 +0000 UTC m=+1571.295551367" Dec 02 18:41:02 crc kubenswrapper[4878]: I1202 18:41:02.618328 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" event={"ID":"45e92901-83f8-42f0-8bfb-ff6cb1805d81","Type":"ContainerStarted","Data":"35aef475224e0b69a253e10e31b046715a420fa009197bf5c3a9989e227223cb"} Dec 02 18:41:02 crc kubenswrapper[4878]: I1202 18:41:02.619317 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" Dec 02 18:41:02 crc kubenswrapper[4878]: I1202 18:41:02.673840 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" podStartSLOduration=3.673823691 podStartE2EDuration="3.673823691s" podCreationTimestamp="2025-12-02 18:40:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:41:02.663350482 +0000 UTC m=+1572.352969363" watchObservedRunningTime="2025-12-02 18:41:02.673823691 +0000 UTC m=+1572.363442572" Dec 02 18:41:02 crc kubenswrapper[4878]: I1202 18:41:02.776642 4878 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 02 18:41:02 crc kubenswrapper[4878]: I1202 18:41:02.816704 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 02 18:41:03 crc kubenswrapper[4878]: I1202 18:41:03.198657 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 18:41:03 crc kubenswrapper[4878]: I1202 18:41:03.200756 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0dc4ac14-dee9-4363-9540-ba11ad022b19" containerName="nova-api-log" containerID="cri-o://ce706fa53febb974dc0c654c2da2ce00b1d5631385412ab80e91bdec050b60e1" gracePeriod=30 Dec 02 18:41:03 crc kubenswrapper[4878]: I1202 18:41:03.202008 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0dc4ac14-dee9-4363-9540-ba11ad022b19" containerName="nova-api-api" containerID="cri-o://e6c0467562abc55de7c1d96ae4d04d1a6137361d8328cbe53c9478a124162c9f" gracePeriod=30 Dec 02 18:41:03 crc kubenswrapper[4878]: I1202 18:41:03.657569 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 02 18:41:03 crc kubenswrapper[4878]: I1202 18:41:03.899863 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-6697v"] Dec 02 18:41:03 crc kubenswrapper[4878]: I1202 18:41:03.901839 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6697v" Dec 02 18:41:03 crc kubenswrapper[4878]: I1202 18:41:03.907514 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 02 18:41:03 crc kubenswrapper[4878]: I1202 18:41:03.911637 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 02 18:41:03 crc kubenswrapper[4878]: I1202 18:41:03.912659 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6697v"] Dec 02 18:41:04 crc kubenswrapper[4878]: I1202 18:41:04.025550 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f78c0caa-65ba-4a70-a14d-067faf81a1fa-scripts\") pod \"nova-cell1-cell-mapping-6697v\" (UID: \"f78c0caa-65ba-4a70-a14d-067faf81a1fa\") " pod="openstack/nova-cell1-cell-mapping-6697v" Dec 02 18:41:04 crc kubenswrapper[4878]: I1202 18:41:04.025961 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78c0caa-65ba-4a70-a14d-067faf81a1fa-config-data\") pod \"nova-cell1-cell-mapping-6697v\" (UID: \"f78c0caa-65ba-4a70-a14d-067faf81a1fa\") " pod="openstack/nova-cell1-cell-mapping-6697v" Dec 02 18:41:04 crc kubenswrapper[4878]: I1202 18:41:04.026292 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l4sz\" (UniqueName: \"kubernetes.io/projected/f78c0caa-65ba-4a70-a14d-067faf81a1fa-kube-api-access-4l4sz\") pod \"nova-cell1-cell-mapping-6697v\" (UID: \"f78c0caa-65ba-4a70-a14d-067faf81a1fa\") " pod="openstack/nova-cell1-cell-mapping-6697v" Dec 02 18:41:04 crc kubenswrapper[4878]: I1202 18:41:04.026380 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f78c0caa-65ba-4a70-a14d-067faf81a1fa-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6697v\" (UID: \"f78c0caa-65ba-4a70-a14d-067faf81a1fa\") " pod="openstack/nova-cell1-cell-mapping-6697v" Dec 02 18:41:04 crc kubenswrapper[4878]: I1202 18:41:04.141197 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l4sz\" (UniqueName: \"kubernetes.io/projected/f78c0caa-65ba-4a70-a14d-067faf81a1fa-kube-api-access-4l4sz\") pod \"nova-cell1-cell-mapping-6697v\" (UID: \"f78c0caa-65ba-4a70-a14d-067faf81a1fa\") " pod="openstack/nova-cell1-cell-mapping-6697v" Dec 02 18:41:04 crc kubenswrapper[4878]: I1202 18:41:04.141572 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78c0caa-65ba-4a70-a14d-067faf81a1fa-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6697v\" (UID: \"f78c0caa-65ba-4a70-a14d-067faf81a1fa\") " pod="openstack/nova-cell1-cell-mapping-6697v" Dec 02 18:41:04 crc kubenswrapper[4878]: I1202 18:41:04.143508 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f78c0caa-65ba-4a70-a14d-067faf81a1fa-scripts\") pod \"nova-cell1-cell-mapping-6697v\" (UID: \"f78c0caa-65ba-4a70-a14d-067faf81a1fa\") " pod="openstack/nova-cell1-cell-mapping-6697v" Dec 02 18:41:04 crc kubenswrapper[4878]: I1202 18:41:04.143750 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78c0caa-65ba-4a70-a14d-067faf81a1fa-config-data\") pod \"nova-cell1-cell-mapping-6697v\" (UID: \"f78c0caa-65ba-4a70-a14d-067faf81a1fa\") " pod="openstack/nova-cell1-cell-mapping-6697v" Dec 02 18:41:04 crc kubenswrapper[4878]: I1202 18:41:04.161082 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f78c0caa-65ba-4a70-a14d-067faf81a1fa-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6697v\" (UID: \"f78c0caa-65ba-4a70-a14d-067faf81a1fa\") " pod="openstack/nova-cell1-cell-mapping-6697v" Dec 02 18:41:04 crc kubenswrapper[4878]: I1202 18:41:04.161176 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f78c0caa-65ba-4a70-a14d-067faf81a1fa-scripts\") pod \"nova-cell1-cell-mapping-6697v\" (UID: \"f78c0caa-65ba-4a70-a14d-067faf81a1fa\") " pod="openstack/nova-cell1-cell-mapping-6697v" Dec 02 18:41:04 crc kubenswrapper[4878]: I1202 18:41:04.161788 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78c0caa-65ba-4a70-a14d-067faf81a1fa-config-data\") pod \"nova-cell1-cell-mapping-6697v\" (UID: \"f78c0caa-65ba-4a70-a14d-067faf81a1fa\") " pod="openstack/nova-cell1-cell-mapping-6697v" Dec 02 18:41:04 crc kubenswrapper[4878]: I1202 18:41:04.175543 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l4sz\" (UniqueName: \"kubernetes.io/projected/f78c0caa-65ba-4a70-a14d-067faf81a1fa-kube-api-access-4l4sz\") pod \"nova-cell1-cell-mapping-6697v\" (UID: \"f78c0caa-65ba-4a70-a14d-067faf81a1fa\") " pod="openstack/nova-cell1-cell-mapping-6697v" Dec 02 18:41:04 crc kubenswrapper[4878]: I1202 18:41:04.243690 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6697v" Dec 02 18:41:04 crc kubenswrapper[4878]: I1202 18:41:04.676768 4878 generic.go:334] "Generic (PLEG): container finished" podID="0dc4ac14-dee9-4363-9540-ba11ad022b19" containerID="ce706fa53febb974dc0c654c2da2ce00b1d5631385412ab80e91bdec050b60e1" exitCode=143 Dec 02 18:41:04 crc kubenswrapper[4878]: I1202 18:41:04.676856 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0dc4ac14-dee9-4363-9540-ba11ad022b19","Type":"ContainerDied","Data":"ce706fa53febb974dc0c654c2da2ce00b1d5631385412ab80e91bdec050b60e1"} Dec 02 18:41:04 crc kubenswrapper[4878]: I1202 18:41:04.847269 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6697v"] Dec 02 18:41:05 crc kubenswrapper[4878]: I1202 18:41:05.696590 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6697v" event={"ID":"f78c0caa-65ba-4a70-a14d-067faf81a1fa","Type":"ContainerStarted","Data":"f07e4edfb83891c4e9b72db710b90e724310d12f096e9b918491ad8dbbcc45a2"} Dec 02 18:41:05 crc kubenswrapper[4878]: I1202 18:41:05.697090 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6697v" event={"ID":"f78c0caa-65ba-4a70-a14d-067faf81a1fa","Type":"ContainerStarted","Data":"328045402f7801afc9bb1c05b528d897c0f70da4dc671375ea7e2674f95e29d3"} Dec 02 18:41:05 crc kubenswrapper[4878]: I1202 18:41:05.726397 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-6697v" podStartSLOduration=2.726368639 podStartE2EDuration="2.726368639s" podCreationTimestamp="2025-12-02 18:41:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:41:05.72480727 +0000 UTC m=+1575.414426151" watchObservedRunningTime="2025-12-02 18:41:05.726368639 +0000 UTC m=+1575.415987540" Dec 02 
18:41:05 crc kubenswrapper[4878]: I1202 18:41:05.784233 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:41:05 crc kubenswrapper[4878]: I1202 18:41:05.784600 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b2748089-13af-4695-a743-056baad129c3" containerName="ceilometer-central-agent" containerID="cri-o://549a0e87697b7c493198e8b8157e2f4096500e84c1ab424c8670767811a74329" gracePeriod=30 Dec 02 18:41:05 crc kubenswrapper[4878]: I1202 18:41:05.785254 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b2748089-13af-4695-a743-056baad129c3" containerName="proxy-httpd" containerID="cri-o://a388a568068aeacba95616e5d55e202c6f5b48e4357b17b7ca539079cd75b10c" gracePeriod=30 Dec 02 18:41:05 crc kubenswrapper[4878]: I1202 18:41:05.785440 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b2748089-13af-4695-a743-056baad129c3" containerName="ceilometer-notification-agent" containerID="cri-o://22f7d295543d6e0535c2f2063b08f0ac6ed501fcafa2307ce39899f08633528f" gracePeriod=30 Dec 02 18:41:05 crc kubenswrapper[4878]: I1202 18:41:05.785500 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b2748089-13af-4695-a743-056baad129c3" containerName="sg-core" containerID="cri-o://e51237e8f20d424581a377c02e0ec2ee702c5b0d08c6eed4bbd947de9241c9c2" gracePeriod=30 Dec 02 18:41:05 crc kubenswrapper[4878]: I1202 18:41:05.797949 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="b2748089-13af-4695-a743-056baad129c3" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.251:3000/\": EOF" Dec 02 18:41:06 crc kubenswrapper[4878]: I1202 18:41:06.372038 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-bcvm5" Dec 02 18:41:06 crc kubenswrapper[4878]: I1202 18:41:06.438634 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bcvm5"] Dec 02 18:41:06 crc kubenswrapper[4878]: I1202 18:41:06.574005 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="b2748089-13af-4695-a743-056baad129c3" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.251:3000/\": dial tcp 10.217.0.251:3000: connect: connection refused" Dec 02 18:41:06 crc kubenswrapper[4878]: I1202 18:41:06.718441 4878 generic.go:334] "Generic (PLEG): container finished" podID="0dc4ac14-dee9-4363-9540-ba11ad022b19" containerID="e6c0467562abc55de7c1d96ae4d04d1a6137361d8328cbe53c9478a124162c9f" exitCode=0 Dec 02 18:41:06 crc kubenswrapper[4878]: I1202 18:41:06.718506 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0dc4ac14-dee9-4363-9540-ba11ad022b19","Type":"ContainerDied","Data":"e6c0467562abc55de7c1d96ae4d04d1a6137361d8328cbe53c9478a124162c9f"} Dec 02 18:41:06 crc kubenswrapper[4878]: I1202 18:41:06.723491 4878 generic.go:334] "Generic (PLEG): container finished" podID="b2748089-13af-4695-a743-056baad129c3" containerID="a388a568068aeacba95616e5d55e202c6f5b48e4357b17b7ca539079cd75b10c" exitCode=0 Dec 02 18:41:06 crc kubenswrapper[4878]: I1202 18:41:06.723515 4878 generic.go:334] "Generic (PLEG): container finished" podID="b2748089-13af-4695-a743-056baad129c3" containerID="e51237e8f20d424581a377c02e0ec2ee702c5b0d08c6eed4bbd947de9241c9c2" exitCode=2 Dec 02 18:41:06 crc kubenswrapper[4878]: I1202 18:41:06.723526 4878 generic.go:334] "Generic (PLEG): container finished" podID="b2748089-13af-4695-a743-056baad129c3" containerID="549a0e87697b7c493198e8b8157e2f4096500e84c1ab424c8670767811a74329" exitCode=0 Dec 02 18:41:06 crc kubenswrapper[4878]: I1202 18:41:06.723567 4878 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"b2748089-13af-4695-a743-056baad129c3","Type":"ContainerDied","Data":"a388a568068aeacba95616e5d55e202c6f5b48e4357b17b7ca539079cd75b10c"} Dec 02 18:41:06 crc kubenswrapper[4878]: I1202 18:41:06.723639 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b2748089-13af-4695-a743-056baad129c3","Type":"ContainerDied","Data":"e51237e8f20d424581a377c02e0ec2ee702c5b0d08c6eed4bbd947de9241c9c2"} Dec 02 18:41:06 crc kubenswrapper[4878]: I1202 18:41:06.723660 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b2748089-13af-4695-a743-056baad129c3","Type":"ContainerDied","Data":"549a0e87697b7c493198e8b8157e2f4096500e84c1ab424c8670767811a74329"} Dec 02 18:41:06 crc kubenswrapper[4878]: I1202 18:41:06.723729 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bcvm5" podUID="098668bc-e972-4c13-acfd-adfbed1f28c6" containerName="registry-server" containerID="cri-o://ff240b95abc00477dbe4b9d567227848be2ad1986ebab3f26a021e842fa10aed" gracePeriod=2 Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.084988 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.190447 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dc4ac14-dee9-4363-9540-ba11ad022b19-config-data\") pod \"0dc4ac14-dee9-4363-9540-ba11ad022b19\" (UID: \"0dc4ac14-dee9-4363-9540-ba11ad022b19\") " Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.190574 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2tzj\" (UniqueName: \"kubernetes.io/projected/0dc4ac14-dee9-4363-9540-ba11ad022b19-kube-api-access-l2tzj\") pod \"0dc4ac14-dee9-4363-9540-ba11ad022b19\" (UID: \"0dc4ac14-dee9-4363-9540-ba11ad022b19\") " Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.190682 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dc4ac14-dee9-4363-9540-ba11ad022b19-combined-ca-bundle\") pod \"0dc4ac14-dee9-4363-9540-ba11ad022b19\" (UID: \"0dc4ac14-dee9-4363-9540-ba11ad022b19\") " Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.190765 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0dc4ac14-dee9-4363-9540-ba11ad022b19-logs\") pod \"0dc4ac14-dee9-4363-9540-ba11ad022b19\" (UID: \"0dc4ac14-dee9-4363-9540-ba11ad022b19\") " Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.192311 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dc4ac14-dee9-4363-9540-ba11ad022b19-logs" (OuterVolumeSpecName: "logs") pod "0dc4ac14-dee9-4363-9540-ba11ad022b19" (UID: "0dc4ac14-dee9-4363-9540-ba11ad022b19"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.207032 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dc4ac14-dee9-4363-9540-ba11ad022b19-kube-api-access-l2tzj" (OuterVolumeSpecName: "kube-api-access-l2tzj") pod "0dc4ac14-dee9-4363-9540-ba11ad022b19" (UID: "0dc4ac14-dee9-4363-9540-ba11ad022b19"). InnerVolumeSpecName "kube-api-access-l2tzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.230665 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dc4ac14-dee9-4363-9540-ba11ad022b19-config-data" (OuterVolumeSpecName: "config-data") pod "0dc4ac14-dee9-4363-9540-ba11ad022b19" (UID: "0dc4ac14-dee9-4363-9540-ba11ad022b19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.233567 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dc4ac14-dee9-4363-9540-ba11ad022b19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0dc4ac14-dee9-4363-9540-ba11ad022b19" (UID: "0dc4ac14-dee9-4363-9540-ba11ad022b19"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.294514 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dc4ac14-dee9-4363-9540-ba11ad022b19-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.294557 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2tzj\" (UniqueName: \"kubernetes.io/projected/0dc4ac14-dee9-4363-9540-ba11ad022b19-kube-api-access-l2tzj\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.294569 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dc4ac14-dee9-4363-9540-ba11ad022b19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.294578 4878 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0dc4ac14-dee9-4363-9540-ba11ad022b19-logs\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.378848 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bcvm5" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.502190 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2h7f\" (UniqueName: \"kubernetes.io/projected/098668bc-e972-4c13-acfd-adfbed1f28c6-kube-api-access-q2h7f\") pod \"098668bc-e972-4c13-acfd-adfbed1f28c6\" (UID: \"098668bc-e972-4c13-acfd-adfbed1f28c6\") " Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.502323 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/098668bc-e972-4c13-acfd-adfbed1f28c6-utilities\") pod \"098668bc-e972-4c13-acfd-adfbed1f28c6\" (UID: \"098668bc-e972-4c13-acfd-adfbed1f28c6\") " Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.502384 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/098668bc-e972-4c13-acfd-adfbed1f28c6-catalog-content\") pod \"098668bc-e972-4c13-acfd-adfbed1f28c6\" (UID: \"098668bc-e972-4c13-acfd-adfbed1f28c6\") " Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.503905 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/098668bc-e972-4c13-acfd-adfbed1f28c6-utilities" (OuterVolumeSpecName: "utilities") pod "098668bc-e972-4c13-acfd-adfbed1f28c6" (UID: "098668bc-e972-4c13-acfd-adfbed1f28c6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.508656 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/098668bc-e972-4c13-acfd-adfbed1f28c6-kube-api-access-q2h7f" (OuterVolumeSpecName: "kube-api-access-q2h7f") pod "098668bc-e972-4c13-acfd-adfbed1f28c6" (UID: "098668bc-e972-4c13-acfd-adfbed1f28c6"). InnerVolumeSpecName "kube-api-access-q2h7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.555954 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/098668bc-e972-4c13-acfd-adfbed1f28c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "098668bc-e972-4c13-acfd-adfbed1f28c6" (UID: "098668bc-e972-4c13-acfd-adfbed1f28c6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.606036 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2h7f\" (UniqueName: \"kubernetes.io/projected/098668bc-e972-4c13-acfd-adfbed1f28c6-kube-api-access-q2h7f\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.606344 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/098668bc-e972-4c13-acfd-adfbed1f28c6-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.606360 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/098668bc-e972-4c13-acfd-adfbed1f28c6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.738085 4878 generic.go:334] "Generic (PLEG): container finished" podID="098668bc-e972-4c13-acfd-adfbed1f28c6" containerID="ff240b95abc00477dbe4b9d567227848be2ad1986ebab3f26a021e842fa10aed" exitCode=0 Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.738162 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bcvm5" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.738175 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcvm5" event={"ID":"098668bc-e972-4c13-acfd-adfbed1f28c6","Type":"ContainerDied","Data":"ff240b95abc00477dbe4b9d567227848be2ad1986ebab3f26a021e842fa10aed"} Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.738253 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcvm5" event={"ID":"098668bc-e972-4c13-acfd-adfbed1f28c6","Type":"ContainerDied","Data":"676398bbb11bd300b370823cc56e3459f964bbb958baab2745a3fc811a38cc74"} Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.738276 4878 scope.go:117] "RemoveContainer" containerID="ff240b95abc00477dbe4b9d567227848be2ad1986ebab3f26a021e842fa10aed" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.743042 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0dc4ac14-dee9-4363-9540-ba11ad022b19","Type":"ContainerDied","Data":"74e7a7ad1cff913aef39331f7d6588254f39f71a75e58566e27d8a9095215bff"} Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.743174 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.785585 4878 scope.go:117] "RemoveContainer" containerID="b7dc6cf97c62c0c4d6d9e81229b40bf476843e5f1830d9e4428ddf784f162ab6" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.787041 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bcvm5"] Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.801646 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bcvm5"] Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.813220 4878 scope.go:117] "RemoveContainer" containerID="1f96122b407ddb8a960005ba520f13f5af29709105bd1cb5e1b1f34d0a88f6d2" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.814322 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.832036 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.873397 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 18:41:07 crc kubenswrapper[4878]: E1202 18:41:07.875414 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc4ac14-dee9-4363-9540-ba11ad022b19" containerName="nova-api-api" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.875563 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc4ac14-dee9-4363-9540-ba11ad022b19" containerName="nova-api-api" Dec 02 18:41:07 crc kubenswrapper[4878]: E1202 18:41:07.875643 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc4ac14-dee9-4363-9540-ba11ad022b19" containerName="nova-api-log" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.875694 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc4ac14-dee9-4363-9540-ba11ad022b19" containerName="nova-api-log" Dec 02 18:41:07 crc 
kubenswrapper[4878]: E1202 18:41:07.875768 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="098668bc-e972-4c13-acfd-adfbed1f28c6" containerName="extract-content" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.875824 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="098668bc-e972-4c13-acfd-adfbed1f28c6" containerName="extract-content" Dec 02 18:41:07 crc kubenswrapper[4878]: E1202 18:41:07.875882 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="098668bc-e972-4c13-acfd-adfbed1f28c6" containerName="extract-utilities" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.875938 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="098668bc-e972-4c13-acfd-adfbed1f28c6" containerName="extract-utilities" Dec 02 18:41:07 crc kubenswrapper[4878]: E1202 18:41:07.876015 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="098668bc-e972-4c13-acfd-adfbed1f28c6" containerName="registry-server" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.876073 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="098668bc-e972-4c13-acfd-adfbed1f28c6" containerName="registry-server" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.876370 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="098668bc-e972-4c13-acfd-adfbed1f28c6" containerName="registry-server" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.876458 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dc4ac14-dee9-4363-9540-ba11ad022b19" containerName="nova-api-log" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.876530 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dc4ac14-dee9-4363-9540-ba11ad022b19" containerName="nova-api-api" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.878034 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.887076 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.888619 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.889440 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.906966 4878 scope.go:117] "RemoveContainer" containerID="ff240b95abc00477dbe4b9d567227848be2ad1986ebab3f26a021e842fa10aed" Dec 02 18:41:07 crc kubenswrapper[4878]: E1202 18:41:07.908100 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff240b95abc00477dbe4b9d567227848be2ad1986ebab3f26a021e842fa10aed\": container with ID starting with ff240b95abc00477dbe4b9d567227848be2ad1986ebab3f26a021e842fa10aed not found: ID does not exist" containerID="ff240b95abc00477dbe4b9d567227848be2ad1986ebab3f26a021e842fa10aed" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.908136 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff240b95abc00477dbe4b9d567227848be2ad1986ebab3f26a021e842fa10aed"} err="failed to get container status \"ff240b95abc00477dbe4b9d567227848be2ad1986ebab3f26a021e842fa10aed\": rpc error: code = NotFound desc = could not find container \"ff240b95abc00477dbe4b9d567227848be2ad1986ebab3f26a021e842fa10aed\": container with ID starting with ff240b95abc00477dbe4b9d567227848be2ad1986ebab3f26a021e842fa10aed not found: ID does not exist" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.908160 4878 scope.go:117] "RemoveContainer" containerID="b7dc6cf97c62c0c4d6d9e81229b40bf476843e5f1830d9e4428ddf784f162ab6" Dec 02 18:41:07 crc 
kubenswrapper[4878]: E1202 18:41:07.908528 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7dc6cf97c62c0c4d6d9e81229b40bf476843e5f1830d9e4428ddf784f162ab6\": container with ID starting with b7dc6cf97c62c0c4d6d9e81229b40bf476843e5f1830d9e4428ddf784f162ab6 not found: ID does not exist" containerID="b7dc6cf97c62c0c4d6d9e81229b40bf476843e5f1830d9e4428ddf784f162ab6" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.908553 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7dc6cf97c62c0c4d6d9e81229b40bf476843e5f1830d9e4428ddf784f162ab6"} err="failed to get container status \"b7dc6cf97c62c0c4d6d9e81229b40bf476843e5f1830d9e4428ddf784f162ab6\": rpc error: code = NotFound desc = could not find container \"b7dc6cf97c62c0c4d6d9e81229b40bf476843e5f1830d9e4428ddf784f162ab6\": container with ID starting with b7dc6cf97c62c0c4d6d9e81229b40bf476843e5f1830d9e4428ddf784f162ab6 not found: ID does not exist" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.908567 4878 scope.go:117] "RemoveContainer" containerID="1f96122b407ddb8a960005ba520f13f5af29709105bd1cb5e1b1f34d0a88f6d2" Dec 02 18:41:07 crc kubenswrapper[4878]: E1202 18:41:07.908899 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f96122b407ddb8a960005ba520f13f5af29709105bd1cb5e1b1f34d0a88f6d2\": container with ID starting with 1f96122b407ddb8a960005ba520f13f5af29709105bd1cb5e1b1f34d0a88f6d2 not found: ID does not exist" containerID="1f96122b407ddb8a960005ba520f13f5af29709105bd1cb5e1b1f34d0a88f6d2" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.908927 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f96122b407ddb8a960005ba520f13f5af29709105bd1cb5e1b1f34d0a88f6d2"} err="failed to get container status 
\"1f96122b407ddb8a960005ba520f13f5af29709105bd1cb5e1b1f34d0a88f6d2\": rpc error: code = NotFound desc = could not find container \"1f96122b407ddb8a960005ba520f13f5af29709105bd1cb5e1b1f34d0a88f6d2\": container with ID starting with 1f96122b407ddb8a960005ba520f13f5af29709105bd1cb5e1b1f34d0a88f6d2 not found: ID does not exist" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.908940 4878 scope.go:117] "RemoveContainer" containerID="e6c0467562abc55de7c1d96ae4d04d1a6137361d8328cbe53c9478a124162c9f" Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.910768 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 18:41:07 crc kubenswrapper[4878]: I1202 18:41:07.956500 4878 scope.go:117] "RemoveContainer" containerID="ce706fa53febb974dc0c654c2da2ce00b1d5631385412ab80e91bdec050b60e1" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.024633 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") " pod="openstack/nova-api-0" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.024671 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") " pod="openstack/nova-api-0" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.024744 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2djkp\" (UniqueName: \"kubernetes.io/projected/2de87751-5aa7-4181-8593-9f7f3cfba1a0-kube-api-access-2djkp\") pod \"nova-api-0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") " pod="openstack/nova-api-0" Dec 02 18:41:08 crc 
kubenswrapper[4878]: I1202 18:41:08.024785 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-config-data\") pod \"nova-api-0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") " pod="openstack/nova-api-0" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.024829 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2de87751-5aa7-4181-8593-9f7f3cfba1a0-logs\") pod \"nova-api-0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") " pod="openstack/nova-api-0" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.024847 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-public-tls-certs\") pod \"nova-api-0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") " pod="openstack/nova-api-0" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.129776 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") " pod="openstack/nova-api-0" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.130061 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") " pod="openstack/nova-api-0" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.130231 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2djkp\" (UniqueName: 
\"kubernetes.io/projected/2de87751-5aa7-4181-8593-9f7f3cfba1a0-kube-api-access-2djkp\") pod \"nova-api-0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") " pod="openstack/nova-api-0" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.130373 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-config-data\") pod \"nova-api-0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") " pod="openstack/nova-api-0" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.130896 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2de87751-5aa7-4181-8593-9f7f3cfba1a0-logs\") pod \"nova-api-0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") " pod="openstack/nova-api-0" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.130993 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-public-tls-certs\") pod \"nova-api-0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") " pod="openstack/nova-api-0" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.131840 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2de87751-5aa7-4181-8593-9f7f3cfba1a0-logs\") pod \"nova-api-0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") " pod="openstack/nova-api-0" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.134266 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") " pod="openstack/nova-api-0" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.134587 4878 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-public-tls-certs\") pod \"nova-api-0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") " pod="openstack/nova-api-0" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.143469 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-config-data\") pod \"nova-api-0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") " pod="openstack/nova-api-0" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.150758 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") " pod="openstack/nova-api-0" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.153939 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2djkp\" (UniqueName: \"kubernetes.io/projected/2de87751-5aa7-4181-8593-9f7f3cfba1a0-kube-api-access-2djkp\") pod \"nova-api-0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") " pod="openstack/nova-api-0" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.209199 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 18:41:08 crc kubenswrapper[4878]: E1202 18:41:08.365688 4878 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2748089_13af_4695_a743_056baad129c3.slice/crio-conmon-22f7d295543d6e0535c2f2063b08f0ac6ed501fcafa2307ce39899f08633528f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2748089_13af_4695_a743_056baad129c3.slice/crio-22f7d295543d6e0535c2f2063b08f0ac6ed501fcafa2307ce39899f08633528f.scope\": RecentStats: unable to find data in memory cache]" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.620466 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.747728 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-config-data\") pod \"b2748089-13af-4695-a743-056baad129c3\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.749185 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-ceilometer-tls-certs\") pod \"b2748089-13af-4695-a743-056baad129c3\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.749320 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-combined-ca-bundle\") pod \"b2748089-13af-4695-a743-056baad129c3\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " Dec 02 18:41:08 crc kubenswrapper[4878]: 
I1202 18:41:08.749415 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b2748089-13af-4695-a743-056baad129c3-run-httpd\") pod \"b2748089-13af-4695-a743-056baad129c3\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.749645 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncplz\" (UniqueName: \"kubernetes.io/projected/b2748089-13af-4695-a743-056baad129c3-kube-api-access-ncplz\") pod \"b2748089-13af-4695-a743-056baad129c3\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.749791 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-scripts\") pod \"b2748089-13af-4695-a743-056baad129c3\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.749844 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2748089-13af-4695-a743-056baad129c3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b2748089-13af-4695-a743-056baad129c3" (UID: "b2748089-13af-4695-a743-056baad129c3"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.749953 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-sg-core-conf-yaml\") pod \"b2748089-13af-4695-a743-056baad129c3\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.750022 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b2748089-13af-4695-a743-056baad129c3-log-httpd\") pod \"b2748089-13af-4695-a743-056baad129c3\" (UID: \"b2748089-13af-4695-a743-056baad129c3\") " Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.751126 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2748089-13af-4695-a743-056baad129c3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b2748089-13af-4695-a743-056baad129c3" (UID: "b2748089-13af-4695-a743-056baad129c3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.752183 4878 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b2748089-13af-4695-a743-056baad129c3-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.752268 4878 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b2748089-13af-4695-a743-056baad129c3-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.755540 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-scripts" (OuterVolumeSpecName: "scripts") pod "b2748089-13af-4695-a743-056baad129c3" (UID: "b2748089-13af-4695-a743-056baad129c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.755871 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2748089-13af-4695-a743-056baad129c3-kube-api-access-ncplz" (OuterVolumeSpecName: "kube-api-access-ncplz") pod "b2748089-13af-4695-a743-056baad129c3" (UID: "b2748089-13af-4695-a743-056baad129c3"). InnerVolumeSpecName "kube-api-access-ncplz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.763895 4878 generic.go:334] "Generic (PLEG): container finished" podID="b2748089-13af-4695-a743-056baad129c3" containerID="22f7d295543d6e0535c2f2063b08f0ac6ed501fcafa2307ce39899f08633528f" exitCode=0 Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.763955 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b2748089-13af-4695-a743-056baad129c3","Type":"ContainerDied","Data":"22f7d295543d6e0535c2f2063b08f0ac6ed501fcafa2307ce39899f08633528f"} Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.764000 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b2748089-13af-4695-a743-056baad129c3","Type":"ContainerDied","Data":"bbe59899da51e745502cb0e10f7625d8cc3fc8dab8ea2fe89e925e1631e4af02"} Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.764024 4878 scope.go:117] "RemoveContainer" containerID="a388a568068aeacba95616e5d55e202c6f5b48e4357b17b7ca539079cd75b10c" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.764274 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.794498 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b2748089-13af-4695-a743-056baad129c3" (UID: "b2748089-13af-4695-a743-056baad129c3"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.807105 4878 scope.go:117] "RemoveContainer" containerID="e51237e8f20d424581a377c02e0ec2ee702c5b0d08c6eed4bbd947de9241c9c2" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.810690 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 18:41:08 crc kubenswrapper[4878]: W1202 18:41:08.822206 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2de87751_5aa7_4181_8593_9f7f3cfba1a0.slice/crio-207e99b576f2a1d6460d06eae16484b342e30c73de635735146050d1e11db37c WatchSource:0}: Error finding container 207e99b576f2a1d6460d06eae16484b342e30c73de635735146050d1e11db37c: Status 404 returned error can't find the container with id 207e99b576f2a1d6460d06eae16484b342e30c73de635735146050d1e11db37c Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.836393 4878 scope.go:117] "RemoveContainer" containerID="22f7d295543d6e0535c2f2063b08f0ac6ed501fcafa2307ce39899f08633528f" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.844771 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "b2748089-13af-4695-a743-056baad129c3" (UID: "b2748089-13af-4695-a743-056baad129c3"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.853385 4878 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.853404 4878 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.853414 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncplz\" (UniqueName: \"kubernetes.io/projected/b2748089-13af-4695-a743-056baad129c3-kube-api-access-ncplz\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.853423 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.856371 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2748089-13af-4695-a743-056baad129c3" (UID: "b2748089-13af-4695-a743-056baad129c3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.876801 4878 scope.go:117] "RemoveContainer" containerID="549a0e87697b7c493198e8b8157e2f4096500e84c1ab424c8670767811a74329" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.909762 4878 scope.go:117] "RemoveContainer" containerID="a388a568068aeacba95616e5d55e202c6f5b48e4357b17b7ca539079cd75b10c" Dec 02 18:41:08 crc kubenswrapper[4878]: E1202 18:41:08.914453 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a388a568068aeacba95616e5d55e202c6f5b48e4357b17b7ca539079cd75b10c\": container with ID starting with a388a568068aeacba95616e5d55e202c6f5b48e4357b17b7ca539079cd75b10c not found: ID does not exist" containerID="a388a568068aeacba95616e5d55e202c6f5b48e4357b17b7ca539079cd75b10c" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.914509 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a388a568068aeacba95616e5d55e202c6f5b48e4357b17b7ca539079cd75b10c"} err="failed to get container status \"a388a568068aeacba95616e5d55e202c6f5b48e4357b17b7ca539079cd75b10c\": rpc error: code = NotFound desc = could not find container \"a388a568068aeacba95616e5d55e202c6f5b48e4357b17b7ca539079cd75b10c\": container with ID starting with a388a568068aeacba95616e5d55e202c6f5b48e4357b17b7ca539079cd75b10c not found: ID does not exist" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.914533 4878 scope.go:117] "RemoveContainer" containerID="e51237e8f20d424581a377c02e0ec2ee702c5b0d08c6eed4bbd947de9241c9c2" Dec 02 18:41:08 crc kubenswrapper[4878]: E1202 18:41:08.915329 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e51237e8f20d424581a377c02e0ec2ee702c5b0d08c6eed4bbd947de9241c9c2\": container with ID starting with 
e51237e8f20d424581a377c02e0ec2ee702c5b0d08c6eed4bbd947de9241c9c2 not found: ID does not exist" containerID="e51237e8f20d424581a377c02e0ec2ee702c5b0d08c6eed4bbd947de9241c9c2" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.915348 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e51237e8f20d424581a377c02e0ec2ee702c5b0d08c6eed4bbd947de9241c9c2"} err="failed to get container status \"e51237e8f20d424581a377c02e0ec2ee702c5b0d08c6eed4bbd947de9241c9c2\": rpc error: code = NotFound desc = could not find container \"e51237e8f20d424581a377c02e0ec2ee702c5b0d08c6eed4bbd947de9241c9c2\": container with ID starting with e51237e8f20d424581a377c02e0ec2ee702c5b0d08c6eed4bbd947de9241c9c2 not found: ID does not exist" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.915363 4878 scope.go:117] "RemoveContainer" containerID="22f7d295543d6e0535c2f2063b08f0ac6ed501fcafa2307ce39899f08633528f" Dec 02 18:41:08 crc kubenswrapper[4878]: E1202 18:41:08.915693 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22f7d295543d6e0535c2f2063b08f0ac6ed501fcafa2307ce39899f08633528f\": container with ID starting with 22f7d295543d6e0535c2f2063b08f0ac6ed501fcafa2307ce39899f08633528f not found: ID does not exist" containerID="22f7d295543d6e0535c2f2063b08f0ac6ed501fcafa2307ce39899f08633528f" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.915715 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22f7d295543d6e0535c2f2063b08f0ac6ed501fcafa2307ce39899f08633528f"} err="failed to get container status \"22f7d295543d6e0535c2f2063b08f0ac6ed501fcafa2307ce39899f08633528f\": rpc error: code = NotFound desc = could not find container \"22f7d295543d6e0535c2f2063b08f0ac6ed501fcafa2307ce39899f08633528f\": container with ID starting with 22f7d295543d6e0535c2f2063b08f0ac6ed501fcafa2307ce39899f08633528f not found: ID does not 
exist" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.915730 4878 scope.go:117] "RemoveContainer" containerID="549a0e87697b7c493198e8b8157e2f4096500e84c1ab424c8670767811a74329" Dec 02 18:41:08 crc kubenswrapper[4878]: E1202 18:41:08.915930 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"549a0e87697b7c493198e8b8157e2f4096500e84c1ab424c8670767811a74329\": container with ID starting with 549a0e87697b7c493198e8b8157e2f4096500e84c1ab424c8670767811a74329 not found: ID does not exist" containerID="549a0e87697b7c493198e8b8157e2f4096500e84c1ab424c8670767811a74329" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.915951 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"549a0e87697b7c493198e8b8157e2f4096500e84c1ab424c8670767811a74329"} err="failed to get container status \"549a0e87697b7c493198e8b8157e2f4096500e84c1ab424c8670767811a74329\": rpc error: code = NotFound desc = could not find container \"549a0e87697b7c493198e8b8157e2f4096500e84c1ab424c8670767811a74329\": container with ID starting with 549a0e87697b7c493198e8b8157e2f4096500e84c1ab424c8670767811a74329 not found: ID does not exist" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.918342 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-config-data" (OuterVolumeSpecName: "config-data") pod "b2748089-13af-4695-a743-056baad129c3" (UID: "b2748089-13af-4695-a743-056baad129c3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.954686 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.955320 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2748089-13af-4695-a743-056baad129c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.967787 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="098668bc-e972-4c13-acfd-adfbed1f28c6" path="/var/lib/kubelet/pods/098668bc-e972-4c13-acfd-adfbed1f28c6/volumes" Dec 02 18:41:08 crc kubenswrapper[4878]: I1202 18:41:08.969978 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dc4ac14-dee9-4363-9540-ba11ad022b19" path="/var/lib/kubelet/pods/0dc4ac14-dee9-4363-9540-ba11ad022b19/volumes" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.107978 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.130293 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.145915 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:41:09 crc kubenswrapper[4878]: E1202 18:41:09.155076 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2748089-13af-4695-a743-056baad129c3" containerName="proxy-httpd" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.158914 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2748089-13af-4695-a743-056baad129c3" containerName="proxy-httpd" Dec 02 18:41:09 crc kubenswrapper[4878]: E1202 18:41:09.158955 4878 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2748089-13af-4695-a743-056baad129c3" containerName="ceilometer-notification-agent" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.158962 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2748089-13af-4695-a743-056baad129c3" containerName="ceilometer-notification-agent" Dec 02 18:41:09 crc kubenswrapper[4878]: E1202 18:41:09.158994 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2748089-13af-4695-a743-056baad129c3" containerName="ceilometer-central-agent" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.159002 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2748089-13af-4695-a743-056baad129c3" containerName="ceilometer-central-agent" Dec 02 18:41:09 crc kubenswrapper[4878]: E1202 18:41:09.159041 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2748089-13af-4695-a743-056baad129c3" containerName="sg-core" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.159047 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2748089-13af-4695-a743-056baad129c3" containerName="sg-core" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.159497 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2748089-13af-4695-a743-056baad129c3" containerName="ceilometer-notification-agent" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.159535 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2748089-13af-4695-a743-056baad129c3" containerName="proxy-httpd" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.159544 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2748089-13af-4695-a743-056baad129c3" containerName="sg-core" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.159564 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2748089-13af-4695-a743-056baad129c3" containerName="ceilometer-central-agent" Dec 02 18:41:09 crc 
kubenswrapper[4878]: I1202 18:41:09.167097 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.167265 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.171956 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.172292 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.176540 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.365412 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-scripts\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.365511 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1c8644c-f198-4162-9d8c-45104c4ab84e-run-httpd\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.365601 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.366328 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.366475 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.366594 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjcq7\" (UniqueName: \"kubernetes.io/projected/e1c8644c-f198-4162-9d8c-45104c4ab84e-kube-api-access-mjcq7\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.366687 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1c8644c-f198-4162-9d8c-45104c4ab84e-log-httpd\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.366754 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-config-data\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.468983 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-scripts\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.469393 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1c8644c-f198-4162-9d8c-45104c4ab84e-run-httpd\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.469470 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.469525 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.469596 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.469658 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjcq7\" (UniqueName: \"kubernetes.io/projected/e1c8644c-f198-4162-9d8c-45104c4ab84e-kube-api-access-mjcq7\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc 
kubenswrapper[4878]: I1202 18:41:09.469701 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1c8644c-f198-4162-9d8c-45104c4ab84e-log-httpd\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.469732 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-config-data\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.470988 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1c8644c-f198-4162-9d8c-45104c4ab84e-run-httpd\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.471006 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1c8644c-f198-4162-9d8c-45104c4ab84e-log-httpd\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.474675 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.474871 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-config-data\") pod \"ceilometer-0\" (UID: 
\"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.475506 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.475992 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.476194 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-scripts\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.492747 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjcq7\" (UniqueName: \"kubernetes.io/projected/e1c8644c-f198-4162-9d8c-45104c4ab84e-kube-api-access-mjcq7\") pod \"ceilometer-0\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " pod="openstack/ceilometer-0" Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.638215 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.787300 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2de87751-5aa7-4181-8593-9f7f3cfba1a0","Type":"ContainerStarted","Data":"03298f8b12bb5648fcde9bb2f96b596353a2f1809efb99b9378b376a64965193"}
Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.787632 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2de87751-5aa7-4181-8593-9f7f3cfba1a0","Type":"ContainerStarted","Data":"3c11d811c32b850d37b68e8055cfafbfc36110f5ce2fae34499e1d3155c683bc"}
Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.787648 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2de87751-5aa7-4181-8593-9f7f3cfba1a0","Type":"ContainerStarted","Data":"207e99b576f2a1d6460d06eae16484b342e30c73de635735146050d1e11db37c"}
Dec 02 18:41:09 crc kubenswrapper[4878]: I1202 18:41:09.826827 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.826800656 podStartE2EDuration="2.826800656s" podCreationTimestamp="2025-12-02 18:41:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:41:09.820417849 +0000 UTC m=+1579.510036750" watchObservedRunningTime="2025-12-02 18:41:09.826800656 +0000 UTC m=+1579.516419547"
Dec 02 18:41:10 crc kubenswrapper[4878]: I1202 18:41:10.191795 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 18:41:10 crc kubenswrapper[4878]: W1202 18:41:10.207082 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1c8644c_f198_4162_9d8c_45104c4ab84e.slice/crio-41aa30ef882c17b717c59f7a0ec786ac45f252ee60481640eaebbb3474d40051 WatchSource:0}: Error finding container 41aa30ef882c17b717c59f7a0ec786ac45f252ee60481640eaebbb3474d40051: Status 404 returned error can't find the container with id 41aa30ef882c17b717c59f7a0ec786ac45f252ee60481640eaebbb3474d40051
Dec 02 18:41:10 crc kubenswrapper[4878]: I1202 18:41:10.210431 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9"
Dec 02 18:41:10 crc kubenswrapper[4878]: I1202 18:41:10.375029 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-758d7bc895-6l69g"]
Dec 02 18:41:10 crc kubenswrapper[4878]: I1202 18:41:10.375377 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-758d7bc895-6l69g" podUID="6305987e-e2c7-4f75-b080-9bed005f003f" containerName="dnsmasq-dns" containerID="cri-o://0932c8c715307b869e16674e1569432e6dedd6e9a2f8ed169b4f99ff68b55052" gracePeriod=10
Dec 02 18:41:10 crc kubenswrapper[4878]: I1202 18:41:10.651351 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-758d7bc895-6l69g" podUID="6305987e-e2c7-4f75-b080-9bed005f003f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.242:5353: connect: connection refused"
Dec 02 18:41:10 crc kubenswrapper[4878]: I1202 18:41:10.809848 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1c8644c-f198-4162-9d8c-45104c4ab84e","Type":"ContainerStarted","Data":"41aa30ef882c17b717c59f7a0ec786ac45f252ee60481640eaebbb3474d40051"}
Dec 02 18:41:10 crc kubenswrapper[4878]: I1202 18:41:10.825225 4878 generic.go:334] "Generic (PLEG): container finished" podID="6305987e-e2c7-4f75-b080-9bed005f003f" containerID="0932c8c715307b869e16674e1569432e6dedd6e9a2f8ed169b4f99ff68b55052" exitCode=0
Dec 02 18:41:10 crc kubenswrapper[4878]: I1202 18:41:10.826734 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758d7bc895-6l69g" event={"ID":"6305987e-e2c7-4f75-b080-9bed005f003f","Type":"ContainerDied","Data":"0932c8c715307b869e16674e1569432e6dedd6e9a2f8ed169b4f99ff68b55052"}
Dec 02 18:41:10 crc kubenswrapper[4878]: I1202 18:41:10.963560 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2748089-13af-4695-a743-056baad129c3" path="/var/lib/kubelet/pods/b2748089-13af-4695-a743-056baad129c3/volumes"
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.206096 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758d7bc895-6l69g"
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.229995 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-config\") pod \"6305987e-e2c7-4f75-b080-9bed005f003f\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") "
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.230047 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-ovsdbserver-nb\") pod \"6305987e-e2c7-4f75-b080-9bed005f003f\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") "
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.230131 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkzx2\" (UniqueName: \"kubernetes.io/projected/6305987e-e2c7-4f75-b080-9bed005f003f-kube-api-access-pkzx2\") pod \"6305987e-e2c7-4f75-b080-9bed005f003f\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") "
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.230229 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-dns-swift-storage-0\") pod \"6305987e-e2c7-4f75-b080-9bed005f003f\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") "
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.230268 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-ovsdbserver-sb\") pod \"6305987e-e2c7-4f75-b080-9bed005f003f\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") "
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.230299 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-dns-svc\") pod \"6305987e-e2c7-4f75-b080-9bed005f003f\" (UID: \"6305987e-e2c7-4f75-b080-9bed005f003f\") "
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.239452 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6305987e-e2c7-4f75-b080-9bed005f003f-kube-api-access-pkzx2" (OuterVolumeSpecName: "kube-api-access-pkzx2") pod "6305987e-e2c7-4f75-b080-9bed005f003f" (UID: "6305987e-e2c7-4f75-b080-9bed005f003f"). InnerVolumeSpecName "kube-api-access-pkzx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.314409 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6305987e-e2c7-4f75-b080-9bed005f003f" (UID: "6305987e-e2c7-4f75-b080-9bed005f003f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.315141 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6305987e-e2c7-4f75-b080-9bed005f003f" (UID: "6305987e-e2c7-4f75-b080-9bed005f003f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.332497 4878 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.332531 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.332543 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkzx2\" (UniqueName: \"kubernetes.io/projected/6305987e-e2c7-4f75-b080-9bed005f003f-kube-api-access-pkzx2\") on node \"crc\" DevicePath \"\""
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.335114 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-config" (OuterVolumeSpecName: "config") pod "6305987e-e2c7-4f75-b080-9bed005f003f" (UID: "6305987e-e2c7-4f75-b080-9bed005f003f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.381220 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6305987e-e2c7-4f75-b080-9bed005f003f" (UID: "6305987e-e2c7-4f75-b080-9bed005f003f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.382597 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6305987e-e2c7-4f75-b080-9bed005f003f" (UID: "6305987e-e2c7-4f75-b080-9bed005f003f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.436050 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-config\") on node \"crc\" DevicePath \"\""
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.436104 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.436122 4878 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6305987e-e2c7-4f75-b080-9bed005f003f-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.842654 4878 generic.go:334] "Generic (PLEG): container finished" podID="f78c0caa-65ba-4a70-a14d-067faf81a1fa" containerID="f07e4edfb83891c4e9b72db710b90e724310d12f096e9b918491ad8dbbcc45a2" exitCode=0
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.842748 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6697v" event={"ID":"f78c0caa-65ba-4a70-a14d-067faf81a1fa","Type":"ContainerDied","Data":"f07e4edfb83891c4e9b72db710b90e724310d12f096e9b918491ad8dbbcc45a2"}
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.848103 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1c8644c-f198-4162-9d8c-45104c4ab84e","Type":"ContainerStarted","Data":"781f7a301f7b90e4eea9fe07539b0aaa3c285bbebd9ba34ffdd8b7262ede5a49"}
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.848204 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1c8644c-f198-4162-9d8c-45104c4ab84e","Type":"ContainerStarted","Data":"b10aee55cb1ebb6a6d6c82477341c1f3266fbcb29b4fe83eaa5e2bd23d527617"}
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.851818 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758d7bc895-6l69g" event={"ID":"6305987e-e2c7-4f75-b080-9bed005f003f","Type":"ContainerDied","Data":"0859d0b5fabd8d35fbaa8ec14e1db0448c3881ce8f9d8b410dd0de448ef9d6f1"}
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.851886 4878 scope.go:117] "RemoveContainer" containerID="0932c8c715307b869e16674e1569432e6dedd6e9a2f8ed169b4f99ff68b55052"
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.851906 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758d7bc895-6l69g"
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.900049 4878 scope.go:117] "RemoveContainer" containerID="eb5e420f5ed70b768be40c7798b2dc4ad33736bacd88cc8069b44b2ee10f35cd"
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.911286 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-758d7bc895-6l69g"]
Dec 02 18:41:11 crc kubenswrapper[4878]: I1202 18:41:11.924291 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-758d7bc895-6l69g"]
Dec 02 18:41:12 crc kubenswrapper[4878]: I1202 18:41:12.863825 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1c8644c-f198-4162-9d8c-45104c4ab84e","Type":"ContainerStarted","Data":"a8a5801bbe92f5b112d99995fa026ee15743a7ee2b42de8f736e4a371b40e7fb"}
Dec 02 18:41:12 crc kubenswrapper[4878]: I1202 18:41:12.951854 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6305987e-e2c7-4f75-b080-9bed005f003f" path="/var/lib/kubelet/pods/6305987e-e2c7-4f75-b080-9bed005f003f/volumes"
Dec 02 18:41:13 crc kubenswrapper[4878]: I1202 18:41:13.517616 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6697v"
Dec 02 18:41:13 crc kubenswrapper[4878]: I1202 18:41:13.593845 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78c0caa-65ba-4a70-a14d-067faf81a1fa-combined-ca-bundle\") pod \"f78c0caa-65ba-4a70-a14d-067faf81a1fa\" (UID: \"f78c0caa-65ba-4a70-a14d-067faf81a1fa\") "
Dec 02 18:41:13 crc kubenswrapper[4878]: I1202 18:41:13.594046 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78c0caa-65ba-4a70-a14d-067faf81a1fa-config-data\") pod \"f78c0caa-65ba-4a70-a14d-067faf81a1fa\" (UID: \"f78c0caa-65ba-4a70-a14d-067faf81a1fa\") "
Dec 02 18:41:13 crc kubenswrapper[4878]: I1202 18:41:13.594283 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l4sz\" (UniqueName: \"kubernetes.io/projected/f78c0caa-65ba-4a70-a14d-067faf81a1fa-kube-api-access-4l4sz\") pod \"f78c0caa-65ba-4a70-a14d-067faf81a1fa\" (UID: \"f78c0caa-65ba-4a70-a14d-067faf81a1fa\") "
Dec 02 18:41:13 crc kubenswrapper[4878]: I1202 18:41:13.594340 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f78c0caa-65ba-4a70-a14d-067faf81a1fa-scripts\") pod \"f78c0caa-65ba-4a70-a14d-067faf81a1fa\" (UID: \"f78c0caa-65ba-4a70-a14d-067faf81a1fa\") "
Dec 02 18:41:13 crc kubenswrapper[4878]: I1202 18:41:13.600174 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f78c0caa-65ba-4a70-a14d-067faf81a1fa-kube-api-access-4l4sz" (OuterVolumeSpecName: "kube-api-access-4l4sz") pod "f78c0caa-65ba-4a70-a14d-067faf81a1fa" (UID: "f78c0caa-65ba-4a70-a14d-067faf81a1fa"). InnerVolumeSpecName "kube-api-access-4l4sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:41:13 crc kubenswrapper[4878]: I1202 18:41:13.600213 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78c0caa-65ba-4a70-a14d-067faf81a1fa-scripts" (OuterVolumeSpecName: "scripts") pod "f78c0caa-65ba-4a70-a14d-067faf81a1fa" (UID: "f78c0caa-65ba-4a70-a14d-067faf81a1fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:41:13 crc kubenswrapper[4878]: I1202 18:41:13.638478 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78c0caa-65ba-4a70-a14d-067faf81a1fa-config-data" (OuterVolumeSpecName: "config-data") pod "f78c0caa-65ba-4a70-a14d-067faf81a1fa" (UID: "f78c0caa-65ba-4a70-a14d-067faf81a1fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:41:13 crc kubenswrapper[4878]: I1202 18:41:13.671289 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78c0caa-65ba-4a70-a14d-067faf81a1fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f78c0caa-65ba-4a70-a14d-067faf81a1fa" (UID: "f78c0caa-65ba-4a70-a14d-067faf81a1fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:41:13 crc kubenswrapper[4878]: I1202 18:41:13.698828 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78c0caa-65ba-4a70-a14d-067faf81a1fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 18:41:13 crc kubenswrapper[4878]: I1202 18:41:13.698870 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78c0caa-65ba-4a70-a14d-067faf81a1fa-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 18:41:13 crc kubenswrapper[4878]: I1202 18:41:13.698883 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l4sz\" (UniqueName: \"kubernetes.io/projected/f78c0caa-65ba-4a70-a14d-067faf81a1fa-kube-api-access-4l4sz\") on node \"crc\" DevicePath \"\""
Dec 02 18:41:13 crc kubenswrapper[4878]: I1202 18:41:13.698899 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f78c0caa-65ba-4a70-a14d-067faf81a1fa-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 18:41:13 crc kubenswrapper[4878]: I1202 18:41:13.880339 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6697v"
Dec 02 18:41:13 crc kubenswrapper[4878]: I1202 18:41:13.882393 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6697v" event={"ID":"f78c0caa-65ba-4a70-a14d-067faf81a1fa","Type":"ContainerDied","Data":"328045402f7801afc9bb1c05b528d897c0f70da4dc671375ea7e2674f95e29d3"}
Dec 02 18:41:13 crc kubenswrapper[4878]: I1202 18:41:13.882445 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="328045402f7801afc9bb1c05b528d897c0f70da4dc671375ea7e2674f95e29d3"
Dec 02 18:41:13 crc kubenswrapper[4878]: I1202 18:41:13.887263 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1c8644c-f198-4162-9d8c-45104c4ab84e","Type":"ContainerStarted","Data":"1ad976fc2bc596411fc3bcabc782d489cd58255538025830e853c1bd52c4a5de"}
Dec 02 18:41:13 crc kubenswrapper[4878]: I1202 18:41:13.887648 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 02 18:41:13 crc kubenswrapper[4878]: I1202 18:41:13.920709 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.795610449 podStartE2EDuration="4.920688374s" podCreationTimestamp="2025-12-02 18:41:09 +0000 UTC" firstStartedPulling="2025-12-02 18:41:10.210019387 +0000 UTC m=+1579.899638278" lastFinishedPulling="2025-12-02 18:41:13.335097322 +0000 UTC m=+1583.024716203" observedRunningTime="2025-12-02 18:41:13.91467258 +0000 UTC m=+1583.604291471" watchObservedRunningTime="2025-12-02 18:41:13.920688374 +0000 UTC m=+1583.610307255"
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.125417 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.125779 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="872e7309-ca28-4a96-98a8-359de3dfc613" containerName="nova-scheduler-scheduler" containerID="cri-o://3840ddfd428531d1f9df23ebf12a1d9f43f60bbd95fd20f514a42bbac1ea200f" gracePeriod=30
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.143471 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.143854 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2de87751-5aa7-4181-8593-9f7f3cfba1a0" containerName="nova-api-log" containerID="cri-o://3c11d811c32b850d37b68e8055cfafbfc36110f5ce2fae34499e1d3155c683bc" gracePeriod=30
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.143977 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2de87751-5aa7-4181-8593-9f7f3cfba1a0" containerName="nova-api-api" containerID="cri-o://03298f8b12bb5648fcde9bb2f96b596353a2f1809efb99b9378b376a64965193" gracePeriod=30
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.169570 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.169891 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fbbde98c-d7a1-405c-ab24-7e2d8fb5effd" containerName="nova-metadata-log" containerID="cri-o://14f7cece6b5697d747ca8cf15ee1cb65a6386b9d78e0beecd6e48a6f7138bb4c" gracePeriod=30
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.170049 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fbbde98c-d7a1-405c-ab24-7e2d8fb5effd" containerName="nova-metadata-metadata" containerID="cri-o://15ec9813594ccb589e47e88428083dd7f55bc204c8b67348edf9736bd1a00151" gracePeriod=30
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.751187 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.905657 4878 generic.go:334] "Generic (PLEG): container finished" podID="fbbde98c-d7a1-405c-ab24-7e2d8fb5effd" containerID="14f7cece6b5697d747ca8cf15ee1cb65a6386b9d78e0beecd6e48a6f7138bb4c" exitCode=143
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.905763 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd","Type":"ContainerDied","Data":"14f7cece6b5697d747ca8cf15ee1cb65a6386b9d78e0beecd6e48a6f7138bb4c"}
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.908800 4878 generic.go:334] "Generic (PLEG): container finished" podID="2de87751-5aa7-4181-8593-9f7f3cfba1a0" containerID="03298f8b12bb5648fcde9bb2f96b596353a2f1809efb99b9378b376a64965193" exitCode=0
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.908842 4878 generic.go:334] "Generic (PLEG): container finished" podID="2de87751-5aa7-4181-8593-9f7f3cfba1a0" containerID="3c11d811c32b850d37b68e8055cfafbfc36110f5ce2fae34499e1d3155c683bc" exitCode=143
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.908870 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2de87751-5aa7-4181-8593-9f7f3cfba1a0","Type":"ContainerDied","Data":"03298f8b12bb5648fcde9bb2f96b596353a2f1809efb99b9378b376a64965193"}
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.908899 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.908933 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2de87751-5aa7-4181-8593-9f7f3cfba1a0","Type":"ContainerDied","Data":"3c11d811c32b850d37b68e8055cfafbfc36110f5ce2fae34499e1d3155c683bc"}
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.908945 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2de87751-5aa7-4181-8593-9f7f3cfba1a0","Type":"ContainerDied","Data":"207e99b576f2a1d6460d06eae16484b342e30c73de635735146050d1e11db37c"}
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.908963 4878 scope.go:117] "RemoveContainer" containerID="03298f8b12bb5648fcde9bb2f96b596353a2f1809efb99b9378b376a64965193"
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.937861 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2de87751-5aa7-4181-8593-9f7f3cfba1a0-logs\") pod \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") "
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.938031 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-combined-ca-bundle\") pod \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") "
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.938120 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-public-tls-certs\") pod \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") "
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.938377 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2djkp\" (UniqueName: \"kubernetes.io/projected/2de87751-5aa7-4181-8593-9f7f3cfba1a0-kube-api-access-2djkp\") pod \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") "
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.938705 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-internal-tls-certs\") pod \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") "
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.939231 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-config-data\") pod \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\" (UID: \"2de87751-5aa7-4181-8593-9f7f3cfba1a0\") "
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.941968 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2de87751-5aa7-4181-8593-9f7f3cfba1a0-logs" (OuterVolumeSpecName: "logs") pod "2de87751-5aa7-4181-8593-9f7f3cfba1a0" (UID: "2de87751-5aa7-4181-8593-9f7f3cfba1a0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.946955 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de87751-5aa7-4181-8593-9f7f3cfba1a0-kube-api-access-2djkp" (OuterVolumeSpecName: "kube-api-access-2djkp") pod "2de87751-5aa7-4181-8593-9f7f3cfba1a0" (UID: "2de87751-5aa7-4181-8593-9f7f3cfba1a0"). InnerVolumeSpecName "kube-api-access-2djkp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.947360 4878 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2de87751-5aa7-4181-8593-9f7f3cfba1a0-logs\") on node \"crc\" DevicePath \"\""
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.947384 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2djkp\" (UniqueName: \"kubernetes.io/projected/2de87751-5aa7-4181-8593-9f7f3cfba1a0-kube-api-access-2djkp\") on node \"crc\" DevicePath \"\""
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.966513 4878 scope.go:117] "RemoveContainer" containerID="3c11d811c32b850d37b68e8055cfafbfc36110f5ce2fae34499e1d3155c683bc"
Dec 02 18:41:14 crc kubenswrapper[4878]: I1202 18:41:14.992408 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-config-data" (OuterVolumeSpecName: "config-data") pod "2de87751-5aa7-4181-8593-9f7f3cfba1a0" (UID: "2de87751-5aa7-4181-8593-9f7f3cfba1a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.006689 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2de87751-5aa7-4181-8593-9f7f3cfba1a0" (UID: "2de87751-5aa7-4181-8593-9f7f3cfba1a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.007703 4878 scope.go:117] "RemoveContainer" containerID="03298f8b12bb5648fcde9bb2f96b596353a2f1809efb99b9378b376a64965193"
Dec 02 18:41:15 crc kubenswrapper[4878]: E1202 18:41:15.008406 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03298f8b12bb5648fcde9bb2f96b596353a2f1809efb99b9378b376a64965193\": container with ID starting with 03298f8b12bb5648fcde9bb2f96b596353a2f1809efb99b9378b376a64965193 not found: ID does not exist" containerID="03298f8b12bb5648fcde9bb2f96b596353a2f1809efb99b9378b376a64965193"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.008801 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03298f8b12bb5648fcde9bb2f96b596353a2f1809efb99b9378b376a64965193"} err="failed to get container status \"03298f8b12bb5648fcde9bb2f96b596353a2f1809efb99b9378b376a64965193\": rpc error: code = NotFound desc = could not find container \"03298f8b12bb5648fcde9bb2f96b596353a2f1809efb99b9378b376a64965193\": container with ID starting with 03298f8b12bb5648fcde9bb2f96b596353a2f1809efb99b9378b376a64965193 not found: ID does not exist"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.008849 4878 scope.go:117] "RemoveContainer" containerID="3c11d811c32b850d37b68e8055cfafbfc36110f5ce2fae34499e1d3155c683bc"
Dec 02 18:41:15 crc kubenswrapper[4878]: E1202 18:41:15.009225 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c11d811c32b850d37b68e8055cfafbfc36110f5ce2fae34499e1d3155c683bc\": container with ID starting with 3c11d811c32b850d37b68e8055cfafbfc36110f5ce2fae34499e1d3155c683bc not found: ID does not exist" containerID="3c11d811c32b850d37b68e8055cfafbfc36110f5ce2fae34499e1d3155c683bc"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.009276 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c11d811c32b850d37b68e8055cfafbfc36110f5ce2fae34499e1d3155c683bc"} err="failed to get container status \"3c11d811c32b850d37b68e8055cfafbfc36110f5ce2fae34499e1d3155c683bc\": rpc error: code = NotFound desc = could not find container \"3c11d811c32b850d37b68e8055cfafbfc36110f5ce2fae34499e1d3155c683bc\": container with ID starting with 3c11d811c32b850d37b68e8055cfafbfc36110f5ce2fae34499e1d3155c683bc not found: ID does not exist"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.009304 4878 scope.go:117] "RemoveContainer" containerID="03298f8b12bb5648fcde9bb2f96b596353a2f1809efb99b9378b376a64965193"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.010080 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03298f8b12bb5648fcde9bb2f96b596353a2f1809efb99b9378b376a64965193"} err="failed to get container status \"03298f8b12bb5648fcde9bb2f96b596353a2f1809efb99b9378b376a64965193\": rpc error: code = NotFound desc = could not find container \"03298f8b12bb5648fcde9bb2f96b596353a2f1809efb99b9378b376a64965193\": container with ID starting with 03298f8b12bb5648fcde9bb2f96b596353a2f1809efb99b9378b376a64965193 not found: ID does not exist"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.010112 4878 scope.go:117] "RemoveContainer" containerID="3c11d811c32b850d37b68e8055cfafbfc36110f5ce2fae34499e1d3155c683bc"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.010422 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c11d811c32b850d37b68e8055cfafbfc36110f5ce2fae34499e1d3155c683bc"} err="failed to get container status \"3c11d811c32b850d37b68e8055cfafbfc36110f5ce2fae34499e1d3155c683bc\": rpc error: code = NotFound desc = could not find container \"3c11d811c32b850d37b68e8055cfafbfc36110f5ce2fae34499e1d3155c683bc\": container with ID starting with 3c11d811c32b850d37b68e8055cfafbfc36110f5ce2fae34499e1d3155c683bc not found: ID does not exist"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.028828 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2de87751-5aa7-4181-8593-9f7f3cfba1a0" (UID: "2de87751-5aa7-4181-8593-9f7f3cfba1a0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.029433 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2de87751-5aa7-4181-8593-9f7f3cfba1a0" (UID: "2de87751-5aa7-4181-8593-9f7f3cfba1a0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.049860 4878 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.050097 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.050206 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.050345 4878 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de87751-5aa7-4181-8593-9f7f3cfba1a0-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.348305 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.360831 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.374370 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 02 18:41:15 crc kubenswrapper[4878]: E1202 18:41:15.375050 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6305987e-e2c7-4f75-b080-9bed005f003f" containerName="dnsmasq-dns"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.375070 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="6305987e-e2c7-4f75-b080-9bed005f003f" containerName="dnsmasq-dns"
Dec 02 18:41:15 crc kubenswrapper[4878]: E1202 18:41:15.375103 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c0caa-65ba-4a70-a14d-067faf81a1fa" containerName="nova-manage"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.375112 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c0caa-65ba-4a70-a14d-067faf81a1fa" containerName="nova-manage"
Dec 02 18:41:15 crc kubenswrapper[4878]: E1202 18:41:15.375122 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de87751-5aa7-4181-8593-9f7f3cfba1a0" containerName="nova-api-log"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.375129 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de87751-5aa7-4181-8593-9f7f3cfba1a0" containerName="nova-api-log"
Dec 02 18:41:15 crc kubenswrapper[4878]: E1202 18:41:15.375142 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6305987e-e2c7-4f75-b080-9bed005f003f" containerName="init"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.375149 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="6305987e-e2c7-4f75-b080-9bed005f003f" containerName="init"
Dec 02 18:41:15 crc kubenswrapper[4878]: E1202 18:41:15.375161 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de87751-5aa7-4181-8593-9f7f3cfba1a0" containerName="nova-api-api"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.375167 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de87751-5aa7-4181-8593-9f7f3cfba1a0" containerName="nova-api-api"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.375392 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de87751-5aa7-4181-8593-9f7f3cfba1a0" containerName="nova-api-log"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.375415 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c0caa-65ba-4a70-a14d-067faf81a1fa" containerName="nova-manage"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.375432 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="6305987e-e2c7-4f75-b080-9bed005f003f" containerName="dnsmasq-dns"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.375462 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de87751-5aa7-4181-8593-9f7f3cfba1a0" containerName="nova-api-api"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.377054 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.379476 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.383385 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.383748 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.388676 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.459657 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cfd0dfa-637d-432b-946d-753c5afa72dd-logs\") pod \"nova-api-0\" (UID: \"4cfd0dfa-637d-432b-946d-753c5afa72dd\") " pod="openstack/nova-api-0"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.459945 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cfd0dfa-637d-432b-946d-753c5afa72dd-config-data\") pod \"nova-api-0\" (UID: \"4cfd0dfa-637d-432b-946d-753c5afa72dd\") " pod="openstack/nova-api-0"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.460093 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cfd0dfa-637d-432b-946d-753c5afa72dd-public-tls-certs\") pod \"nova-api-0\" (UID: \"4cfd0dfa-637d-432b-946d-753c5afa72dd\") " pod="openstack/nova-api-0"
Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.460170 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName:
\"kubernetes.io/secret/4cfd0dfa-637d-432b-946d-753c5afa72dd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4cfd0dfa-637d-432b-946d-753c5afa72dd\") " pod="openstack/nova-api-0" Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.460293 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk5x4\" (UniqueName: \"kubernetes.io/projected/4cfd0dfa-637d-432b-946d-753c5afa72dd-kube-api-access-rk5x4\") pod \"nova-api-0\" (UID: \"4cfd0dfa-637d-432b-946d-753c5afa72dd\") " pod="openstack/nova-api-0" Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.460429 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cfd0dfa-637d-432b-946d-753c5afa72dd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4cfd0dfa-637d-432b-946d-753c5afa72dd\") " pod="openstack/nova-api-0" Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.561824 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cfd0dfa-637d-432b-946d-753c5afa72dd-public-tls-certs\") pod \"nova-api-0\" (UID: \"4cfd0dfa-637d-432b-946d-753c5afa72dd\") " pod="openstack/nova-api-0" Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.561902 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cfd0dfa-637d-432b-946d-753c5afa72dd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4cfd0dfa-637d-432b-946d-753c5afa72dd\") " pod="openstack/nova-api-0" Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.561943 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk5x4\" (UniqueName: \"kubernetes.io/projected/4cfd0dfa-637d-432b-946d-753c5afa72dd-kube-api-access-rk5x4\") pod \"nova-api-0\" (UID: \"4cfd0dfa-637d-432b-946d-753c5afa72dd\") " 
pod="openstack/nova-api-0" Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.562039 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cfd0dfa-637d-432b-946d-753c5afa72dd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4cfd0dfa-637d-432b-946d-753c5afa72dd\") " pod="openstack/nova-api-0" Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.562144 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cfd0dfa-637d-432b-946d-753c5afa72dd-logs\") pod \"nova-api-0\" (UID: \"4cfd0dfa-637d-432b-946d-753c5afa72dd\") " pod="openstack/nova-api-0" Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.562177 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cfd0dfa-637d-432b-946d-753c5afa72dd-config-data\") pod \"nova-api-0\" (UID: \"4cfd0dfa-637d-432b-946d-753c5afa72dd\") " pod="openstack/nova-api-0" Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.563202 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cfd0dfa-637d-432b-946d-753c5afa72dd-logs\") pod \"nova-api-0\" (UID: \"4cfd0dfa-637d-432b-946d-753c5afa72dd\") " pod="openstack/nova-api-0" Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.567041 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cfd0dfa-637d-432b-946d-753c5afa72dd-public-tls-certs\") pod \"nova-api-0\" (UID: \"4cfd0dfa-637d-432b-946d-753c5afa72dd\") " pod="openstack/nova-api-0" Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.567560 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cfd0dfa-637d-432b-946d-753c5afa72dd-config-data\") pod \"nova-api-0\" (UID: 
\"4cfd0dfa-637d-432b-946d-753c5afa72dd\") " pod="openstack/nova-api-0" Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.568014 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cfd0dfa-637d-432b-946d-753c5afa72dd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4cfd0dfa-637d-432b-946d-753c5afa72dd\") " pod="openstack/nova-api-0" Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.568501 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cfd0dfa-637d-432b-946d-753c5afa72dd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4cfd0dfa-637d-432b-946d-753c5afa72dd\") " pod="openstack/nova-api-0" Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.579745 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk5x4\" (UniqueName: \"kubernetes.io/projected/4cfd0dfa-637d-432b-946d-753c5afa72dd-kube-api-access-rk5x4\") pod \"nova-api-0\" (UID: \"4cfd0dfa-637d-432b-946d-753c5afa72dd\") " pod="openstack/nova-api-0" Dec 02 18:41:15 crc kubenswrapper[4878]: I1202 18:41:15.704021 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 18:41:16 crc kubenswrapper[4878]: I1202 18:41:16.244864 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 18:41:16 crc kubenswrapper[4878]: W1202 18:41:16.248473 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cfd0dfa_637d_432b_946d_753c5afa72dd.slice/crio-b15a94b1b5cce582189b1e00dfabc70fa08ed8fe7edac0cf599b1c94a2b4ac39 WatchSource:0}: Error finding container b15a94b1b5cce582189b1e00dfabc70fa08ed8fe7edac0cf599b1c94a2b4ac39: Status 404 returned error can't find the container with id b15a94b1b5cce582189b1e00dfabc70fa08ed8fe7edac0cf599b1c94a2b4ac39 Dec 02 18:41:16 crc kubenswrapper[4878]: E1202 18:41:16.490361 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3840ddfd428531d1f9df23ebf12a1d9f43f60bbd95fd20f514a42bbac1ea200f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 18:41:16 crc kubenswrapper[4878]: E1202 18:41:16.496463 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3840ddfd428531d1f9df23ebf12a1d9f43f60bbd95fd20f514a42bbac1ea200f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 18:41:16 crc kubenswrapper[4878]: E1202 18:41:16.504766 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3840ddfd428531d1f9df23ebf12a1d9f43f60bbd95fd20f514a42bbac1ea200f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 18:41:16 crc kubenswrapper[4878]: E1202 18:41:16.505254 4878 prober.go:104] 
"Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="872e7309-ca28-4a96-98a8-359de3dfc613" containerName="nova-scheduler-scheduler" Dec 02 18:41:16 crc kubenswrapper[4878]: I1202 18:41:16.976978 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de87751-5aa7-4181-8593-9f7f3cfba1a0" path="/var/lib/kubelet/pods/2de87751-5aa7-4181-8593-9f7f3cfba1a0/volumes" Dec 02 18:41:16 crc kubenswrapper[4878]: I1202 18:41:16.978386 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4cfd0dfa-637d-432b-946d-753c5afa72dd","Type":"ContainerStarted","Data":"11d1409fb0e143c658a79f9b00e596a170794aac14fce42c9c292980b6445948"} Dec 02 18:41:16 crc kubenswrapper[4878]: I1202 18:41:16.978427 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4cfd0dfa-637d-432b-946d-753c5afa72dd","Type":"ContainerStarted","Data":"2200d6e2c329155a01150e6c863e45b8f7011539f31e0f23d33b5c69fc8963ff"} Dec 02 18:41:16 crc kubenswrapper[4878]: I1202 18:41:16.978438 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4cfd0dfa-637d-432b-946d-753c5afa72dd","Type":"ContainerStarted","Data":"b15a94b1b5cce582189b1e00dfabc70fa08ed8fe7edac0cf599b1c94a2b4ac39"} Dec 02 18:41:16 crc kubenswrapper[4878]: I1202 18:41:16.982998 4878 generic.go:334] "Generic (PLEG): container finished" podID="872e7309-ca28-4a96-98a8-359de3dfc613" containerID="3840ddfd428531d1f9df23ebf12a1d9f43f60bbd95fd20f514a42bbac1ea200f" exitCode=0 Dec 02 18:41:16 crc kubenswrapper[4878]: I1202 18:41:16.983057 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"872e7309-ca28-4a96-98a8-359de3dfc613","Type":"ContainerDied","Data":"3840ddfd428531d1f9df23ebf12a1d9f43f60bbd95fd20f514a42bbac1ea200f"} Dec 
02 18:41:16 crc kubenswrapper[4878]: I1202 18:41:16.987025 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 18:41:17 crc kubenswrapper[4878]: I1202 18:41:17.021692 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.021668132 podStartE2EDuration="2.021668132s" podCreationTimestamp="2025-12-02 18:41:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:41:16.995966037 +0000 UTC m=+1586.685585068" watchObservedRunningTime="2025-12-02 18:41:17.021668132 +0000 UTC m=+1586.711287013" Dec 02 18:41:17 crc kubenswrapper[4878]: I1202 18:41:17.109098 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872e7309-ca28-4a96-98a8-359de3dfc613-combined-ca-bundle\") pod \"872e7309-ca28-4a96-98a8-359de3dfc613\" (UID: \"872e7309-ca28-4a96-98a8-359de3dfc613\") " Dec 02 18:41:17 crc kubenswrapper[4878]: I1202 18:41:17.109417 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g96tp\" (UniqueName: \"kubernetes.io/projected/872e7309-ca28-4a96-98a8-359de3dfc613-kube-api-access-g96tp\") pod \"872e7309-ca28-4a96-98a8-359de3dfc613\" (UID: \"872e7309-ca28-4a96-98a8-359de3dfc613\") " Dec 02 18:41:17 crc kubenswrapper[4878]: I1202 18:41:17.109481 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872e7309-ca28-4a96-98a8-359de3dfc613-config-data\") pod \"872e7309-ca28-4a96-98a8-359de3dfc613\" (UID: \"872e7309-ca28-4a96-98a8-359de3dfc613\") " Dec 02 18:41:17 crc kubenswrapper[4878]: I1202 18:41:17.117669 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/872e7309-ca28-4a96-98a8-359de3dfc613-kube-api-access-g96tp" (OuterVolumeSpecName: "kube-api-access-g96tp") pod "872e7309-ca28-4a96-98a8-359de3dfc613" (UID: "872e7309-ca28-4a96-98a8-359de3dfc613"). InnerVolumeSpecName "kube-api-access-g96tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:41:17 crc kubenswrapper[4878]: I1202 18:41:17.141933 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/872e7309-ca28-4a96-98a8-359de3dfc613-config-data" (OuterVolumeSpecName: "config-data") pod "872e7309-ca28-4a96-98a8-359de3dfc613" (UID: "872e7309-ca28-4a96-98a8-359de3dfc613"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:41:17 crc kubenswrapper[4878]: I1202 18:41:17.156725 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/872e7309-ca28-4a96-98a8-359de3dfc613-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "872e7309-ca28-4a96-98a8-359de3dfc613" (UID: "872e7309-ca28-4a96-98a8-359de3dfc613"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:41:17 crc kubenswrapper[4878]: I1202 18:41:17.213293 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872e7309-ca28-4a96-98a8-359de3dfc613-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:17 crc kubenswrapper[4878]: I1202 18:41:17.213330 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g96tp\" (UniqueName: \"kubernetes.io/projected/872e7309-ca28-4a96-98a8-359de3dfc613-kube-api-access-g96tp\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:17 crc kubenswrapper[4878]: I1202 18:41:17.213342 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872e7309-ca28-4a96-98a8-359de3dfc613-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:17 crc kubenswrapper[4878]: I1202 18:41:17.602089 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="fbbde98c-d7a1-405c-ab24-7e2d8fb5effd" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.249:8775/\": dial tcp 10.217.0.249:8775: connect: connection refused" Dec 02 18:41:17 crc kubenswrapper[4878]: I1202 18:41:17.602154 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="fbbde98c-d7a1-405c-ab24-7e2d8fb5effd" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.249:8775/\": dial tcp 10.217.0.249:8775: connect: connection refused" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.001344 4878 generic.go:334] "Generic (PLEG): container finished" podID="fbbde98c-d7a1-405c-ab24-7e2d8fb5effd" containerID="15ec9813594ccb589e47e88428083dd7f55bc204c8b67348edf9736bd1a00151" exitCode=0 Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.001446 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd","Type":"ContainerDied","Data":"15ec9813594ccb589e47e88428083dd7f55bc204c8b67348edf9736bd1a00151"} Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.001499 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd","Type":"ContainerDied","Data":"100243abe35d0ceef9a949b1c8906f9c814323ad7a1b6679d3f6b9d1717b8d54"} Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.001516 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="100243abe35d0ceef9a949b1c8906f9c814323ad7a1b6679d3f6b9d1717b8d54" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.005514 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.005761 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"872e7309-ca28-4a96-98a8-359de3dfc613","Type":"ContainerDied","Data":"f87455a9f32a99cbc4920b5e41f7e83b4793e7f8c2bb0763af732f5d46908c78"} Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.005893 4878 scope.go:117] "RemoveContainer" containerID="3840ddfd428531d1f9df23ebf12a1d9f43f60bbd95fd20f514a42bbac1ea200f" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.118976 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.166130 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.213056 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.234217 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 18:41:18 crc kubenswrapper[4878]: E1202 18:41:18.234964 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbbde98c-d7a1-405c-ab24-7e2d8fb5effd" containerName="nova-metadata-metadata" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.234991 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbbde98c-d7a1-405c-ab24-7e2d8fb5effd" containerName="nova-metadata-metadata" Dec 02 18:41:18 crc kubenswrapper[4878]: E1202 18:41:18.235052 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="872e7309-ca28-4a96-98a8-359de3dfc613" containerName="nova-scheduler-scheduler" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.235059 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="872e7309-ca28-4a96-98a8-359de3dfc613" containerName="nova-scheduler-scheduler" Dec 02 18:41:18 crc kubenswrapper[4878]: E1202 18:41:18.235068 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbbde98c-d7a1-405c-ab24-7e2d8fb5effd" containerName="nova-metadata-log" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.235073 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbbde98c-d7a1-405c-ab24-7e2d8fb5effd" containerName="nova-metadata-log" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.235311 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbbde98c-d7a1-405c-ab24-7e2d8fb5effd" containerName="nova-metadata-log" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 
18:41:18.235334 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbbde98c-d7a1-405c-ab24-7e2d8fb5effd" containerName="nova-metadata-metadata" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.235348 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="872e7309-ca28-4a96-98a8-359de3dfc613" containerName="nova-scheduler-scheduler" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.236681 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.239268 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-config-data\") pod \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\" (UID: \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\") " Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.239396 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-nova-metadata-tls-certs\") pod \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\" (UID: \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\") " Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.239550 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-logs\") pod \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\" (UID: \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\") " Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.239669 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-combined-ca-bundle\") pod \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\" (UID: \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\") " Dec 02 18:41:18 crc kubenswrapper[4878]: 
I1202 18:41:18.239919 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mbh4\" (UniqueName: \"kubernetes.io/projected/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-kube-api-access-5mbh4\") pod \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\" (UID: \"fbbde98c-d7a1-405c-ab24-7e2d8fb5effd\") " Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.242181 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-logs" (OuterVolumeSpecName: "logs") pod "fbbde98c-d7a1-405c-ab24-7e2d8fb5effd" (UID: "fbbde98c-d7a1-405c-ab24-7e2d8fb5effd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.278549 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-kube-api-access-5mbh4" (OuterVolumeSpecName: "kube-api-access-5mbh4") pod "fbbde98c-d7a1-405c-ab24-7e2d8fb5effd" (UID: "fbbde98c-d7a1-405c-ab24-7e2d8fb5effd"). InnerVolumeSpecName "kube-api-access-5mbh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.278957 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.285663 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.289471 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-config-data" (OuterVolumeSpecName: "config-data") pod "fbbde98c-d7a1-405c-ab24-7e2d8fb5effd" (UID: "fbbde98c-d7a1-405c-ab24-7e2d8fb5effd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.303858 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbbde98c-d7a1-405c-ab24-7e2d8fb5effd" (UID: "fbbde98c-d7a1-405c-ab24-7e2d8fb5effd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.338748 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "fbbde98c-d7a1-405c-ab24-7e2d8fb5effd" (UID: "fbbde98c-d7a1-405c-ab24-7e2d8fb5effd"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.344244 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfc6l\" (UniqueName: \"kubernetes.io/projected/5dc0fa21-804e-42bf-a190-4e108c84df48-kube-api-access-jfc6l\") pod \"nova-scheduler-0\" (UID: \"5dc0fa21-804e-42bf-a190-4e108c84df48\") " pod="openstack/nova-scheduler-0" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.346115 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc0fa21-804e-42bf-a190-4e108c84df48-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5dc0fa21-804e-42bf-a190-4e108c84df48\") " pod="openstack/nova-scheduler-0" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.346581 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5dc0fa21-804e-42bf-a190-4e108c84df48-config-data\") pod \"nova-scheduler-0\" (UID: \"5dc0fa21-804e-42bf-a190-4e108c84df48\") " pod="openstack/nova-scheduler-0" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.346826 4878 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.346850 4878 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-logs\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.346864 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.346877 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mbh4\" (UniqueName: \"kubernetes.io/projected/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-kube-api-access-5mbh4\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.346886 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.449545 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc0fa21-804e-42bf-a190-4e108c84df48-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5dc0fa21-804e-42bf-a190-4e108c84df48\") " pod="openstack/nova-scheduler-0" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.449894 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc0fa21-804e-42bf-a190-4e108c84df48-config-data\") pod \"nova-scheduler-0\" (UID: \"5dc0fa21-804e-42bf-a190-4e108c84df48\") " pod="openstack/nova-scheduler-0" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.450037 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfc6l\" (UniqueName: \"kubernetes.io/projected/5dc0fa21-804e-42bf-a190-4e108c84df48-kube-api-access-jfc6l\") pod \"nova-scheduler-0\" (UID: \"5dc0fa21-804e-42bf-a190-4e108c84df48\") " pod="openstack/nova-scheduler-0" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.454551 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc0fa21-804e-42bf-a190-4e108c84df48-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5dc0fa21-804e-42bf-a190-4e108c84df48\") " pod="openstack/nova-scheduler-0" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.456626 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc0fa21-804e-42bf-a190-4e108c84df48-config-data\") pod \"nova-scheduler-0\" (UID: \"5dc0fa21-804e-42bf-a190-4e108c84df48\") " pod="openstack/nova-scheduler-0" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.474037 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfc6l\" (UniqueName: \"kubernetes.io/projected/5dc0fa21-804e-42bf-a190-4e108c84df48-kube-api-access-jfc6l\") pod \"nova-scheduler-0\" (UID: \"5dc0fa21-804e-42bf-a190-4e108c84df48\") " pod="openstack/nova-scheduler-0" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.751633 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 18:41:18 crc kubenswrapper[4878]: I1202 18:41:18.978644 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="872e7309-ca28-4a96-98a8-359de3dfc613" path="/var/lib/kubelet/pods/872e7309-ca28-4a96-98a8-359de3dfc613/volumes" Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.018683 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.053248 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.112440 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.133619 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.136663 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.139637 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.139687 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.191867 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.273907 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 18:41:19 crc kubenswrapper[4878]: W1202 18:41:19.279453 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dc0fa21_804e_42bf_a190_4e108c84df48.slice/crio-8388210fe234aaabcdd99aae5b6e4da0b0da79f6e4463bf54a8592b2d9a2b565 WatchSource:0}: Error finding container 8388210fe234aaabcdd99aae5b6e4da0b0da79f6e4463bf54a8592b2d9a2b565: Status 404 returned error can't find the container with id 8388210fe234aaabcdd99aae5b6e4da0b0da79f6e4463bf54a8592b2d9a2b565 Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.300576 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5217962f-8411-4be7-bbd0-93858938b746-logs\") pod \"nova-metadata-0\" (UID: \"5217962f-8411-4be7-bbd0-93858938b746\") " pod="openstack/nova-metadata-0" Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.301896 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5217962f-8411-4be7-bbd0-93858938b746-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5217962f-8411-4be7-bbd0-93858938b746\") " pod="openstack/nova-metadata-0" Dec 
02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.302058 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6pnd\" (UniqueName: \"kubernetes.io/projected/5217962f-8411-4be7-bbd0-93858938b746-kube-api-access-g6pnd\") pod \"nova-metadata-0\" (UID: \"5217962f-8411-4be7-bbd0-93858938b746\") " pod="openstack/nova-metadata-0" Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.302448 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5217962f-8411-4be7-bbd0-93858938b746-config-data\") pod \"nova-metadata-0\" (UID: \"5217962f-8411-4be7-bbd0-93858938b746\") " pod="openstack/nova-metadata-0" Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.302678 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5217962f-8411-4be7-bbd0-93858938b746-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5217962f-8411-4be7-bbd0-93858938b746\") " pod="openstack/nova-metadata-0" Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.405191 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5217962f-8411-4be7-bbd0-93858938b746-config-data\") pod \"nova-metadata-0\" (UID: \"5217962f-8411-4be7-bbd0-93858938b746\") " pod="openstack/nova-metadata-0" Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.405321 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5217962f-8411-4be7-bbd0-93858938b746-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5217962f-8411-4be7-bbd0-93858938b746\") " pod="openstack/nova-metadata-0" Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.405387 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5217962f-8411-4be7-bbd0-93858938b746-logs\") pod \"nova-metadata-0\" (UID: \"5217962f-8411-4be7-bbd0-93858938b746\") " pod="openstack/nova-metadata-0" Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.405479 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5217962f-8411-4be7-bbd0-93858938b746-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5217962f-8411-4be7-bbd0-93858938b746\") " pod="openstack/nova-metadata-0" Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.405521 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6pnd\" (UniqueName: \"kubernetes.io/projected/5217962f-8411-4be7-bbd0-93858938b746-kube-api-access-g6pnd\") pod \"nova-metadata-0\" (UID: \"5217962f-8411-4be7-bbd0-93858938b746\") " pod="openstack/nova-metadata-0" Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.406271 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5217962f-8411-4be7-bbd0-93858938b746-logs\") pod \"nova-metadata-0\" (UID: \"5217962f-8411-4be7-bbd0-93858938b746\") " pod="openstack/nova-metadata-0" Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.410527 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5217962f-8411-4be7-bbd0-93858938b746-config-data\") pod \"nova-metadata-0\" (UID: \"5217962f-8411-4be7-bbd0-93858938b746\") " pod="openstack/nova-metadata-0" Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.410532 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5217962f-8411-4be7-bbd0-93858938b746-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"5217962f-8411-4be7-bbd0-93858938b746\") " pod="openstack/nova-metadata-0" Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.410765 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5217962f-8411-4be7-bbd0-93858938b746-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5217962f-8411-4be7-bbd0-93858938b746\") " pod="openstack/nova-metadata-0" Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.424800 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6pnd\" (UniqueName: \"kubernetes.io/projected/5217962f-8411-4be7-bbd0-93858938b746-kube-api-access-g6pnd\") pod \"nova-metadata-0\" (UID: \"5217962f-8411-4be7-bbd0-93858938b746\") " pod="openstack/nova-metadata-0" Dec 02 18:41:19 crc kubenswrapper[4878]: I1202 18:41:19.490749 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 18:41:20 crc kubenswrapper[4878]: I1202 18:41:20.041755 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5dc0fa21-804e-42bf-a190-4e108c84df48","Type":"ContainerStarted","Data":"e09840ea7b9d8266a1267e3edfa4112b124963def92939596fe24c2c9924b2cd"} Dec 02 18:41:20 crc kubenswrapper[4878]: I1202 18:41:20.043288 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5dc0fa21-804e-42bf-a190-4e108c84df48","Type":"ContainerStarted","Data":"8388210fe234aaabcdd99aae5b6e4da0b0da79f6e4463bf54a8592b2d9a2b565"} Dec 02 18:41:20 crc kubenswrapper[4878]: I1202 18:41:20.063943 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 18:41:20 crc kubenswrapper[4878]: I1202 18:41:20.084039 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.084012285 podStartE2EDuration="2.084012285s" 
podCreationTimestamp="2025-12-02 18:41:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:41:20.060796843 +0000 UTC m=+1589.750415724" watchObservedRunningTime="2025-12-02 18:41:20.084012285 +0000 UTC m=+1589.773631186" Dec 02 18:41:20 crc kubenswrapper[4878]: I1202 18:41:20.964514 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbbde98c-d7a1-405c-ab24-7e2d8fb5effd" path="/var/lib/kubelet/pods/fbbde98c-d7a1-405c-ab24-7e2d8fb5effd/volumes" Dec 02 18:41:21 crc kubenswrapper[4878]: I1202 18:41:21.063469 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5217962f-8411-4be7-bbd0-93858938b746","Type":"ContainerStarted","Data":"724189a5d49d85525e2e4ae770c9386fdb21ae352f9a9a1c52dede9fd2d60310"} Dec 02 18:41:21 crc kubenswrapper[4878]: I1202 18:41:21.063546 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5217962f-8411-4be7-bbd0-93858938b746","Type":"ContainerStarted","Data":"232dfe4d9762bc9c95f37bc847e3200cb64283c389c7bc3e2f39455690bcbe54"} Dec 02 18:41:21 crc kubenswrapper[4878]: I1202 18:41:21.063566 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5217962f-8411-4be7-bbd0-93858938b746","Type":"ContainerStarted","Data":"8d0291be45351793ad3292397acabc72bb744b4e66f4794e069415514839833c"} Dec 02 18:41:23 crc kubenswrapper[4878]: I1202 18:41:23.753376 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 18:41:24 crc kubenswrapper[4878]: I1202 18:41:24.491016 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 18:41:24 crc kubenswrapper[4878]: I1202 18:41:24.491093 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 18:41:25 crc 
kubenswrapper[4878]: I1202 18:41:25.705261 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 18:41:25 crc kubenswrapper[4878]: I1202 18:41:25.705361 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 18:41:26 crc kubenswrapper[4878]: I1202 18:41:26.726421 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4cfd0dfa-637d-432b-946d-753c5afa72dd" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.4:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 18:41:26 crc kubenswrapper[4878]: I1202 18:41:26.726470 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4cfd0dfa-637d-432b-946d-753c5afa72dd" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.4:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 18:41:28 crc kubenswrapper[4878]: I1202 18:41:28.752699 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 18:41:28 crc kubenswrapper[4878]: I1202 18:41:28.811624 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 18:41:28 crc kubenswrapper[4878]: I1202 18:41:28.838543 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=9.838506831 podStartE2EDuration="9.838506831s" podCreationTimestamp="2025-12-02 18:41:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:41:21.087157644 +0000 UTC m=+1590.776776606" watchObservedRunningTime="2025-12-02 18:41:28.838506831 +0000 UTC m=+1598.528125742" Dec 02 18:41:29 crc kubenswrapper[4878]: I1202 18:41:29.238677 
4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 18:41:29 crc kubenswrapper[4878]: I1202 18:41:29.491211 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 18:41:29 crc kubenswrapper[4878]: I1202 18:41:29.491296 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 18:41:30 crc kubenswrapper[4878]: I1202 18:41:30.506458 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5217962f-8411-4be7-bbd0-93858938b746" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.6:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 18:41:30 crc kubenswrapper[4878]: I1202 18:41:30.506471 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5217962f-8411-4be7-bbd0-93858938b746" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.6:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 18:41:35 crc kubenswrapper[4878]: I1202 18:41:35.714734 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 18:41:35 crc kubenswrapper[4878]: I1202 18:41:35.715529 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 18:41:35 crc kubenswrapper[4878]: I1202 18:41:35.717013 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 18:41:35 crc kubenswrapper[4878]: I1202 18:41:35.717037 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 18:41:35 crc kubenswrapper[4878]: I1202 18:41:35.734276 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-0" Dec 02 18:41:35 crc kubenswrapper[4878]: I1202 18:41:35.746074 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 18:41:39 crc kubenswrapper[4878]: I1202 18:41:39.498812 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 18:41:39 crc kubenswrapper[4878]: I1202 18:41:39.507971 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 18:41:39 crc kubenswrapper[4878]: I1202 18:41:39.509566 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 18:41:39 crc kubenswrapper[4878]: I1202 18:41:39.652422 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 02 18:41:40 crc kubenswrapper[4878]: I1202 18:41:40.375380 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 18:41:50 crc kubenswrapper[4878]: I1202 18:41:50.765136 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-xp7h4"] Dec 02 18:41:50 crc kubenswrapper[4878]: I1202 18:41:50.779861 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-xp7h4"] Dec 02 18:41:50 crc kubenswrapper[4878]: I1202 18:41:50.879457 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-fchr4"] Dec 02 18:41:50 crc kubenswrapper[4878]: I1202 18:41:50.881661 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-fchr4" Dec 02 18:41:50 crc kubenswrapper[4878]: I1202 18:41:50.896582 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-fchr4"] Dec 02 18:41:50 crc kubenswrapper[4878]: I1202 18:41:50.958350 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faf8fbcd-b97a-45f4-8b17-c92fdc87d75b" path="/var/lib/kubelet/pods/faf8fbcd-b97a-45f4-8b17-c92fdc87d75b/volumes" Dec 02 18:41:51 crc kubenswrapper[4878]: I1202 18:41:51.050460 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8frb\" (UniqueName: \"kubernetes.io/projected/8de0b145-008a-4a41-aa97-cb01f30d946f-kube-api-access-x8frb\") pod \"heat-db-sync-fchr4\" (UID: \"8de0b145-008a-4a41-aa97-cb01f30d946f\") " pod="openstack/heat-db-sync-fchr4" Dec 02 18:41:51 crc kubenswrapper[4878]: I1202 18:41:51.050560 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de0b145-008a-4a41-aa97-cb01f30d946f-combined-ca-bundle\") pod \"heat-db-sync-fchr4\" (UID: \"8de0b145-008a-4a41-aa97-cb01f30d946f\") " pod="openstack/heat-db-sync-fchr4" Dec 02 18:41:51 crc kubenswrapper[4878]: I1202 18:41:51.051307 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de0b145-008a-4a41-aa97-cb01f30d946f-config-data\") pod \"heat-db-sync-fchr4\" (UID: \"8de0b145-008a-4a41-aa97-cb01f30d946f\") " pod="openstack/heat-db-sync-fchr4" Dec 02 18:41:51 crc kubenswrapper[4878]: I1202 18:41:51.154686 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8frb\" (UniqueName: \"kubernetes.io/projected/8de0b145-008a-4a41-aa97-cb01f30d946f-kube-api-access-x8frb\") pod \"heat-db-sync-fchr4\" (UID: \"8de0b145-008a-4a41-aa97-cb01f30d946f\") " 
pod="openstack/heat-db-sync-fchr4" Dec 02 18:41:51 crc kubenswrapper[4878]: I1202 18:41:51.154829 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de0b145-008a-4a41-aa97-cb01f30d946f-combined-ca-bundle\") pod \"heat-db-sync-fchr4\" (UID: \"8de0b145-008a-4a41-aa97-cb01f30d946f\") " pod="openstack/heat-db-sync-fchr4" Dec 02 18:41:51 crc kubenswrapper[4878]: I1202 18:41:51.154984 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de0b145-008a-4a41-aa97-cb01f30d946f-config-data\") pod \"heat-db-sync-fchr4\" (UID: \"8de0b145-008a-4a41-aa97-cb01f30d946f\") " pod="openstack/heat-db-sync-fchr4" Dec 02 18:41:51 crc kubenswrapper[4878]: I1202 18:41:51.164642 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de0b145-008a-4a41-aa97-cb01f30d946f-combined-ca-bundle\") pod \"heat-db-sync-fchr4\" (UID: \"8de0b145-008a-4a41-aa97-cb01f30d946f\") " pod="openstack/heat-db-sync-fchr4" Dec 02 18:41:51 crc kubenswrapper[4878]: I1202 18:41:51.166358 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de0b145-008a-4a41-aa97-cb01f30d946f-config-data\") pod \"heat-db-sync-fchr4\" (UID: \"8de0b145-008a-4a41-aa97-cb01f30d946f\") " pod="openstack/heat-db-sync-fchr4" Dec 02 18:41:51 crc kubenswrapper[4878]: I1202 18:41:51.173420 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8frb\" (UniqueName: \"kubernetes.io/projected/8de0b145-008a-4a41-aa97-cb01f30d946f-kube-api-access-x8frb\") pod \"heat-db-sync-fchr4\" (UID: \"8de0b145-008a-4a41-aa97-cb01f30d946f\") " pod="openstack/heat-db-sync-fchr4" Dec 02 18:41:51 crc kubenswrapper[4878]: I1202 18:41:51.209263 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-fchr4" Dec 02 18:41:51 crc kubenswrapper[4878]: I1202 18:41:51.766426 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-fchr4"] Dec 02 18:41:51 crc kubenswrapper[4878]: W1202 18:41:51.771341 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8de0b145_008a_4a41_aa97_cb01f30d946f.slice/crio-0b27fa049a1decf1196bd46034ee5e69989e5a96799d9b4039ccd28687807a38 WatchSource:0}: Error finding container 0b27fa049a1decf1196bd46034ee5e69989e5a96799d9b4039ccd28687807a38: Status 404 returned error can't find the container with id 0b27fa049a1decf1196bd46034ee5e69989e5a96799d9b4039ccd28687807a38 Dec 02 18:41:51 crc kubenswrapper[4878]: I1202 18:41:51.774132 4878 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 18:41:52 crc kubenswrapper[4878]: I1202 18:41:52.556297 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fchr4" event={"ID":"8de0b145-008a-4a41-aa97-cb01f30d946f","Type":"ContainerStarted","Data":"0b27fa049a1decf1196bd46034ee5e69989e5a96799d9b4039ccd28687807a38"} Dec 02 18:41:52 crc kubenswrapper[4878]: I1202 18:41:52.877343 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 18:41:53 crc kubenswrapper[4878]: I1202 18:41:53.743023 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:41:53 crc kubenswrapper[4878]: I1202 18:41:53.743578 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:41:53 crc kubenswrapper[4878]: I1202 18:41:53.893353 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 18:41:54 crc kubenswrapper[4878]: I1202 18:41:54.196809 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:41:54 crc kubenswrapper[4878]: I1202 18:41:54.197134 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1c8644c-f198-4162-9d8c-45104c4ab84e" containerName="ceilometer-central-agent" containerID="cri-o://b10aee55cb1ebb6a6d6c82477341c1f3266fbcb29b4fe83eaa5e2bd23d527617" gracePeriod=30 Dec 02 18:41:54 crc kubenswrapper[4878]: I1202 18:41:54.197383 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1c8644c-f198-4162-9d8c-45104c4ab84e" containerName="proxy-httpd" containerID="cri-o://1ad976fc2bc596411fc3bcabc782d489cd58255538025830e853c1bd52c4a5de" gracePeriod=30 Dec 02 18:41:54 crc kubenswrapper[4878]: I1202 18:41:54.197448 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1c8644c-f198-4162-9d8c-45104c4ab84e" containerName="sg-core" containerID="cri-o://a8a5801bbe92f5b112d99995fa026ee15743a7ee2b42de8f736e4a371b40e7fb" gracePeriod=30 Dec 02 18:41:54 crc kubenswrapper[4878]: I1202 18:41:54.197690 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e1c8644c-f198-4162-9d8c-45104c4ab84e" containerName="ceilometer-notification-agent" containerID="cri-o://781f7a301f7b90e4eea9fe07539b0aaa3c285bbebd9ba34ffdd8b7262ede5a49" gracePeriod=30 Dec 02 18:41:54 crc kubenswrapper[4878]: I1202 18:41:54.604194 4878 generic.go:334] "Generic (PLEG): container finished" 
podID="e1c8644c-f198-4162-9d8c-45104c4ab84e" containerID="1ad976fc2bc596411fc3bcabc782d489cd58255538025830e853c1bd52c4a5de" exitCode=0 Dec 02 18:41:54 crc kubenswrapper[4878]: I1202 18:41:54.604548 4878 generic.go:334] "Generic (PLEG): container finished" podID="e1c8644c-f198-4162-9d8c-45104c4ab84e" containerID="a8a5801bbe92f5b112d99995fa026ee15743a7ee2b42de8f736e4a371b40e7fb" exitCode=2 Dec 02 18:41:54 crc kubenswrapper[4878]: I1202 18:41:54.604565 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1c8644c-f198-4162-9d8c-45104c4ab84e","Type":"ContainerDied","Data":"1ad976fc2bc596411fc3bcabc782d489cd58255538025830e853c1bd52c4a5de"} Dec 02 18:41:54 crc kubenswrapper[4878]: I1202 18:41:54.604637 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1c8644c-f198-4162-9d8c-45104c4ab84e","Type":"ContainerDied","Data":"a8a5801bbe92f5b112d99995fa026ee15743a7ee2b42de8f736e4a371b40e7fb"} Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.657354 4878 generic.go:334] "Generic (PLEG): container finished" podID="e1c8644c-f198-4162-9d8c-45104c4ab84e" containerID="781f7a301f7b90e4eea9fe07539b0aaa3c285bbebd9ba34ffdd8b7262ede5a49" exitCode=0 Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.657733 4878 generic.go:334] "Generic (PLEG): container finished" podID="e1c8644c-f198-4162-9d8c-45104c4ab84e" containerID="b10aee55cb1ebb6a6d6c82477341c1f3266fbcb29b4fe83eaa5e2bd23d527617" exitCode=0 Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.657395 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1c8644c-f198-4162-9d8c-45104c4ab84e","Type":"ContainerDied","Data":"781f7a301f7b90e4eea9fe07539b0aaa3c285bbebd9ba34ffdd8b7262ede5a49"} Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.657781 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e1c8644c-f198-4162-9d8c-45104c4ab84e","Type":"ContainerDied","Data":"b10aee55cb1ebb6a6d6c82477341c1f3266fbcb29b4fe83eaa5e2bd23d527617"} Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.657795 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e1c8644c-f198-4162-9d8c-45104c4ab84e","Type":"ContainerDied","Data":"41aa30ef882c17b717c59f7a0ec786ac45f252ee60481640eaebbb3474d40051"} Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.657808 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41aa30ef882c17b717c59f7a0ec786ac45f252ee60481640eaebbb3474d40051" Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.663829 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.741677 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-sg-core-conf-yaml\") pod \"e1c8644c-f198-4162-9d8c-45104c4ab84e\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.741924 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-ceilometer-tls-certs\") pod \"e1c8644c-f198-4162-9d8c-45104c4ab84e\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.741983 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjcq7\" (UniqueName: \"kubernetes.io/projected/e1c8644c-f198-4162-9d8c-45104c4ab84e-kube-api-access-mjcq7\") pod \"e1c8644c-f198-4162-9d8c-45104c4ab84e\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.742103 4878 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1c8644c-f198-4162-9d8c-45104c4ab84e-log-httpd\") pod \"e1c8644c-f198-4162-9d8c-45104c4ab84e\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.742128 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-config-data\") pod \"e1c8644c-f198-4162-9d8c-45104c4ab84e\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.742170 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1c8644c-f198-4162-9d8c-45104c4ab84e-run-httpd\") pod \"e1c8644c-f198-4162-9d8c-45104c4ab84e\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.742374 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-combined-ca-bundle\") pod \"e1c8644c-f198-4162-9d8c-45104c4ab84e\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.742429 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-scripts\") pod \"e1c8644c-f198-4162-9d8c-45104c4ab84e\" (UID: \"e1c8644c-f198-4162-9d8c-45104c4ab84e\") " Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.750225 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1c8644c-f198-4162-9d8c-45104c4ab84e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e1c8644c-f198-4162-9d8c-45104c4ab84e" (UID: "e1c8644c-f198-4162-9d8c-45104c4ab84e"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.751751 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1c8644c-f198-4162-9d8c-45104c4ab84e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e1c8644c-f198-4162-9d8c-45104c4ab84e" (UID: "e1c8644c-f198-4162-9d8c-45104c4ab84e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.766537 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1c8644c-f198-4162-9d8c-45104c4ab84e-kube-api-access-mjcq7" (OuterVolumeSpecName: "kube-api-access-mjcq7") pod "e1c8644c-f198-4162-9d8c-45104c4ab84e" (UID: "e1c8644c-f198-4162-9d8c-45104c4ab84e"). InnerVolumeSpecName "kube-api-access-mjcq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.779735 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-scripts" (OuterVolumeSpecName: "scripts") pod "e1c8644c-f198-4162-9d8c-45104c4ab84e" (UID: "e1c8644c-f198-4162-9d8c-45104c4ab84e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.845144 4878 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1c8644c-f198-4162-9d8c-45104c4ab84e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.846591 4878 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e1c8644c-f198-4162-9d8c-45104c4ab84e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.846678 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.846734 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjcq7\" (UniqueName: \"kubernetes.io/projected/e1c8644c-f198-4162-9d8c-45104c4ab84e-kube-api-access-mjcq7\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.869645 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e1c8644c-f198-4162-9d8c-45104c4ab84e" (UID: "e1c8644c-f198-4162-9d8c-45104c4ab84e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.930577 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1c8644c-f198-4162-9d8c-45104c4ab84e" (UID: "e1c8644c-f198-4162-9d8c-45104c4ab84e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.956327 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.956607 4878 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:55 crc kubenswrapper[4878]: I1202 18:41:55.992452 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e1c8644c-f198-4162-9d8c-45104c4ab84e" (UID: "e1c8644c-f198-4162-9d8c-45104c4ab84e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.032078 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-config-data" (OuterVolumeSpecName: "config-data") pod "e1c8644c-f198-4162-9d8c-45104c4ab84e" (UID: "e1c8644c-f198-4162-9d8c-45104c4ab84e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.059837 4878 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.061253 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c8644c-f198-4162-9d8c-45104c4ab84e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.679009 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.723537 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.737034 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.760948 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:41:56 crc kubenswrapper[4878]: E1202 18:41:56.761564 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c8644c-f198-4162-9d8c-45104c4ab84e" containerName="ceilometer-central-agent" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.761589 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c8644c-f198-4162-9d8c-45104c4ab84e" containerName="ceilometer-central-agent" Dec 02 18:41:56 crc kubenswrapper[4878]: E1202 18:41:56.761612 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c8644c-f198-4162-9d8c-45104c4ab84e" containerName="ceilometer-notification-agent" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.761619 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c8644c-f198-4162-9d8c-45104c4ab84e" 
containerName="ceilometer-notification-agent" Dec 02 18:41:56 crc kubenswrapper[4878]: E1202 18:41:56.761633 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c8644c-f198-4162-9d8c-45104c4ab84e" containerName="sg-core" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.761639 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c8644c-f198-4162-9d8c-45104c4ab84e" containerName="sg-core" Dec 02 18:41:56 crc kubenswrapper[4878]: E1202 18:41:56.761660 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c8644c-f198-4162-9d8c-45104c4ab84e" containerName="proxy-httpd" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.761666 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c8644c-f198-4162-9d8c-45104c4ab84e" containerName="proxy-httpd" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.761923 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c8644c-f198-4162-9d8c-45104c4ab84e" containerName="sg-core" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.761941 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c8644c-f198-4162-9d8c-45104c4ab84e" containerName="proxy-httpd" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.761952 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c8644c-f198-4162-9d8c-45104c4ab84e" containerName="ceilometer-notification-agent" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.761967 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c8644c-f198-4162-9d8c-45104c4ab84e" containerName="ceilometer-central-agent" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.765321 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.768762 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.768971 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.769174 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.797247 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.883992 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/28b4921f-5e67-4490-83fc-eef206c05083-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.884591 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b4921f-5e67-4490-83fc-eef206c05083-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.884813 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgvd5\" (UniqueName: \"kubernetes.io/projected/28b4921f-5e67-4490-83fc-eef206c05083-kube-api-access-kgvd5\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.884975 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/28b4921f-5e67-4490-83fc-eef206c05083-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.885163 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28b4921f-5e67-4490-83fc-eef206c05083-run-httpd\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.885376 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28b4921f-5e67-4490-83fc-eef206c05083-log-httpd\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.885502 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28b4921f-5e67-4490-83fc-eef206c05083-config-data\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.885638 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28b4921f-5e67-4490-83fc-eef206c05083-scripts\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.969752 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1c8644c-f198-4162-9d8c-45104c4ab84e" path="/var/lib/kubelet/pods/e1c8644c-f198-4162-9d8c-45104c4ab84e/volumes" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 
18:41:56.988593 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28b4921f-5e67-4490-83fc-eef206c05083-log-httpd\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.988665 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28b4921f-5e67-4490-83fc-eef206c05083-config-data\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.988701 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28b4921f-5e67-4490-83fc-eef206c05083-scripts\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.988849 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/28b4921f-5e67-4490-83fc-eef206c05083-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.989026 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b4921f-5e67-4490-83fc-eef206c05083-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.989248 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgvd5\" (UniqueName: \"kubernetes.io/projected/28b4921f-5e67-4490-83fc-eef206c05083-kube-api-access-kgvd5\") pod 
\"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.989291 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28b4921f-5e67-4490-83fc-eef206c05083-log-httpd\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.989351 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/28b4921f-5e67-4490-83fc-eef206c05083-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.989432 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28b4921f-5e67-4490-83fc-eef206c05083-run-httpd\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.992737 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28b4921f-5e67-4490-83fc-eef206c05083-run-httpd\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.995983 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28b4921f-5e67-4490-83fc-eef206c05083-config-data\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.997094 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/28b4921f-5e67-4490-83fc-eef206c05083-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:56 crc kubenswrapper[4878]: I1202 18:41:56.997998 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b4921f-5e67-4490-83fc-eef206c05083-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:57 crc kubenswrapper[4878]: I1202 18:41:57.002032 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/28b4921f-5e67-4490-83fc-eef206c05083-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:57 crc kubenswrapper[4878]: I1202 18:41:57.002713 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28b4921f-5e67-4490-83fc-eef206c05083-scripts\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:57 crc kubenswrapper[4878]: I1202 18:41:57.014056 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgvd5\" (UniqueName: \"kubernetes.io/projected/28b4921f-5e67-4490-83fc-eef206c05083-kube-api-access-kgvd5\") pod \"ceilometer-0\" (UID: \"28b4921f-5e67-4490-83fc-eef206c05083\") " pod="openstack/ceilometer-0" Dec 02 18:41:57 crc kubenswrapper[4878]: I1202 18:41:57.091421 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 18:41:57 crc kubenswrapper[4878]: I1202 18:41:57.810458 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 18:41:58 crc kubenswrapper[4878]: I1202 18:41:58.761652 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="b8ce834c-073d-4062-b3ee-488fa79aae4f" containerName="rabbitmq" containerID="cri-o://ef6402595a754f87415db1eafef9e466691a9112edf2bc427e42ae1c310ef752" gracePeriod=604795 Dec 02 18:41:58 crc kubenswrapper[4878]: I1202 18:41:58.785620 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28b4921f-5e67-4490-83fc-eef206c05083","Type":"ContainerStarted","Data":"a634d6d18f0aafdfd5018676f85628a397539e23b77825f016b0e563367b76e3"} Dec 02 18:41:59 crc kubenswrapper[4878]: I1202 18:41:59.440060 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="168172d2-5cc8-492f-aa26-bd2a1351cdf2" containerName="rabbitmq" containerID="cri-o://b33c5557685167291d2cada300c01f364fcc4cc0cdd87196cad4fdf4c3468440" gracePeriod=604795 Dec 02 18:41:59 crc kubenswrapper[4878]: I1202 18:41:59.896761 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="168172d2-5cc8-492f-aa26-bd2a1351cdf2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.129:5671: connect: connection refused" Dec 02 18:42:00 crc kubenswrapper[4878]: I1202 18:42:00.368344 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="b8ce834c-073d-4062-b3ee-488fa79aae4f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.130:5671: connect: connection refused" Dec 02 18:42:05 crc kubenswrapper[4878]: I1202 18:42:05.994815 4878 generic.go:334] "Generic (PLEG): container finished" 
podID="168172d2-5cc8-492f-aa26-bd2a1351cdf2" containerID="b33c5557685167291d2cada300c01f364fcc4cc0cdd87196cad4fdf4c3468440" exitCode=0 Dec 02 18:42:05 crc kubenswrapper[4878]: I1202 18:42:05.994938 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"168172d2-5cc8-492f-aa26-bd2a1351cdf2","Type":"ContainerDied","Data":"b33c5557685167291d2cada300c01f364fcc4cc0cdd87196cad4fdf4c3468440"} Dec 02 18:42:06 crc kubenswrapper[4878]: I1202 18:42:06.148264 4878 generic.go:334] "Generic (PLEG): container finished" podID="b8ce834c-073d-4062-b3ee-488fa79aae4f" containerID="ef6402595a754f87415db1eafef9e466691a9112edf2bc427e42ae1c310ef752" exitCode=0 Dec 02 18:42:06 crc kubenswrapper[4878]: I1202 18:42:06.148289 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b8ce834c-073d-4062-b3ee-488fa79aae4f","Type":"ContainerDied","Data":"ef6402595a754f87415db1eafef9e466691a9112edf2bc427e42ae1c310ef752"} Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.262105 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78b8b6c8f9-k99xz"] Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.265850 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.269139 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.277947 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78b8b6c8f9-k99xz"] Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.448377 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-config\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") " pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.448452 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-openstack-edpm-ipam\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") " pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.448490 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-ovsdbserver-sb\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") " pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.448534 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-ovsdbserver-nb\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") " 
pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.448569 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-dns-swift-storage-0\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") " pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.448660 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-dns-svc\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") " pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.448768 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p49nx\" (UniqueName: \"kubernetes.io/projected/babb8f49-4a74-45a2-8d2f-af714429151c-kube-api-access-p49nx\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") " pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.550337 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-dns-svc\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") " pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.550482 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p49nx\" (UniqueName: \"kubernetes.io/projected/babb8f49-4a74-45a2-8d2f-af714429151c-kube-api-access-p49nx\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: 
\"babb8f49-4a74-45a2-8d2f-af714429151c\") " pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.550544 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-config\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") " pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.550570 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-openstack-edpm-ipam\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") " pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.550601 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-ovsdbserver-sb\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") " pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.550619 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-ovsdbserver-nb\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") " pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.550641 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-dns-swift-storage-0\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") " 
pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.551317 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-dns-swift-storage-0\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") " pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.551323 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-dns-svc\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") " pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.551966 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-openstack-edpm-ipam\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") " pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.552217 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-ovsdbserver-sb\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") " pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.552533 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-config\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") " pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 
18:42:08.553371 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-ovsdbserver-nb\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") " pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.581106 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p49nx\" (UniqueName: \"kubernetes.io/projected/babb8f49-4a74-45a2-8d2f-af714429151c-kube-api-access-p49nx\") pod \"dnsmasq-dns-78b8b6c8f9-k99xz\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") " pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:08 crc kubenswrapper[4878]: I1202 18:42:08.595463 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.381386 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.416969 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-tls\") pod \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.417045 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/168172d2-5cc8-492f-aa26-bd2a1351cdf2-pod-info\") pod \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.417074 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-confd\") pod \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.417140 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/168172d2-5cc8-492f-aa26-bd2a1351cdf2-server-conf\") pod \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.417214 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb6xb\" (UniqueName: \"kubernetes.io/projected/168172d2-5cc8-492f-aa26-bd2a1351cdf2-kube-api-access-zb6xb\") pod \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.417399 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/168172d2-5cc8-492f-aa26-bd2a1351cdf2-plugins-conf\") pod \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.417460 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.417500 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/168172d2-5cc8-492f-aa26-bd2a1351cdf2-config-data\") pod \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.418901 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-erlang-cookie\") pod \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.418942 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/168172d2-5cc8-492f-aa26-bd2a1351cdf2-erlang-cookie-secret\") pod \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.419227 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-plugins\") pod \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\" (UID: \"168172d2-5cc8-492f-aa26-bd2a1351cdf2\") " Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 
18:42:10.420911 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "168172d2-5cc8-492f-aa26-bd2a1351cdf2" (UID: "168172d2-5cc8-492f-aa26-bd2a1351cdf2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.425397 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "168172d2-5cc8-492f-aa26-bd2a1351cdf2" (UID: "168172d2-5cc8-492f-aa26-bd2a1351cdf2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.429731 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/168172d2-5cc8-492f-aa26-bd2a1351cdf2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "168172d2-5cc8-492f-aa26-bd2a1351cdf2" (UID: "168172d2-5cc8-492f-aa26-bd2a1351cdf2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.431335 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "168172d2-5cc8-492f-aa26-bd2a1351cdf2" (UID: "168172d2-5cc8-492f-aa26-bd2a1351cdf2"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.434706 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "168172d2-5cc8-492f-aa26-bd2a1351cdf2" (UID: "168172d2-5cc8-492f-aa26-bd2a1351cdf2"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.450577 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/168172d2-5cc8-492f-aa26-bd2a1351cdf2-kube-api-access-zb6xb" (OuterVolumeSpecName: "kube-api-access-zb6xb") pod "168172d2-5cc8-492f-aa26-bd2a1351cdf2" (UID: "168172d2-5cc8-492f-aa26-bd2a1351cdf2"). InnerVolumeSpecName "kube-api-access-zb6xb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.451126 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/168172d2-5cc8-492f-aa26-bd2a1351cdf2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "168172d2-5cc8-492f-aa26-bd2a1351cdf2" (UID: "168172d2-5cc8-492f-aa26-bd2a1351cdf2"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.460846 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/168172d2-5cc8-492f-aa26-bd2a1351cdf2-pod-info" (OuterVolumeSpecName: "pod-info") pod "168172d2-5cc8-492f-aa26-bd2a1351cdf2" (UID: "168172d2-5cc8-492f-aa26-bd2a1351cdf2"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.524006 4878 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.524053 4878 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.524063 4878 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/168172d2-5cc8-492f-aa26-bd2a1351cdf2-pod-info\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.524118 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb6xb\" (UniqueName: \"kubernetes.io/projected/168172d2-5cc8-492f-aa26-bd2a1351cdf2-kube-api-access-zb6xb\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.524130 4878 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/168172d2-5cc8-492f-aa26-bd2a1351cdf2-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.524185 4878 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.524197 4878 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.524206 
4878 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/168172d2-5cc8-492f-aa26-bd2a1351cdf2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.531692 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/168172d2-5cc8-492f-aa26-bd2a1351cdf2-server-conf" (OuterVolumeSpecName: "server-conf") pod "168172d2-5cc8-492f-aa26-bd2a1351cdf2" (UID: "168172d2-5cc8-492f-aa26-bd2a1351cdf2"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.556856 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/168172d2-5cc8-492f-aa26-bd2a1351cdf2-config-data" (OuterVolumeSpecName: "config-data") pod "168172d2-5cc8-492f-aa26-bd2a1351cdf2" (UID: "168172d2-5cc8-492f-aa26-bd2a1351cdf2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.577311 4878 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.637804 4878 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.637845 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/168172d2-5cc8-492f-aa26-bd2a1351cdf2-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.637861 4878 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/168172d2-5cc8-492f-aa26-bd2a1351cdf2-server-conf\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.646415 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "168172d2-5cc8-492f-aa26-bd2a1351cdf2" (UID: "168172d2-5cc8-492f-aa26-bd2a1351cdf2"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:42:10 crc kubenswrapper[4878]: I1202 18:42:10.741009 4878 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/168172d2-5cc8-492f-aa26-bd2a1351cdf2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.232495 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"168172d2-5cc8-492f-aa26-bd2a1351cdf2","Type":"ContainerDied","Data":"faa17f71d92af21878e9be5bce43baf7e7eba2dee4260cb098a2db0d271c5caf"} Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.232885 4878 scope.go:117] "RemoveContainer" containerID="b33c5557685167291d2cada300c01f364fcc4cc0cdd87196cad4fdf4c3468440" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.232569 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.334660 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.464207 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.507793 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 18:42:11 crc kubenswrapper[4878]: E1202 18:42:11.509981 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168172d2-5cc8-492f-aa26-bd2a1351cdf2" containerName="setup-container" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.510006 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="168172d2-5cc8-492f-aa26-bd2a1351cdf2" containerName="setup-container" Dec 02 18:42:11 crc kubenswrapper[4878]: E1202 18:42:11.510045 4878 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="168172d2-5cc8-492f-aa26-bd2a1351cdf2" containerName="rabbitmq" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.510052 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="168172d2-5cc8-492f-aa26-bd2a1351cdf2" containerName="rabbitmq" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.510925 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="168172d2-5cc8-492f-aa26-bd2a1351cdf2" containerName="rabbitmq" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.514087 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.532837 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lrgf5" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.532936 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.533362 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.533406 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.533431 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.532842 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.533807 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.557186 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.578327 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eedb789b-6bed-4a82-82c1-977a633ed304-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.578459 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eedb789b-6bed-4a82-82c1-977a633ed304-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.578594 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eedb789b-6bed-4a82-82c1-977a633ed304-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.578648 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eedb789b-6bed-4a82-82c1-977a633ed304-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.578714 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kckxs\" (UniqueName: \"kubernetes.io/projected/eedb789b-6bed-4a82-82c1-977a633ed304-kube-api-access-kckxs\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.578785 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eedb789b-6bed-4a82-82c1-977a633ed304-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.578823 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eedb789b-6bed-4a82-82c1-977a633ed304-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.578848 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.578959 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eedb789b-6bed-4a82-82c1-977a633ed304-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.579005 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eedb789b-6bed-4a82-82c1-977a633ed304-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.579027 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eedb789b-6bed-4a82-82c1-977a633ed304-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.681037 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eedb789b-6bed-4a82-82c1-977a633ed304-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.681104 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eedb789b-6bed-4a82-82c1-977a633ed304-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.681142 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kckxs\" (UniqueName: \"kubernetes.io/projected/eedb789b-6bed-4a82-82c1-977a633ed304-kube-api-access-kckxs\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.681177 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eedb789b-6bed-4a82-82c1-977a633ed304-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.681198 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eedb789b-6bed-4a82-82c1-977a633ed304-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.681220 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.681563 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eedb789b-6bed-4a82-82c1-977a633ed304-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.681593 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.681834 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eedb789b-6bed-4a82-82c1-977a633ed304-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.682346 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eedb789b-6bed-4a82-82c1-977a633ed304-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.682370 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eedb789b-6bed-4a82-82c1-977a633ed304-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.682463 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eedb789b-6bed-4a82-82c1-977a633ed304-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.682517 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eedb789b-6bed-4a82-82c1-977a633ed304-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.683070 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eedb789b-6bed-4a82-82c1-977a633ed304-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.683152 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/eedb789b-6bed-4a82-82c1-977a633ed304-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.683767 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eedb789b-6bed-4a82-82c1-977a633ed304-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.684351 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eedb789b-6bed-4a82-82c1-977a633ed304-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.697370 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eedb789b-6bed-4a82-82c1-977a633ed304-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.697941 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eedb789b-6bed-4a82-82c1-977a633ed304-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.698776 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eedb789b-6bed-4a82-82c1-977a633ed304-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.699647 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eedb789b-6bed-4a82-82c1-977a633ed304-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.707323 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kckxs\" (UniqueName: \"kubernetes.io/projected/eedb789b-6bed-4a82-82c1-977a633ed304-kube-api-access-kckxs\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.742537 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"eedb789b-6bed-4a82-82c1-977a633ed304\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:11 crc kubenswrapper[4878]: I1202 18:42:11.873301 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:12 crc kubenswrapper[4878]: I1202 18:42:12.955488 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="168172d2-5cc8-492f-aa26-bd2a1351cdf2" path="/var/lib/kubelet/pods/168172d2-5cc8-492f-aa26-bd2a1351cdf2/volumes" Dec 02 18:42:14 crc kubenswrapper[4878]: I1202 18:42:14.896396 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="168172d2-5cc8-492f-aa26-bd2a1351cdf2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.129:5671: i/o timeout" Dec 02 18:42:15 crc kubenswrapper[4878]: I1202 18:42:15.370858 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="b8ce834c-073d-4062-b3ee-488fa79aae4f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.130:5671: i/o timeout" Dec 02 18:42:18 crc kubenswrapper[4878]: I1202 18:42:18.861120 4878 scope.go:117] "RemoveContainer" containerID="f47b77d298db6e299054ba2f8bdad64cb6c8cb2bbfbcbc308213e7b1e3765941" Dec 02 18:42:19 crc kubenswrapper[4878]: E1202 18:42:19.530368 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Dec 02 18:42:19 crc kubenswrapper[4878]: E1202 18:42:19.530671 4878 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Dec 02 18:42:19 crc kubenswrapper[4878]: E1202 18:42:19.530850 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n89h54fh89h5cch59dh558h599h688h67fh87hd8hcch56ch645h699h68ch5bh59h5fdh5c4h659h9h5fdh6fh695h5f8h5d4h64dh5b5h5ch574h585q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kgvd5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(28b4921f-5e67-4490-83fc-eef206c05083): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.628601 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.804140 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-tls\") pod \"b8ce834c-073d-4062-b3ee-488fa79aae4f\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.804221 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-plugins\") pod \"b8ce834c-073d-4062-b3ee-488fa79aae4f\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.804278 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"b8ce834c-073d-4062-b3ee-488fa79aae4f\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.804427 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b47q\" (UniqueName: \"kubernetes.io/projected/b8ce834c-073d-4062-b3ee-488fa79aae4f-kube-api-access-5b47q\") pod \"b8ce834c-073d-4062-b3ee-488fa79aae4f\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.804460 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-confd\") pod \"b8ce834c-073d-4062-b3ee-488fa79aae4f\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.804519 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/b8ce834c-073d-4062-b3ee-488fa79aae4f-erlang-cookie-secret\") pod \"b8ce834c-073d-4062-b3ee-488fa79aae4f\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.804567 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8ce834c-073d-4062-b3ee-488fa79aae4f-pod-info\") pod \"b8ce834c-073d-4062-b3ee-488fa79aae4f\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.804615 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8ce834c-073d-4062-b3ee-488fa79aae4f-config-data\") pod \"b8ce834c-073d-4062-b3ee-488fa79aae4f\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.804773 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-erlang-cookie\") pod \"b8ce834c-073d-4062-b3ee-488fa79aae4f\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.804800 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8ce834c-073d-4062-b3ee-488fa79aae4f-server-conf\") pod \"b8ce834c-073d-4062-b3ee-488fa79aae4f\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.804840 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8ce834c-073d-4062-b3ee-488fa79aae4f-plugins-conf\") pod \"b8ce834c-073d-4062-b3ee-488fa79aae4f\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 
18:42:19.807525 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b8ce834c-073d-4062-b3ee-488fa79aae4f" (UID: "b8ce834c-073d-4062-b3ee-488fa79aae4f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.809306 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b8ce834c-073d-4062-b3ee-488fa79aae4f" (UID: "b8ce834c-073d-4062-b3ee-488fa79aae4f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.809450 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ce834c-073d-4062-b3ee-488fa79aae4f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b8ce834c-073d-4062-b3ee-488fa79aae4f" (UID: "b8ce834c-073d-4062-b3ee-488fa79aae4f"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.814507 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ce834c-073d-4062-b3ee-488fa79aae4f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b8ce834c-073d-4062-b3ee-488fa79aae4f" (UID: "b8ce834c-073d-4062-b3ee-488fa79aae4f"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.814665 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ce834c-073d-4062-b3ee-488fa79aae4f-kube-api-access-5b47q" (OuterVolumeSpecName: "kube-api-access-5b47q") pod "b8ce834c-073d-4062-b3ee-488fa79aae4f" (UID: "b8ce834c-073d-4062-b3ee-488fa79aae4f"). InnerVolumeSpecName "kube-api-access-5b47q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.814938 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b8ce834c-073d-4062-b3ee-488fa79aae4f" (UID: "b8ce834c-073d-4062-b3ee-488fa79aae4f"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.815899 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "b8ce834c-073d-4062-b3ee-488fa79aae4f" (UID: "b8ce834c-073d-4062-b3ee-488fa79aae4f"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.830280 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b8ce834c-073d-4062-b3ee-488fa79aae4f-pod-info" (OuterVolumeSpecName: "pod-info") pod "b8ce834c-073d-4062-b3ee-488fa79aae4f" (UID: "b8ce834c-073d-4062-b3ee-488fa79aae4f"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.880945 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ce834c-073d-4062-b3ee-488fa79aae4f-config-data" (OuterVolumeSpecName: "config-data") pod "b8ce834c-073d-4062-b3ee-488fa79aae4f" (UID: "b8ce834c-073d-4062-b3ee-488fa79aae4f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.911058 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ce834c-073d-4062-b3ee-488fa79aae4f-server-conf" (OuterVolumeSpecName: "server-conf") pod "b8ce834c-073d-4062-b3ee-488fa79aae4f" (UID: "b8ce834c-073d-4062-b3ee-488fa79aae4f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.917165 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8ce834c-073d-4062-b3ee-488fa79aae4f-server-conf\") pod \"b8ce834c-073d-4062-b3ee-488fa79aae4f\" (UID: \"b8ce834c-073d-4062-b3ee-488fa79aae4f\") " Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.918406 4878 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8ce834c-073d-4062-b3ee-488fa79aae4f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.918431 4878 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8ce834c-073d-4062-b3ee-488fa79aae4f-pod-info\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.918441 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/b8ce834c-073d-4062-b3ee-488fa79aae4f-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.918454 4878 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.918470 4878 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8ce834c-073d-4062-b3ee-488fa79aae4f-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.918481 4878 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.918490 4878 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.918513 4878 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.918522 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b47q\" (UniqueName: \"kubernetes.io/projected/b8ce834c-073d-4062-b3ee-488fa79aae4f-kube-api-access-5b47q\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:19 crc kubenswrapper[4878]: W1202 18:42:19.918683 4878 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/b8ce834c-073d-4062-b3ee-488fa79aae4f/volumes/kubernetes.io~configmap/server-conf Dec 02 18:42:19 crc 
kubenswrapper[4878]: I1202 18:42:19.918707 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ce834c-073d-4062-b3ee-488fa79aae4f-server-conf" (OuterVolumeSpecName: "server-conf") pod "b8ce834c-073d-4062-b3ee-488fa79aae4f" (UID: "b8ce834c-073d-4062-b3ee-488fa79aae4f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.971508 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b8ce834c-073d-4062-b3ee-488fa79aae4f" (UID: "b8ce834c-073d-4062-b3ee-488fa79aae4f"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:42:19 crc kubenswrapper[4878]: I1202 18:42:19.981883 4878 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.023057 4878 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8ce834c-073d-4062-b3ee-488fa79aae4f-server-conf\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.023105 4878 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.023119 4878 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8ce834c-073d-4062-b3ee-488fa79aae4f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:20 crc kubenswrapper[4878]: E1202 18:42:20.277686 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Dec 02 18:42:20 crc kubenswrapper[4878]: E1202 18:42:20.278031 4878 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Dec 02 18:42:20 crc kubenswrapper[4878]: E1202 18:42:20.278288 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x8frb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/t
ermination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-fchr4_openstack(8de0b145-008a-4a41-aa97-cb01f30d946f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 18:42:20 crc kubenswrapper[4878]: E1202 18:42:20.279866 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-fchr4" podUID="8de0b145-008a-4a41-aa97-cb01f30d946f" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.418881 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b8ce834c-073d-4062-b3ee-488fa79aae4f","Type":"ContainerDied","Data":"c2c233dc442dc246c673af6338f994d8785c44fdaa7da2cbc5b33b1babdd85d6"} Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.418945 4878 scope.go:117] "RemoveContainer" containerID="ef6402595a754f87415db1eafef9e466691a9112edf2bc427e42ae1c310ef752" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.419117 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: E1202 18:42:20.432305 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-fchr4" podUID="8de0b145-008a-4a41-aa97-cb01f30d946f" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.506929 4878 scope.go:117] "RemoveContainer" containerID="33bf4bb167f90d4156d2bdcea9dab49b9f0ec8a779f82ec97a18a80fccfefc6a" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.519398 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.534275 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.559328 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 18:42:20 crc kubenswrapper[4878]: E1202 18:42:20.560100 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ce834c-073d-4062-b3ee-488fa79aae4f" containerName="setup-container" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.560119 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ce834c-073d-4062-b3ee-488fa79aae4f" containerName="setup-container" Dec 02 18:42:20 crc kubenswrapper[4878]: E1202 18:42:20.560153 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ce834c-073d-4062-b3ee-488fa79aae4f" containerName="rabbitmq" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.560160 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ce834c-073d-4062-b3ee-488fa79aae4f" containerName="rabbitmq" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.560454 4878 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b8ce834c-073d-4062-b3ee-488fa79aae4f" containerName="rabbitmq" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.561821 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.563815 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.564155 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jwm5r" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.564318 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.564411 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.564546 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.564653 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.571714 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.590364 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.741332 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btgb5\" (UniqueName: \"kubernetes.io/projected/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-kube-api-access-btgb5\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc 
kubenswrapper[4878]: I1202 18:42:20.741429 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.741482 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-config-data\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.741514 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.741578 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.741631 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.741654 4878 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.741678 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.741743 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.742114 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.742182 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.804341 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] 
Dec 02 18:42:20 crc kubenswrapper[4878]: W1202 18:42:20.812381 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeedb789b_6bed_4a82_82c1_977a633ed304.slice/crio-cd96990a287a3f982f970bdeb33801eae120124caff8a9d3b70b08014b5b67ae WatchSource:0}: Error finding container cd96990a287a3f982f970bdeb33801eae120124caff8a9d3b70b08014b5b67ae: Status 404 returned error can't find the container with id cd96990a287a3f982f970bdeb33801eae120124caff8a9d3b70b08014b5b67ae Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.845349 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.845428 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-config-data\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.845458 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.845483 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" 
Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.845516 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.845531 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.845553 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.845602 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.845652 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.845680 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.845734 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btgb5\" (UniqueName: \"kubernetes.io/projected/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-kube-api-access-btgb5\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.846634 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.846632 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.847318 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-config-data\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.847479 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.847748 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.851016 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.852666 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.853414 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.861578 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.863021 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-btgb5\" (UniqueName: \"kubernetes.io/projected/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-kube-api-access-btgb5\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.863193 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.910129 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da\") " pod="openstack/rabbitmq-server-0" Dec 02 18:42:20 crc kubenswrapper[4878]: I1202 18:42:20.938143 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 18:42:21 crc kubenswrapper[4878]: I1202 18:42:21.032133 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ce834c-073d-4062-b3ee-488fa79aae4f" path="/var/lib/kubelet/pods/b8ce834c-073d-4062-b3ee-488fa79aae4f/volumes" Dec 02 18:42:21 crc kubenswrapper[4878]: I1202 18:42:21.039895 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78b8b6c8f9-k99xz"] Dec 02 18:42:21 crc kubenswrapper[4878]: I1202 18:42:21.447381 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"eedb789b-6bed-4a82-82c1-977a633ed304","Type":"ContainerStarted","Data":"cd96990a287a3f982f970bdeb33801eae120124caff8a9d3b70b08014b5b67ae"} Dec 02 18:42:21 crc kubenswrapper[4878]: I1202 18:42:21.449511 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28b4921f-5e67-4490-83fc-eef206c05083","Type":"ContainerStarted","Data":"c8310b41fb44e66b4ddc126f57a62b034e6c721bd77d862f9ca2068f2491e300"} Dec 02 18:42:21 crc kubenswrapper[4878]: I1202 18:42:21.451101 4878 generic.go:334] "Generic (PLEG): container finished" podID="babb8f49-4a74-45a2-8d2f-af714429151c" containerID="7a4c7f640d6f7de677097cb7c04dc8d549b52405ec420980b1049b7a639f7b9f" exitCode=0 Dec 02 18:42:21 crc kubenswrapper[4878]: I1202 18:42:21.451133 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" event={"ID":"babb8f49-4a74-45a2-8d2f-af714429151c","Type":"ContainerDied","Data":"7a4c7f640d6f7de677097cb7c04dc8d549b52405ec420980b1049b7a639f7b9f"} Dec 02 18:42:21 crc kubenswrapper[4878]: I1202 18:42:21.451202 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" event={"ID":"babb8f49-4a74-45a2-8d2f-af714429151c","Type":"ContainerStarted","Data":"7391a399c3669603bb782bcfd068223332ce982587f95ae3c7f2f3ad2452092d"} Dec 02 18:42:21 crc 
kubenswrapper[4878]: I1202 18:42:21.600632 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 02 18:42:22 crc kubenswrapper[4878]: I1202 18:42:22.468784 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28b4921f-5e67-4490-83fc-eef206c05083","Type":"ContainerStarted","Data":"a41a95eeba5eecb38fe3ad707797d33f8c90f0fa7d67f1f1c76187a9af228fd0"}
Dec 02 18:42:22 crc kubenswrapper[4878]: I1202 18:42:22.471803 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" event={"ID":"babb8f49-4a74-45a2-8d2f-af714429151c","Type":"ContainerStarted","Data":"38a97459434489e3de11ef65e61a08e7eb282c93eef1f62679f42f7971fae375"}
Dec 02 18:42:22 crc kubenswrapper[4878]: I1202 18:42:22.473498 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz"
Dec 02 18:42:22 crc kubenswrapper[4878]: I1202 18:42:22.474820 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da","Type":"ContainerStarted","Data":"a84e1f915959a01b5e761aad197d27ca5f0b889b2b14208fba1384409bbaa514"}
Dec 02 18:42:22 crc kubenswrapper[4878]: I1202 18:42:22.507175 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" podStartSLOduration=14.507150359 podStartE2EDuration="14.507150359s" podCreationTimestamp="2025-12-02 18:42:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:42:22.500897624 +0000 UTC m=+1652.190516525" watchObservedRunningTime="2025-12-02 18:42:22.507150359 +0000 UTC m=+1652.196769240"
Dec 02 18:42:23 crc kubenswrapper[4878]: I1202 18:42:23.487095 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"eedb789b-6bed-4a82-82c1-977a633ed304","Type":"ContainerStarted","Data":"d4de0d6fad654ab5f0827246c93c3efa182f052ba8d83c5e52763353bad93b4d"}
Dec 02 18:42:23 crc kubenswrapper[4878]: I1202 18:42:23.742050 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 18:42:23 crc kubenswrapper[4878]: I1202 18:42:23.742108 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 18:42:24 crc kubenswrapper[4878]: E1202 18:42:24.262381 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="28b4921f-5e67-4490-83fc-eef206c05083"
Dec 02 18:42:24 crc kubenswrapper[4878]: I1202 18:42:24.503136 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da","Type":"ContainerStarted","Data":"5816ce723c0e2ba26944dcb41e95a33e14a9b0732e101dcbf673de732b0974f1"}
Dec 02 18:42:24 crc kubenswrapper[4878]: I1202 18:42:24.505517 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28b4921f-5e67-4490-83fc-eef206c05083","Type":"ContainerStarted","Data":"112fdf3eb796700387b6381fe31675b3fc61af751edf6783037ebd69ecdcc229"}
Dec 02 18:42:24 crc kubenswrapper[4878]: E1202 18:42:24.507569 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="28b4921f-5e67-4490-83fc-eef206c05083"
Dec 02 18:42:25 crc kubenswrapper[4878]: I1202 18:42:25.519869 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 02 18:42:25 crc kubenswrapper[4878]: E1202 18:42:25.526858 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="28b4921f-5e67-4490-83fc-eef206c05083"
Dec 02 18:42:26 crc kubenswrapper[4878]: E1202 18:42:26.540393 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="28b4921f-5e67-4490-83fc-eef206c05083"
Dec 02 18:42:28 crc kubenswrapper[4878]: I1202 18:42:28.597526 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz"
Dec 02 18:42:28 crc kubenswrapper[4878]: I1202 18:42:28.746793 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8f58d9b47-c9bc9"]
Dec 02 18:42:28 crc kubenswrapper[4878]: I1202 18:42:28.757222 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" podUID="45e92901-83f8-42f0-8bfb-ff6cb1805d81" containerName="dnsmasq-dns" containerID="cri-o://35aef475224e0b69a253e10e31b046715a420fa009197bf5c3a9989e227223cb" gracePeriod=10
Dec 02 18:42:28 crc kubenswrapper[4878]: I1202 18:42:28.910920 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d594564dc-vbhmw"]
Dec 02 18:42:28 crc kubenswrapper[4878]: I1202 18:42:28.919728 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:28 crc kubenswrapper[4878]: I1202 18:42:28.977734 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d594564dc-vbhmw"]
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.039120 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66af733a-2f8b-4127-9c9c-00d137d8eb4e-ovsdbserver-nb\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.039185 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66af733a-2f8b-4127-9c9c-00d137d8eb4e-dns-svc\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.039272 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/66af733a-2f8b-4127-9c9c-00d137d8eb4e-openstack-edpm-ipam\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.039326 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqn67\" (UniqueName: \"kubernetes.io/projected/66af733a-2f8b-4127-9c9c-00d137d8eb4e-kube-api-access-pqn67\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.039409 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66af733a-2f8b-4127-9c9c-00d137d8eb4e-dns-swift-storage-0\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.039460 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66af733a-2f8b-4127-9c9c-00d137d8eb4e-config\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.039521 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66af733a-2f8b-4127-9c9c-00d137d8eb4e-ovsdbserver-sb\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.141921 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66af733a-2f8b-4127-9c9c-00d137d8eb4e-ovsdbserver-nb\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.141966 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66af733a-2f8b-4127-9c9c-00d137d8eb4e-dns-svc\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.142016 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/66af733a-2f8b-4127-9c9c-00d137d8eb4e-openstack-edpm-ipam\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.142112 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqn67\" (UniqueName: \"kubernetes.io/projected/66af733a-2f8b-4127-9c9c-00d137d8eb4e-kube-api-access-pqn67\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.142230 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66af733a-2f8b-4127-9c9c-00d137d8eb4e-dns-swift-storage-0\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.142310 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66af733a-2f8b-4127-9c9c-00d137d8eb4e-config\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.142364 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66af733a-2f8b-4127-9c9c-00d137d8eb4e-ovsdbserver-sb\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.142912 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66af733a-2f8b-4127-9c9c-00d137d8eb4e-ovsdbserver-nb\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.143659 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66af733a-2f8b-4127-9c9c-00d137d8eb4e-dns-svc\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.143685 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66af733a-2f8b-4127-9c9c-00d137d8eb4e-ovsdbserver-sb\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.143745 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66af733a-2f8b-4127-9c9c-00d137d8eb4e-dns-swift-storage-0\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.144044 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/66af733a-2f8b-4127-9c9c-00d137d8eb4e-openstack-edpm-ipam\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.145392 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66af733a-2f8b-4127-9c9c-00d137d8eb4e-config\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.169575 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqn67\" (UniqueName: \"kubernetes.io/projected/66af733a-2f8b-4127-9c9c-00d137d8eb4e-kube-api-access-pqn67\") pod \"dnsmasq-dns-6d594564dc-vbhmw\" (UID: \"66af733a-2f8b-4127-9c9c-00d137d8eb4e\") " pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.275954 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.503569 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.586916 4878 generic.go:334] "Generic (PLEG): container finished" podID="45e92901-83f8-42f0-8bfb-ff6cb1805d81" containerID="35aef475224e0b69a253e10e31b046715a420fa009197bf5c3a9989e227223cb" exitCode=0
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.586969 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" event={"ID":"45e92901-83f8-42f0-8bfb-ff6cb1805d81","Type":"ContainerDied","Data":"35aef475224e0b69a253e10e31b046715a420fa009197bf5c3a9989e227223cb"}
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.586999 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.587021 4878 scope.go:117] "RemoveContainer" containerID="35aef475224e0b69a253e10e31b046715a420fa009197bf5c3a9989e227223cb"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.587007 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8f58d9b47-c9bc9" event={"ID":"45e92901-83f8-42f0-8bfb-ff6cb1805d81","Type":"ContainerDied","Data":"185293ca4651006b6d60c10e741a8b61408f2d06769b0842dfd708d90c8f26de"}
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.647318 4878 scope.go:117] "RemoveContainer" containerID="e9e5d29b70b7d19a58bd876853389224d3d6060d20419ea7835d1863e09d658d"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.658035 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-dns-svc\") pod \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") "
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.658153 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-ovsdbserver-nb\") pod \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") "
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.658197 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-dns-swift-storage-0\") pod \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") "
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.658296 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8p4g\" (UniqueName: \"kubernetes.io/projected/45e92901-83f8-42f0-8bfb-ff6cb1805d81-kube-api-access-p8p4g\") pod \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") "
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.658390 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-ovsdbserver-sb\") pod \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") "
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.658467 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-config\") pod \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\" (UID: \"45e92901-83f8-42f0-8bfb-ff6cb1805d81\") "
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.672500 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45e92901-83f8-42f0-8bfb-ff6cb1805d81-kube-api-access-p8p4g" (OuterVolumeSpecName: "kube-api-access-p8p4g") pod "45e92901-83f8-42f0-8bfb-ff6cb1805d81" (UID: "45e92901-83f8-42f0-8bfb-ff6cb1805d81"). InnerVolumeSpecName "kube-api-access-p8p4g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.693206 4878 scope.go:117] "RemoveContainer" containerID="35aef475224e0b69a253e10e31b046715a420fa009197bf5c3a9989e227223cb"
Dec 02 18:42:29 crc kubenswrapper[4878]: E1202 18:42:29.698522 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35aef475224e0b69a253e10e31b046715a420fa009197bf5c3a9989e227223cb\": container with ID starting with 35aef475224e0b69a253e10e31b046715a420fa009197bf5c3a9989e227223cb not found: ID does not exist" containerID="35aef475224e0b69a253e10e31b046715a420fa009197bf5c3a9989e227223cb"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.698573 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35aef475224e0b69a253e10e31b046715a420fa009197bf5c3a9989e227223cb"} err="failed to get container status \"35aef475224e0b69a253e10e31b046715a420fa009197bf5c3a9989e227223cb\": rpc error: code = NotFound desc = could not find container \"35aef475224e0b69a253e10e31b046715a420fa009197bf5c3a9989e227223cb\": container with ID starting with 35aef475224e0b69a253e10e31b046715a420fa009197bf5c3a9989e227223cb not found: ID does not exist"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.698606 4878 scope.go:117] "RemoveContainer" containerID="e9e5d29b70b7d19a58bd876853389224d3d6060d20419ea7835d1863e09d658d"
Dec 02 18:42:29 crc kubenswrapper[4878]: E1202 18:42:29.699152 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9e5d29b70b7d19a58bd876853389224d3d6060d20419ea7835d1863e09d658d\": container with ID starting with e9e5d29b70b7d19a58bd876853389224d3d6060d20419ea7835d1863e09d658d not found: ID does not exist" containerID="e9e5d29b70b7d19a58bd876853389224d3d6060d20419ea7835d1863e09d658d"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.699196 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9e5d29b70b7d19a58bd876853389224d3d6060d20419ea7835d1863e09d658d"} err="failed to get container status \"e9e5d29b70b7d19a58bd876853389224d3d6060d20419ea7835d1863e09d658d\": rpc error: code = NotFound desc = could not find container \"e9e5d29b70b7d19a58bd876853389224d3d6060d20419ea7835d1863e09d658d\": container with ID starting with e9e5d29b70b7d19a58bd876853389224d3d6060d20419ea7835d1863e09d658d not found: ID does not exist"
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.740141 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-config" (OuterVolumeSpecName: "config") pod "45e92901-83f8-42f0-8bfb-ff6cb1805d81" (UID: "45e92901-83f8-42f0-8bfb-ff6cb1805d81"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.761576 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "45e92901-83f8-42f0-8bfb-ff6cb1805d81" (UID: "45e92901-83f8-42f0-8bfb-ff6cb1805d81"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.762410 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8p4g\" (UniqueName: \"kubernetes.io/projected/45e92901-83f8-42f0-8bfb-ff6cb1805d81-kube-api-access-p8p4g\") on node \"crc\" DevicePath \"\""
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.762495 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.762550 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-config\") on node \"crc\" DevicePath \"\""
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.764276 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "45e92901-83f8-42f0-8bfb-ff6cb1805d81" (UID: "45e92901-83f8-42f0-8bfb-ff6cb1805d81"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.775944 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "45e92901-83f8-42f0-8bfb-ff6cb1805d81" (UID: "45e92901-83f8-42f0-8bfb-ff6cb1805d81"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.788031 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "45e92901-83f8-42f0-8bfb-ff6cb1805d81" (UID: "45e92901-83f8-42f0-8bfb-ff6cb1805d81"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.855821 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d594564dc-vbhmw"]
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.865083 4878 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.865115 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 02 18:42:29 crc kubenswrapper[4878]: I1202 18:42:29.865127 4878 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45e92901-83f8-42f0-8bfb-ff6cb1805d81-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 02 18:42:30 crc kubenswrapper[4878]: I1202 18:42:30.100009 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8f58d9b47-c9bc9"]
Dec 02 18:42:30 crc kubenswrapper[4878]: I1202 18:42:30.113864 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8f58d9b47-c9bc9"]
Dec 02 18:42:30 crc kubenswrapper[4878]: E1202 18:42:30.257007 4878 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45e92901_83f8_42f0_8bfb_ff6cb1805d81.slice\": RecentStats: unable to find data in memory cache]"
Dec 02 18:42:30 crc kubenswrapper[4878]: I1202 18:42:30.602717 4878 generic.go:334] "Generic (PLEG): container finished" podID="66af733a-2f8b-4127-9c9c-00d137d8eb4e" containerID="7ff47f3f67ea202b9112b71752e01f34c28599eeddce204227a449a3a9d60a0b" exitCode=0
Dec 02 18:42:30 crc kubenswrapper[4878]: I1202 18:42:30.602785 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d594564dc-vbhmw" event={"ID":"66af733a-2f8b-4127-9c9c-00d137d8eb4e","Type":"ContainerDied","Data":"7ff47f3f67ea202b9112b71752e01f34c28599eeddce204227a449a3a9d60a0b"}
Dec 02 18:42:30 crc kubenswrapper[4878]: I1202 18:42:30.602858 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d594564dc-vbhmw" event={"ID":"66af733a-2f8b-4127-9c9c-00d137d8eb4e","Type":"ContainerStarted","Data":"7f6cf91e24d01e5234bf514b9141b533677033062c46374f5fa91eb412932ae9"}
Dec 02 18:42:30 crc kubenswrapper[4878]: I1202 18:42:30.956522 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45e92901-83f8-42f0-8bfb-ff6cb1805d81" path="/var/lib/kubelet/pods/45e92901-83f8-42f0-8bfb-ff6cb1805d81/volumes"
Dec 02 18:42:31 crc kubenswrapper[4878]: I1202 18:42:31.629676 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d594564dc-vbhmw" event={"ID":"66af733a-2f8b-4127-9c9c-00d137d8eb4e","Type":"ContainerStarted","Data":"140965bbbc76a8ecefdc778ea7a2efd5263ab5eb881bf94e1a6be1287941f4f8"}
Dec 02 18:42:31 crc kubenswrapper[4878]: I1202 18:42:31.630498 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:31 crc kubenswrapper[4878]: I1202 18:42:31.657961 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d594564dc-vbhmw" podStartSLOduration=3.6579403790000002 podStartE2EDuration="3.657940379s" podCreationTimestamp="2025-12-02 18:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:42:31.648903637 +0000 UTC m=+1661.338522548" watchObservedRunningTime="2025-12-02 18:42:31.657940379 +0000 UTC m=+1661.347559260"
Dec 02 18:42:34 crc kubenswrapper[4878]: I1202 18:42:34.685580 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fchr4" event={"ID":"8de0b145-008a-4a41-aa97-cb01f30d946f","Type":"ContainerStarted","Data":"1cdca8f6f5b8e9f527c3630f2eac732cef57ddfc8d0d2a9d3b0f7d2d9cdfc72d"}
Dec 02 18:42:34 crc kubenswrapper[4878]: I1202 18:42:34.714758 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-fchr4" podStartSLOduration=2.314063896 podStartE2EDuration="44.714728222s" podCreationTimestamp="2025-12-02 18:41:50 +0000 UTC" firstStartedPulling="2025-12-02 18:41:51.773789635 +0000 UTC m=+1621.463408526" lastFinishedPulling="2025-12-02 18:42:34.174453951 +0000 UTC m=+1663.864072852" observedRunningTime="2025-12-02 18:42:34.708002273 +0000 UTC m=+1664.397621164" watchObservedRunningTime="2025-12-02 18:42:34.714728222 +0000 UTC m=+1664.404347103"
Dec 02 18:42:37 crc kubenswrapper[4878]: I1202 18:42:37.728782 4878 generic.go:334] "Generic (PLEG): container finished" podID="8de0b145-008a-4a41-aa97-cb01f30d946f" containerID="1cdca8f6f5b8e9f527c3630f2eac732cef57ddfc8d0d2a9d3b0f7d2d9cdfc72d" exitCode=0
Dec 02 18:42:37 crc kubenswrapper[4878]: I1202 18:42:37.728933 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fchr4" event={"ID":"8de0b145-008a-4a41-aa97-cb01f30d946f","Type":"ContainerDied","Data":"1cdca8f6f5b8e9f527c3630f2eac732cef57ddfc8d0d2a9d3b0f7d2d9cdfc72d"}
Dec 02 18:42:38 crc kubenswrapper[4878]: I1202 18:42:38.954587 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Dec 02 18:42:39 crc kubenswrapper[4878]: I1202 18:42:39.279446 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d594564dc-vbhmw"
Dec 02 18:42:39 crc kubenswrapper[4878]: I1202 18:42:39.378613 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-fchr4"
Dec 02 18:42:39 crc kubenswrapper[4878]: I1202 18:42:39.388899 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78b8b6c8f9-k99xz"]
Dec 02 18:42:39 crc kubenswrapper[4878]: I1202 18:42:39.389209 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" podUID="babb8f49-4a74-45a2-8d2f-af714429151c" containerName="dnsmasq-dns" containerID="cri-o://38a97459434489e3de11ef65e61a08e7eb282c93eef1f62679f42f7971fae375" gracePeriod=10
Dec 02 18:42:39 crc kubenswrapper[4878]: I1202 18:42:39.472629 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de0b145-008a-4a41-aa97-cb01f30d946f-combined-ca-bundle\") pod \"8de0b145-008a-4a41-aa97-cb01f30d946f\" (UID: \"8de0b145-008a-4a41-aa97-cb01f30d946f\") "
Dec 02 18:42:39 crc kubenswrapper[4878]: I1202 18:42:39.472914 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de0b145-008a-4a41-aa97-cb01f30d946f-config-data\") pod \"8de0b145-008a-4a41-aa97-cb01f30d946f\" (UID: \"8de0b145-008a-4a41-aa97-cb01f30d946f\") "
Dec 02 18:42:39 crc kubenswrapper[4878]: I1202 18:42:39.473122 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8frb\" (UniqueName: \"kubernetes.io/projected/8de0b145-008a-4a41-aa97-cb01f30d946f-kube-api-access-x8frb\") pod \"8de0b145-008a-4a41-aa97-cb01f30d946f\" (UID: \"8de0b145-008a-4a41-aa97-cb01f30d946f\") "
Dec 02 18:42:39 crc kubenswrapper[4878]: I1202 18:42:39.484823 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de0b145-008a-4a41-aa97-cb01f30d946f-kube-api-access-x8frb" (OuterVolumeSpecName: "kube-api-access-x8frb") pod "8de0b145-008a-4a41-aa97-cb01f30d946f" (UID: "8de0b145-008a-4a41-aa97-cb01f30d946f"). InnerVolumeSpecName "kube-api-access-x8frb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:42:39 crc kubenswrapper[4878]: I1202 18:42:39.564801 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de0b145-008a-4a41-aa97-cb01f30d946f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8de0b145-008a-4a41-aa97-cb01f30d946f" (UID: "8de0b145-008a-4a41-aa97-cb01f30d946f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:42:39 crc kubenswrapper[4878]: I1202 18:42:39.576032 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de0b145-008a-4a41-aa97-cb01f30d946f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 18:42:39 crc kubenswrapper[4878]: I1202 18:42:39.576060 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8frb\" (UniqueName: \"kubernetes.io/projected/8de0b145-008a-4a41-aa97-cb01f30d946f-kube-api-access-x8frb\") on node \"crc\" DevicePath \"\""
Dec 02 18:42:39 crc kubenswrapper[4878]: I1202 18:42:39.693750 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de0b145-008a-4a41-aa97-cb01f30d946f-config-data" (OuterVolumeSpecName: "config-data") pod "8de0b145-008a-4a41-aa97-cb01f30d946f" (UID: "8de0b145-008a-4a41-aa97-cb01f30d946f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 18:42:39 crc kubenswrapper[4878]: I1202 18:42:39.759426 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fchr4" event={"ID":"8de0b145-008a-4a41-aa97-cb01f30d946f","Type":"ContainerDied","Data":"0b27fa049a1decf1196bd46034ee5e69989e5a96799d9b4039ccd28687807a38"}
Dec 02 18:42:39 crc kubenswrapper[4878]: I1202 18:42:39.759479 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b27fa049a1decf1196bd46034ee5e69989e5a96799d9b4039ccd28687807a38"
Dec 02 18:42:39 crc kubenswrapper[4878]: I1202 18:42:39.759557 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-fchr4"
Dec 02 18:42:39 crc kubenswrapper[4878]: I1202 18:42:39.784014 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de0b145-008a-4a41-aa97-cb01f30d946f-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 18:42:39 crc kubenswrapper[4878]: I1202 18:42:39.791371 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28b4921f-5e67-4490-83fc-eef206c05083","Type":"ContainerStarted","Data":"912c76b35c58fa39a1dc4f3b7cfb954e250282264ca81d6e0eb97c46d89bd50a"}
Dec 02 18:42:39 crc kubenswrapper[4878]: I1202 18:42:39.801729 4878 generic.go:334] "Generic (PLEG): container finished" podID="babb8f49-4a74-45a2-8d2f-af714429151c" containerID="38a97459434489e3de11ef65e61a08e7eb282c93eef1f62679f42f7971fae375" exitCode=0
Dec 02 18:42:39 crc kubenswrapper[4878]: I1202 18:42:39.801782 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" event={"ID":"babb8f49-4a74-45a2-8d2f-af714429151c","Type":"ContainerDied","Data":"38a97459434489e3de11ef65e61a08e7eb282c93eef1f62679f42f7971fae375"}
Dec 02 18:42:39 crc kubenswrapper[4878]: I1202 18:42:39.860818 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.538727192 podStartE2EDuration="43.860757806s" podCreationTimestamp="2025-12-02 18:41:56 +0000 UTC" firstStartedPulling="2025-12-02 18:41:57.806322162 +0000 UTC m=+1627.495941043" lastFinishedPulling="2025-12-02 18:42:39.128352776 +0000 UTC m=+1668.817971657" observedRunningTime="2025-12-02 18:42:39.833376052 +0000 UTC m=+1669.522994933" watchObservedRunningTime="2025-12-02 18:42:39.860757806 +0000 UTC m=+1669.550376707"
Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.026654 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz"
Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.197517 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-dns-svc\") pod \"babb8f49-4a74-45a2-8d2f-af714429151c\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") "
Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.197713 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p49nx\" (UniqueName: \"kubernetes.io/projected/babb8f49-4a74-45a2-8d2f-af714429151c-kube-api-access-p49nx\") pod \"babb8f49-4a74-45a2-8d2f-af714429151c\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") "
Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.198201 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-ovsdbserver-sb\") pod \"babb8f49-4a74-45a2-8d2f-af714429151c\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") "
Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.198228 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-ovsdbserver-nb\") pod \"babb8f49-4a74-45a2-8d2f-af714429151c\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") "
Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.198315 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-dns-swift-storage-0\") pod \"babb8f49-4a74-45a2-8d2f-af714429151c\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") "
Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.198527 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-config\") pod \"babb8f49-4a74-45a2-8d2f-af714429151c\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") "
Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.198550 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-openstack-edpm-ipam\") pod \"babb8f49-4a74-45a2-8d2f-af714429151c\" (UID: \"babb8f49-4a74-45a2-8d2f-af714429151c\") "
Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.207344 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/babb8f49-4a74-45a2-8d2f-af714429151c-kube-api-access-p49nx" (OuterVolumeSpecName: "kube-api-access-p49nx") pod "babb8f49-4a74-45a2-8d2f-af714429151c" (UID: "babb8f49-4a74-45a2-8d2f-af714429151c"). InnerVolumeSpecName "kube-api-access-p49nx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.303299 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p49nx\" (UniqueName: \"kubernetes.io/projected/babb8f49-4a74-45a2-8d2f-af714429151c-kube-api-access-p49nx\") on node \"crc\" DevicePath \"\""
Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.329221 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "babb8f49-4a74-45a2-8d2f-af714429151c" (UID: "babb8f49-4a74-45a2-8d2f-af714429151c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.335874 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "babb8f49-4a74-45a2-8d2f-af714429151c" (UID: "babb8f49-4a74-45a2-8d2f-af714429151c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.338065 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "babb8f49-4a74-45a2-8d2f-af714429151c" (UID: "babb8f49-4a74-45a2-8d2f-af714429151c"). InnerVolumeSpecName "ovsdbserver-nb".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.341066 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-config" (OuterVolumeSpecName: "config") pod "babb8f49-4a74-45a2-8d2f-af714429151c" (UID: "babb8f49-4a74-45a2-8d2f-af714429151c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.360988 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "babb8f49-4a74-45a2-8d2f-af714429151c" (UID: "babb8f49-4a74-45a2-8d2f-af714429151c"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.373579 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "babb8f49-4a74-45a2-8d2f-af714429151c" (UID: "babb8f49-4a74-45a2-8d2f-af714429151c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.408530 4878 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.408831 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.408892 4878 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.408959 4878 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.409012 4878 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-config\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.409064 4878 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/babb8f49-4a74-45a2-8d2f-af714429151c-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.818689 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.818690 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b8b6c8f9-k99xz" event={"ID":"babb8f49-4a74-45a2-8d2f-af714429151c","Type":"ContainerDied","Data":"7391a399c3669603bb782bcfd068223332ce982587f95ae3c7f2f3ad2452092d"} Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.818761 4878 scope.go:117] "RemoveContainer" containerID="38a97459434489e3de11ef65e61a08e7eb282c93eef1f62679f42f7971fae375" Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.860658 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78b8b6c8f9-k99xz"] Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.860699 4878 scope.go:117] "RemoveContainer" containerID="7a4c7f640d6f7de677097cb7c04dc8d549b52405ec420980b1049b7a639f7b9f" Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.903764 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78b8b6c8f9-k99xz"] Dec 02 18:42:40 crc kubenswrapper[4878]: I1202 18:42:40.969570 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="babb8f49-4a74-45a2-8d2f-af714429151c" path="/var/lib/kubelet/pods/babb8f49-4a74-45a2-8d2f-af714429151c/volumes" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.455602 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-757798f686-n6x4p"] Dec 02 18:42:41 crc kubenswrapper[4878]: E1202 18:42:41.456193 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de0b145-008a-4a41-aa97-cb01f30d946f" containerName="heat-db-sync" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.456210 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de0b145-008a-4a41-aa97-cb01f30d946f" containerName="heat-db-sync" Dec 02 18:42:41 crc kubenswrapper[4878]: E1202 18:42:41.456229 4878 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="45e92901-83f8-42f0-8bfb-ff6cb1805d81" containerName="dnsmasq-dns" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.456249 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e92901-83f8-42f0-8bfb-ff6cb1805d81" containerName="dnsmasq-dns" Dec 02 18:42:41 crc kubenswrapper[4878]: E1202 18:42:41.456272 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="babb8f49-4a74-45a2-8d2f-af714429151c" containerName="init" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.456279 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="babb8f49-4a74-45a2-8d2f-af714429151c" containerName="init" Dec 02 18:42:41 crc kubenswrapper[4878]: E1202 18:42:41.456294 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="babb8f49-4a74-45a2-8d2f-af714429151c" containerName="dnsmasq-dns" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.456301 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="babb8f49-4a74-45a2-8d2f-af714429151c" containerName="dnsmasq-dns" Dec 02 18:42:41 crc kubenswrapper[4878]: E1202 18:42:41.456308 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e92901-83f8-42f0-8bfb-ff6cb1805d81" containerName="init" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.456313 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e92901-83f8-42f0-8bfb-ff6cb1805d81" containerName="init" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.456539 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="babb8f49-4a74-45a2-8d2f-af714429151c" containerName="dnsmasq-dns" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.456558 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="8de0b145-008a-4a41-aa97-cb01f30d946f" containerName="heat-db-sync" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.456572 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e92901-83f8-42f0-8bfb-ff6cb1805d81" containerName="dnsmasq-dns" 
Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.463905 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-757798f686-n6x4p" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.497312 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-757798f686-n6x4p"] Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.543845 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-f8bcd7564-vh2kf"] Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.546500 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.549999 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a845b7d1-1f82-4048-8bc9-56611020bcec-combined-ca-bundle\") pod \"heat-engine-757798f686-n6x4p\" (UID: \"a845b7d1-1f82-4048-8bc9-56611020bcec\") " pod="openstack/heat-engine-757798f686-n6x4p" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.550056 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a845b7d1-1f82-4048-8bc9-56611020bcec-config-data-custom\") pod \"heat-engine-757798f686-n6x4p\" (UID: \"a845b7d1-1f82-4048-8bc9-56611020bcec\") " pod="openstack/heat-engine-757798f686-n6x4p" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.550111 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmkzn\" (UniqueName: \"kubernetes.io/projected/a845b7d1-1f82-4048-8bc9-56611020bcec-kube-api-access-nmkzn\") pod \"heat-engine-757798f686-n6x4p\" (UID: \"a845b7d1-1f82-4048-8bc9-56611020bcec\") " pod="openstack/heat-engine-757798f686-n6x4p" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.550167 
4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a845b7d1-1f82-4048-8bc9-56611020bcec-config-data\") pod \"heat-engine-757798f686-n6x4p\" (UID: \"a845b7d1-1f82-4048-8bc9-56611020bcec\") " pod="openstack/heat-engine-757798f686-n6x4p" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.576464 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-f8bcd7564-vh2kf"] Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.656771 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88be3860-e9da-4f2b-baff-142f994127c4-internal-tls-certs\") pod \"heat-api-f8bcd7564-vh2kf\" (UID: \"88be3860-e9da-4f2b-baff-142f994127c4\") " pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.656840 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmkzn\" (UniqueName: \"kubernetes.io/projected/a845b7d1-1f82-4048-8bc9-56611020bcec-kube-api-access-nmkzn\") pod \"heat-engine-757798f686-n6x4p\" (UID: \"a845b7d1-1f82-4048-8bc9-56611020bcec\") " pod="openstack/heat-engine-757798f686-n6x4p" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.656866 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7c7z\" (UniqueName: \"kubernetes.io/projected/88be3860-e9da-4f2b-baff-142f994127c4-kube-api-access-s7c7z\") pod \"heat-api-f8bcd7564-vh2kf\" (UID: \"88be3860-e9da-4f2b-baff-142f994127c4\") " pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.656934 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88be3860-e9da-4f2b-baff-142f994127c4-config-data-custom\") 
pod \"heat-api-f8bcd7564-vh2kf\" (UID: \"88be3860-e9da-4f2b-baff-142f994127c4\") " pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.656993 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88be3860-e9da-4f2b-baff-142f994127c4-public-tls-certs\") pod \"heat-api-f8bcd7564-vh2kf\" (UID: \"88be3860-e9da-4f2b-baff-142f994127c4\") " pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.657021 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a845b7d1-1f82-4048-8bc9-56611020bcec-config-data\") pod \"heat-engine-757798f686-n6x4p\" (UID: \"a845b7d1-1f82-4048-8bc9-56611020bcec\") " pod="openstack/heat-engine-757798f686-n6x4p" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.657275 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88be3860-e9da-4f2b-baff-142f994127c4-config-data\") pod \"heat-api-f8bcd7564-vh2kf\" (UID: \"88be3860-e9da-4f2b-baff-142f994127c4\") " pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.657439 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a845b7d1-1f82-4048-8bc9-56611020bcec-combined-ca-bundle\") pod \"heat-engine-757798f686-n6x4p\" (UID: \"a845b7d1-1f82-4048-8bc9-56611020bcec\") " pod="openstack/heat-engine-757798f686-n6x4p" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.657506 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a845b7d1-1f82-4048-8bc9-56611020bcec-config-data-custom\") pod \"heat-engine-757798f686-n6x4p\" (UID: 
\"a845b7d1-1f82-4048-8bc9-56611020bcec\") " pod="openstack/heat-engine-757798f686-n6x4p" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.657540 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88be3860-e9da-4f2b-baff-142f994127c4-combined-ca-bundle\") pod \"heat-api-f8bcd7564-vh2kf\" (UID: \"88be3860-e9da-4f2b-baff-142f994127c4\") " pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.699873 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a845b7d1-1f82-4048-8bc9-56611020bcec-config-data\") pod \"heat-engine-757798f686-n6x4p\" (UID: \"a845b7d1-1f82-4048-8bc9-56611020bcec\") " pod="openstack/heat-engine-757798f686-n6x4p" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.706296 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmkzn\" (UniqueName: \"kubernetes.io/projected/a845b7d1-1f82-4048-8bc9-56611020bcec-kube-api-access-nmkzn\") pod \"heat-engine-757798f686-n6x4p\" (UID: \"a845b7d1-1f82-4048-8bc9-56611020bcec\") " pod="openstack/heat-engine-757798f686-n6x4p" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.711829 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7546bcc475-2zz2q"] Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.714793 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a845b7d1-1f82-4048-8bc9-56611020bcec-config-data-custom\") pod \"heat-engine-757798f686-n6x4p\" (UID: \"a845b7d1-1f82-4048-8bc9-56611020bcec\") " pod="openstack/heat-engine-757798f686-n6x4p" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.725712 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.751112 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a845b7d1-1f82-4048-8bc9-56611020bcec-combined-ca-bundle\") pod \"heat-engine-757798f686-n6x4p\" (UID: \"a845b7d1-1f82-4048-8bc9-56611020bcec\") " pod="openstack/heat-engine-757798f686-n6x4p" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.772764 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88be3860-e9da-4f2b-baff-142f994127c4-combined-ca-bundle\") pod \"heat-api-f8bcd7564-vh2kf\" (UID: \"88be3860-e9da-4f2b-baff-142f994127c4\") " pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.772973 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88be3860-e9da-4f2b-baff-142f994127c4-internal-tls-certs\") pod \"heat-api-f8bcd7564-vh2kf\" (UID: \"88be3860-e9da-4f2b-baff-142f994127c4\") " pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.773026 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7c7z\" (UniqueName: \"kubernetes.io/projected/88be3860-e9da-4f2b-baff-142f994127c4-kube-api-access-s7c7z\") pod \"heat-api-f8bcd7564-vh2kf\" (UID: \"88be3860-e9da-4f2b-baff-142f994127c4\") " pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.773093 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88be3860-e9da-4f2b-baff-142f994127c4-config-data-custom\") pod \"heat-api-f8bcd7564-vh2kf\" (UID: \"88be3860-e9da-4f2b-baff-142f994127c4\") " 
pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.773168 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88be3860-e9da-4f2b-baff-142f994127c4-public-tls-certs\") pod \"heat-api-f8bcd7564-vh2kf\" (UID: \"88be3860-e9da-4f2b-baff-142f994127c4\") " pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.780104 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88be3860-e9da-4f2b-baff-142f994127c4-internal-tls-certs\") pod \"heat-api-f8bcd7564-vh2kf\" (UID: \"88be3860-e9da-4f2b-baff-142f994127c4\") " pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.785973 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7546bcc475-2zz2q"] Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.800481 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88be3860-e9da-4f2b-baff-142f994127c4-config-data\") pod \"heat-api-f8bcd7564-vh2kf\" (UID: \"88be3860-e9da-4f2b-baff-142f994127c4\") " pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.800585 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7c7z\" (UniqueName: \"kubernetes.io/projected/88be3860-e9da-4f2b-baff-142f994127c4-kube-api-access-s7c7z\") pod \"heat-api-f8bcd7564-vh2kf\" (UID: \"88be3860-e9da-4f2b-baff-142f994127c4\") " pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.822373 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88be3860-e9da-4f2b-baff-142f994127c4-config-data\") pod \"heat-api-f8bcd7564-vh2kf\" 
(UID: \"88be3860-e9da-4f2b-baff-142f994127c4\") " pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.822950 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88be3860-e9da-4f2b-baff-142f994127c4-config-data-custom\") pod \"heat-api-f8bcd7564-vh2kf\" (UID: \"88be3860-e9da-4f2b-baff-142f994127c4\") " pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.823536 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88be3860-e9da-4f2b-baff-142f994127c4-combined-ca-bundle\") pod \"heat-api-f8bcd7564-vh2kf\" (UID: \"88be3860-e9da-4f2b-baff-142f994127c4\") " pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.857091 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88be3860-e9da-4f2b-baff-142f994127c4-public-tls-certs\") pod \"heat-api-f8bcd7564-vh2kf\" (UID: \"88be3860-e9da-4f2b-baff-142f994127c4\") " pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.872529 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-757798f686-n6x4p" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.909972 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drlzn\" (UniqueName: \"kubernetes.io/projected/a37933a9-8ddf-406e-9f40-b79fba21d5b5-kube-api-access-drlzn\") pod \"heat-cfnapi-7546bcc475-2zz2q\" (UID: \"a37933a9-8ddf-406e-9f40-b79fba21d5b5\") " pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.910289 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a37933a9-8ddf-406e-9f40-b79fba21d5b5-config-data\") pod \"heat-cfnapi-7546bcc475-2zz2q\" (UID: \"a37933a9-8ddf-406e-9f40-b79fba21d5b5\") " pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.910393 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a37933a9-8ddf-406e-9f40-b79fba21d5b5-combined-ca-bundle\") pod \"heat-cfnapi-7546bcc475-2zz2q\" (UID: \"a37933a9-8ddf-406e-9f40-b79fba21d5b5\") " pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.910436 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a37933a9-8ddf-406e-9f40-b79fba21d5b5-public-tls-certs\") pod \"heat-cfnapi-7546bcc475-2zz2q\" (UID: \"a37933a9-8ddf-406e-9f40-b79fba21d5b5\") " pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.910484 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a37933a9-8ddf-406e-9f40-b79fba21d5b5-config-data-custom\") pod 
\"heat-cfnapi-7546bcc475-2zz2q\" (UID: \"a37933a9-8ddf-406e-9f40-b79fba21d5b5\") " pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.910500 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a37933a9-8ddf-406e-9f40-b79fba21d5b5-internal-tls-certs\") pod \"heat-cfnapi-7546bcc475-2zz2q\" (UID: \"a37933a9-8ddf-406e-9f40-b79fba21d5b5\") " pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:41 crc kubenswrapper[4878]: I1202 18:42:41.925175 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.015593 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a37933a9-8ddf-406e-9f40-b79fba21d5b5-combined-ca-bundle\") pod \"heat-cfnapi-7546bcc475-2zz2q\" (UID: \"a37933a9-8ddf-406e-9f40-b79fba21d5b5\") " pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.015685 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a37933a9-8ddf-406e-9f40-b79fba21d5b5-public-tls-certs\") pod \"heat-cfnapi-7546bcc475-2zz2q\" (UID: \"a37933a9-8ddf-406e-9f40-b79fba21d5b5\") " pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.015804 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a37933a9-8ddf-406e-9f40-b79fba21d5b5-config-data-custom\") pod \"heat-cfnapi-7546bcc475-2zz2q\" (UID: \"a37933a9-8ddf-406e-9f40-b79fba21d5b5\") " pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.015823 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a37933a9-8ddf-406e-9f40-b79fba21d5b5-internal-tls-certs\") pod \"heat-cfnapi-7546bcc475-2zz2q\" (UID: \"a37933a9-8ddf-406e-9f40-b79fba21d5b5\") " pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.016057 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drlzn\" (UniqueName: \"kubernetes.io/projected/a37933a9-8ddf-406e-9f40-b79fba21d5b5-kube-api-access-drlzn\") pod \"heat-cfnapi-7546bcc475-2zz2q\" (UID: \"a37933a9-8ddf-406e-9f40-b79fba21d5b5\") " pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.016137 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a37933a9-8ddf-406e-9f40-b79fba21d5b5-config-data\") pod \"heat-cfnapi-7546bcc475-2zz2q\" (UID: \"a37933a9-8ddf-406e-9f40-b79fba21d5b5\") " pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.023753 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a37933a9-8ddf-406e-9f40-b79fba21d5b5-combined-ca-bundle\") pod \"heat-cfnapi-7546bcc475-2zz2q\" (UID: \"a37933a9-8ddf-406e-9f40-b79fba21d5b5\") " pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.024718 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a37933a9-8ddf-406e-9f40-b79fba21d5b5-config-data-custom\") pod \"heat-cfnapi-7546bcc475-2zz2q\" (UID: \"a37933a9-8ddf-406e-9f40-b79fba21d5b5\") " pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.026691 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/a37933a9-8ddf-406e-9f40-b79fba21d5b5-config-data\") pod \"heat-cfnapi-7546bcc475-2zz2q\" (UID: \"a37933a9-8ddf-406e-9f40-b79fba21d5b5\") " pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.028215 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a37933a9-8ddf-406e-9f40-b79fba21d5b5-public-tls-certs\") pod \"heat-cfnapi-7546bcc475-2zz2q\" (UID: \"a37933a9-8ddf-406e-9f40-b79fba21d5b5\") " pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.037310 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a37933a9-8ddf-406e-9f40-b79fba21d5b5-internal-tls-certs\") pod \"heat-cfnapi-7546bcc475-2zz2q\" (UID: \"a37933a9-8ddf-406e-9f40-b79fba21d5b5\") " pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.057444 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drlzn\" (UniqueName: \"kubernetes.io/projected/a37933a9-8ddf-406e-9f40-b79fba21d5b5-kube-api-access-drlzn\") pod \"heat-cfnapi-7546bcc475-2zz2q\" (UID: \"a37933a9-8ddf-406e-9f40-b79fba21d5b5\") " pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.118385 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.427675 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-757798f686-n6x4p"] Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.566475 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-f8bcd7564-vh2kf"] Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.911080 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-757798f686-n6x4p" event={"ID":"a845b7d1-1f82-4048-8bc9-56611020bcec","Type":"ContainerStarted","Data":"af8688a11e346abc371b862e46a95d3cf57c5c841f2bb912881845116017457e"} Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.911443 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-757798f686-n6x4p" event={"ID":"a845b7d1-1f82-4048-8bc9-56611020bcec","Type":"ContainerStarted","Data":"492fbeeea4e260ed35872be8582e788c45f443c4766649e034b3224315df036c"} Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.913334 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-757798f686-n6x4p" Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.916549 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-f8bcd7564-vh2kf" event={"ID":"88be3860-e9da-4f2b-baff-142f994127c4","Type":"ContainerStarted","Data":"10eba7107ce7e0a3c1906df44fb9bab205a049f118dcb93e6499e67574ff29c6"} Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.952417 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-757798f686-n6x4p" podStartSLOduration=1.952390328 podStartE2EDuration="1.952390328s" podCreationTimestamp="2025-12-02 18:42:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:42:42.946072381 +0000 UTC m=+1672.635691262" 
watchObservedRunningTime="2025-12-02 18:42:42.952390328 +0000 UTC m=+1672.642009209" Dec 02 18:42:42 crc kubenswrapper[4878]: W1202 18:42:42.973506 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda37933a9_8ddf_406e_9f40_b79fba21d5b5.slice/crio-55446bde98c618a9bc36f74c8a819120d9a423c1ecaa509762b6713da383449a WatchSource:0}: Error finding container 55446bde98c618a9bc36f74c8a819120d9a423c1ecaa509762b6713da383449a: Status 404 returned error can't find the container with id 55446bde98c618a9bc36f74c8a819120d9a423c1ecaa509762b6713da383449a Dec 02 18:42:42 crc kubenswrapper[4878]: I1202 18:42:42.981178 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7546bcc475-2zz2q"] Dec 02 18:42:43 crc kubenswrapper[4878]: I1202 18:42:43.621503 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8"] Dec 02 18:42:43 crc kubenswrapper[4878]: I1202 18:42:43.623622 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" Dec 02 18:42:43 crc kubenswrapper[4878]: I1202 18:42:43.626132 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szvc8" Dec 02 18:42:43 crc kubenswrapper[4878]: I1202 18:42:43.626324 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 18:42:43 crc kubenswrapper[4878]: I1202 18:42:43.635212 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 18:42:43 crc kubenswrapper[4878]: I1202 18:42:43.635395 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 18:42:43 crc kubenswrapper[4878]: I1202 18:42:43.672723 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8"] Dec 02 18:42:43 crc kubenswrapper[4878]: I1202 18:42:43.727916 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8097a01b-4fab-4bac-839d-a1f937120beb-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8\" (UID: \"8097a01b-4fab-4bac-839d-a1f937120beb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" Dec 02 18:42:43 crc kubenswrapper[4878]: I1202 18:42:43.727985 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxqw4\" (UniqueName: \"kubernetes.io/projected/8097a01b-4fab-4bac-839d-a1f937120beb-kube-api-access-wxqw4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8\" (UID: \"8097a01b-4fab-4bac-839d-a1f937120beb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" Dec 02 18:42:43 crc kubenswrapper[4878]: I1202 18:42:43.728055 4878 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8097a01b-4fab-4bac-839d-a1f937120beb-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8\" (UID: \"8097a01b-4fab-4bac-839d-a1f937120beb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" Dec 02 18:42:43 crc kubenswrapper[4878]: I1202 18:42:43.728148 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8097a01b-4fab-4bac-839d-a1f937120beb-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8\" (UID: \"8097a01b-4fab-4bac-839d-a1f937120beb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" Dec 02 18:42:43 crc kubenswrapper[4878]: I1202 18:42:43.831861 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxqw4\" (UniqueName: \"kubernetes.io/projected/8097a01b-4fab-4bac-839d-a1f937120beb-kube-api-access-wxqw4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8\" (UID: \"8097a01b-4fab-4bac-839d-a1f937120beb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" Dec 02 18:42:43 crc kubenswrapper[4878]: I1202 18:42:43.831958 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8097a01b-4fab-4bac-839d-a1f937120beb-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8\" (UID: \"8097a01b-4fab-4bac-839d-a1f937120beb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" Dec 02 18:42:43 crc kubenswrapper[4878]: I1202 18:42:43.832043 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8097a01b-4fab-4bac-839d-a1f937120beb-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8\" (UID: \"8097a01b-4fab-4bac-839d-a1f937120beb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" Dec 02 18:42:43 crc kubenswrapper[4878]: I1202 18:42:43.832209 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8097a01b-4fab-4bac-839d-a1f937120beb-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8\" (UID: \"8097a01b-4fab-4bac-839d-a1f937120beb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" Dec 02 18:42:43 crc kubenswrapper[4878]: I1202 18:42:43.841212 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8097a01b-4fab-4bac-839d-a1f937120beb-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8\" (UID: \"8097a01b-4fab-4bac-839d-a1f937120beb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" Dec 02 18:42:43 crc kubenswrapper[4878]: I1202 18:42:43.841677 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8097a01b-4fab-4bac-839d-a1f937120beb-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8\" (UID: \"8097a01b-4fab-4bac-839d-a1f937120beb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" Dec 02 18:42:43 crc kubenswrapper[4878]: I1202 18:42:43.857544 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxqw4\" (UniqueName: \"kubernetes.io/projected/8097a01b-4fab-4bac-839d-a1f937120beb-kube-api-access-wxqw4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8\" (UID: \"8097a01b-4fab-4bac-839d-a1f937120beb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" Dec 02 18:42:43 crc 
kubenswrapper[4878]: I1202 18:42:43.858542 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8097a01b-4fab-4bac-839d-a1f937120beb-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8\" (UID: \"8097a01b-4fab-4bac-839d-a1f937120beb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" Dec 02 18:42:43 crc kubenswrapper[4878]: I1202 18:42:43.969764 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" Dec 02 18:42:43 crc kubenswrapper[4878]: I1202 18:42:43.988028 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7546bcc475-2zz2q" event={"ID":"a37933a9-8ddf-406e-9f40-b79fba21d5b5","Type":"ContainerStarted","Data":"55446bde98c618a9bc36f74c8a819120d9a423c1ecaa509762b6713da383449a"} Dec 02 18:42:46 crc kubenswrapper[4878]: W1202 18:42:46.009480 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8097a01b_4fab_4bac_839d_a1f937120beb.slice/crio-0286484f7c46a55aff2e133d6f26987d460496563613719d29673a7f8714b7ed WatchSource:0}: Error finding container 0286484f7c46a55aff2e133d6f26987d460496563613719d29673a7f8714b7ed: Status 404 returned error can't find the container with id 0286484f7c46a55aff2e133d6f26987d460496563613719d29673a7f8714b7ed Dec 02 18:42:46 crc kubenswrapper[4878]: I1202 18:42:46.013978 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8"] Dec 02 18:42:46 crc kubenswrapper[4878]: I1202 18:42:46.029622 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" 
event={"ID":"8097a01b-4fab-4bac-839d-a1f937120beb","Type":"ContainerStarted","Data":"0286484f7c46a55aff2e133d6f26987d460496563613719d29673a7f8714b7ed"} Dec 02 18:42:46 crc kubenswrapper[4878]: I1202 18:42:46.032390 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-f8bcd7564-vh2kf" event={"ID":"88be3860-e9da-4f2b-baff-142f994127c4","Type":"ContainerStarted","Data":"fa31912c966624de5036481d9617c7e7f5c5cb28769465c9d952f8008594ac04"} Dec 02 18:42:46 crc kubenswrapper[4878]: I1202 18:42:46.032600 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:46 crc kubenswrapper[4878]: I1202 18:42:46.034452 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7546bcc475-2zz2q" event={"ID":"a37933a9-8ddf-406e-9f40-b79fba21d5b5","Type":"ContainerStarted","Data":"6c8a8cc3c156d4a710ac6c5b7888b58336e3bdfe1a5a255278362bacc1faec3c"} Dec 02 18:42:46 crc kubenswrapper[4878]: I1202 18:42:46.035631 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:46 crc kubenswrapper[4878]: I1202 18:42:46.091071 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-f8bcd7564-vh2kf" podStartSLOduration=2.563633286 podStartE2EDuration="5.091044079s" podCreationTimestamp="2025-12-02 18:42:41 +0000 UTC" firstStartedPulling="2025-12-02 18:42:42.576871462 +0000 UTC m=+1672.266490343" lastFinishedPulling="2025-12-02 18:42:45.104282255 +0000 UTC m=+1674.793901136" observedRunningTime="2025-12-02 18:42:46.074677948 +0000 UTC m=+1675.764296839" watchObservedRunningTime="2025-12-02 18:42:46.091044079 +0000 UTC m=+1675.780662970" Dec 02 18:42:46 crc kubenswrapper[4878]: I1202 18:42:46.135361 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7546bcc475-2zz2q" podStartSLOduration=3.006888287 podStartE2EDuration="5.135331821s" 
podCreationTimestamp="2025-12-02 18:42:41 +0000 UTC" firstStartedPulling="2025-12-02 18:42:42.976995357 +0000 UTC m=+1672.666614238" lastFinishedPulling="2025-12-02 18:42:45.105438891 +0000 UTC m=+1674.795057772" observedRunningTime="2025-12-02 18:42:46.113803799 +0000 UTC m=+1675.803422680" watchObservedRunningTime="2025-12-02 18:42:46.135331821 +0000 UTC m=+1675.824950702" Dec 02 18:42:53 crc kubenswrapper[4878]: I1202 18:42:53.741965 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:42:53 crc kubenswrapper[4878]: I1202 18:42:53.742635 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:42:53 crc kubenswrapper[4878]: I1202 18:42:53.742698 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:42:53 crc kubenswrapper[4878]: I1202 18:42:53.743963 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f"} pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 18:42:53 crc kubenswrapper[4878]: I1202 18:42:53.744037 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" 
podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" gracePeriod=600 Dec 02 18:42:54 crc kubenswrapper[4878]: I1202 18:42:54.228499 4878 generic.go:334] "Generic (PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" exitCode=0 Dec 02 18:42:54 crc kubenswrapper[4878]: I1202 18:42:54.228726 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f"} Dec 02 18:42:54 crc kubenswrapper[4878]: I1202 18:42:54.228822 4878 scope.go:117] "RemoveContainer" containerID="f262fb1e8290073f98cb5506d7d41d0ed3eb91d64ad36acc8496dd8b9fa35544" Dec 02 18:42:54 crc kubenswrapper[4878]: I1202 18:42:54.651815 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-7546bcc475-2zz2q" Dec 02 18:42:54 crc kubenswrapper[4878]: I1202 18:42:54.661267 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-f8bcd7564-vh2kf" Dec 02 18:42:54 crc kubenswrapper[4878]: I1202 18:42:54.823149 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6b48666bb6-kckkf"] Dec 02 18:42:54 crc kubenswrapper[4878]: I1202 18:42:54.823395 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-6b48666bb6-kckkf" podUID="2469eb4b-9de3-45f0-bc72-0a8add16fa57" containerName="heat-cfnapi" containerID="cri-o://6ad7801a711888d5ee15012723c0a2cce3a65c90471945103d35dcd3131347bf" gracePeriod=60 Dec 02 18:42:54 crc kubenswrapper[4878]: I1202 18:42:54.847045 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/heat-api-6464b89f4f-f5xbb"] Dec 02 18:42:54 crc kubenswrapper[4878]: I1202 18:42:54.857424 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-6464b89f4f-f5xbb" podUID="a072f732-97bd-4297-923d-beea0ac36e2a" containerName="heat-api" containerID="cri-o://2e323d7b7c7f0e67c7e05473a00d49fa9b9b8415711d196da7bb1cf69bf9642e" gracePeriod=60 Dec 02 18:42:56 crc kubenswrapper[4878]: I1202 18:42:56.264394 4878 generic.go:334] "Generic (PLEG): container finished" podID="eedb789b-6bed-4a82-82c1-977a633ed304" containerID="d4de0d6fad654ab5f0827246c93c3efa182f052ba8d83c5e52763353bad93b4d" exitCode=0 Dec 02 18:42:56 crc kubenswrapper[4878]: I1202 18:42:56.264477 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"eedb789b-6bed-4a82-82c1-977a633ed304","Type":"ContainerDied","Data":"d4de0d6fad654ab5f0827246c93c3efa182f052ba8d83c5e52763353bad93b4d"} Dec 02 18:42:56 crc kubenswrapper[4878]: I1202 18:42:56.267815 4878 generic.go:334] "Generic (PLEG): container finished" podID="3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da" containerID="5816ce723c0e2ba26944dcb41e95a33e14a9b0732e101dcbf673de732b0974f1" exitCode=0 Dec 02 18:42:56 crc kubenswrapper[4878]: I1202 18:42:56.267855 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da","Type":"ContainerDied","Data":"5816ce723c0e2ba26944dcb41e95a33e14a9b0732e101dcbf673de732b0974f1"} Dec 02 18:42:57 crc kubenswrapper[4878]: I1202 18:42:57.055693 4878 scope.go:117] "RemoveContainer" containerID="bd37ed7af3c342730a3178bc613dab9f0723342a4e89cdeb62d8567806f3f9e2" Dec 02 18:42:58 crc kubenswrapper[4878]: I1202 18:42:58.218852 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-6464b89f4f-f5xbb" podUID="a072f732-97bd-4297-923d-beea0ac36e2a" containerName="heat-api" probeResult="failure" output="Get 
\"https://10.217.0.221:8004/healthcheck\": read tcp 10.217.0.2:42950->10.217.0.221:8004: read: connection reset by peer" Dec 02 18:42:58 crc kubenswrapper[4878]: I1202 18:42:58.300197 4878 generic.go:334] "Generic (PLEG): container finished" podID="a072f732-97bd-4297-923d-beea0ac36e2a" containerID="2e323d7b7c7f0e67c7e05473a00d49fa9b9b8415711d196da7bb1cf69bf9642e" exitCode=0 Dec 02 18:42:58 crc kubenswrapper[4878]: I1202 18:42:58.300260 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6464b89f4f-f5xbb" event={"ID":"a072f732-97bd-4297-923d-beea0ac36e2a","Type":"ContainerDied","Data":"2e323d7b7c7f0e67c7e05473a00d49fa9b9b8415711d196da7bb1cf69bf9642e"} Dec 02 18:42:58 crc kubenswrapper[4878]: I1202 18:42:58.335900 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-6b48666bb6-kckkf" podUID="2469eb4b-9de3-45f0-bc72-0a8add16fa57" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.222:8000/healthcheck\": read tcp 10.217.0.2:46316->10.217.0.222:8000: read: connection reset by peer" Dec 02 18:42:58 crc kubenswrapper[4878]: E1202 18:42:58.480505 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:42:58 crc kubenswrapper[4878]: I1202 18:42:58.642892 4878 scope.go:117] "RemoveContainer" containerID="2055b89dc9cbea43acc6ba991b937ee221b33c2ada58b5a3e0dc3d18f0d16827" Dec 02 18:42:58 crc kubenswrapper[4878]: I1202 18:42:58.719721 4878 scope.go:117] "RemoveContainer" containerID="3b563374a340d5afa7894020a9e8459ec6324fd14c0e17d0b2d8285115781d89" Dec 02 18:42:58 crc kubenswrapper[4878]: I1202 18:42:58.732410 4878 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 18:42:58 crc kubenswrapper[4878]: I1202 18:42:58.919599 4878 scope.go:117] "RemoveContainer" containerID="3ee110d7786b3994c541df382097cd29ed6bf3fbdb3b04c89a2e2c5b959c0266" Dec 02 18:42:58 crc kubenswrapper[4878]: I1202 18:42:58.996592 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.131441 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-config-data\") pod \"a072f732-97bd-4297-923d-beea0ac36e2a\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.131523 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-internal-tls-certs\") pod \"a072f732-97bd-4297-923d-beea0ac36e2a\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.131569 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-combined-ca-bundle\") pod \"a072f732-97bd-4297-923d-beea0ac36e2a\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.131748 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-config-data-custom\") pod \"a072f732-97bd-4297-923d-beea0ac36e2a\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.131874 4878 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-kcwhl\" (UniqueName: \"kubernetes.io/projected/a072f732-97bd-4297-923d-beea0ac36e2a-kube-api-access-kcwhl\") pod \"a072f732-97bd-4297-923d-beea0ac36e2a\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.131992 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-public-tls-certs\") pod \"a072f732-97bd-4297-923d-beea0ac36e2a\" (UID: \"a072f732-97bd-4297-923d-beea0ac36e2a\") " Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.145078 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a072f732-97bd-4297-923d-beea0ac36e2a-kube-api-access-kcwhl" (OuterVolumeSpecName: "kube-api-access-kcwhl") pod "a072f732-97bd-4297-923d-beea0ac36e2a" (UID: "a072f732-97bd-4297-923d-beea0ac36e2a"). InnerVolumeSpecName "kube-api-access-kcwhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.225449 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.233717 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a072f732-97bd-4297-923d-beea0ac36e2a" (UID: "a072f732-97bd-4297-923d-beea0ac36e2a"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.259061 4878 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.259106 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcwhl\" (UniqueName: \"kubernetes.io/projected/a072f732-97bd-4297-923d-beea0ac36e2a-kube-api-access-kcwhl\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.360250 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-combined-ca-bundle\") pod \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.360292 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-public-tls-certs\") pod \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.360324 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-config-data\") pod \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.360361 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-config-data-custom\") pod \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\" (UID: 
\"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.360568 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-internal-tls-certs\") pod \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.360615 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7btz\" (UniqueName: \"kubernetes.io/projected/2469eb4b-9de3-45f0-bc72-0a8add16fa57-kube-api-access-r7btz\") pod \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\" (UID: \"2469eb4b-9de3-45f0-bc72-0a8add16fa57\") " Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.369521 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2469eb4b-9de3-45f0-bc72-0a8add16fa57" (UID: "2469eb4b-9de3-45f0-bc72-0a8add16fa57"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.391585 4878 generic.go:334] "Generic (PLEG): container finished" podID="2469eb4b-9de3-45f0-bc72-0a8add16fa57" containerID="6ad7801a711888d5ee15012723c0a2cce3a65c90471945103d35dcd3131347bf" exitCode=0 Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.391735 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6b48666bb6-kckkf" event={"ID":"2469eb4b-9de3-45f0-bc72-0a8add16fa57","Type":"ContainerDied","Data":"6ad7801a711888d5ee15012723c0a2cce3a65c90471945103d35dcd3131347bf"} Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.391768 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6b48666bb6-kckkf" event={"ID":"2469eb4b-9de3-45f0-bc72-0a8add16fa57","Type":"ContainerDied","Data":"fc8a6b7d1febfe59b30b6c9d7260c522e3fec44976a20645f0cf62fe5bd839ad"} Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.391794 4878 scope.go:117] "RemoveContainer" containerID="6ad7801a711888d5ee15012723c0a2cce3a65c90471945103d35dcd3131347bf" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.391995 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6b48666bb6-kckkf" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.439658 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:42:59 crc kubenswrapper[4878]: E1202 18:42:59.439996 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.444434 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2469eb4b-9de3-45f0-bc72-0a8add16fa57-kube-api-access-r7btz" (OuterVolumeSpecName: "kube-api-access-r7btz") pod "2469eb4b-9de3-45f0-bc72-0a8add16fa57" (UID: "2469eb4b-9de3-45f0-bc72-0a8add16fa57"). InnerVolumeSpecName "kube-api-access-r7btz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.445215 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6464b89f4f-f5xbb" event={"ID":"a072f732-97bd-4297-923d-beea0ac36e2a","Type":"ContainerDied","Data":"26090dca63a9373d660ad6f688970c5f3d96be6aebb0212ec252dbea773255a3"} Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.445340 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6464b89f4f-f5xbb" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.473096 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7btz\" (UniqueName: \"kubernetes.io/projected/2469eb4b-9de3-45f0-bc72-0a8add16fa57-kube-api-access-r7btz\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.473124 4878 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.477462 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a072f732-97bd-4297-923d-beea0ac36e2a" (UID: "a072f732-97bd-4297-923d-beea0ac36e2a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.508308 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"eedb789b-6bed-4a82-82c1-977a633ed304","Type":"ContainerStarted","Data":"1482291506f86bc2a5ea156e184ae88a49eb914115bbf886a4fb97eb1c878d96"} Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.508665 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.534446 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da","Type":"ContainerStarted","Data":"9f76e01ac76c32f9824e27e5c603a1fe8b7db29ecf15957978e13a636efee446"} Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.534781 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.566202 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a072f732-97bd-4297-923d-beea0ac36e2a" (UID: "a072f732-97bd-4297-923d-beea0ac36e2a"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.577190 4878 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.577537 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.588681 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=48.588660304 podStartE2EDuration="48.588660304s" podCreationTimestamp="2025-12-02 18:42:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:42:59.560064864 +0000 UTC m=+1689.249683745" watchObservedRunningTime="2025-12-02 18:42:59.588660304 +0000 UTC m=+1689.278279185" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.602452 4878 scope.go:117] "RemoveContainer" containerID="6ad7801a711888d5ee15012723c0a2cce3a65c90471945103d35dcd3131347bf" Dec 02 18:42:59 crc kubenswrapper[4878]: E1202 18:42:59.603412 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ad7801a711888d5ee15012723c0a2cce3a65c90471945103d35dcd3131347bf\": container with ID starting with 6ad7801a711888d5ee15012723c0a2cce3a65c90471945103d35dcd3131347bf not found: ID does not exist" containerID="6ad7801a711888d5ee15012723c0a2cce3a65c90471945103d35dcd3131347bf" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.603445 4878 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6ad7801a711888d5ee15012723c0a2cce3a65c90471945103d35dcd3131347bf"} err="failed to get container status \"6ad7801a711888d5ee15012723c0a2cce3a65c90471945103d35dcd3131347bf\": rpc error: code = NotFound desc = could not find container \"6ad7801a711888d5ee15012723c0a2cce3a65c90471945103d35dcd3131347bf\": container with ID starting with 6ad7801a711888d5ee15012723c0a2cce3a65c90471945103d35dcd3131347bf not found: ID does not exist" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.603482 4878 scope.go:117] "RemoveContainer" containerID="2e323d7b7c7f0e67c7e05473a00d49fa9b9b8415711d196da7bb1cf69bf9642e" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.608728 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.608702966 podStartE2EDuration="39.608702966s" podCreationTimestamp="2025-12-02 18:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 18:42:59.587940541 +0000 UTC m=+1689.277559422" watchObservedRunningTime="2025-12-02 18:42:59.608702966 +0000 UTC m=+1689.298321847" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.654101 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a072f732-97bd-4297-923d-beea0ac36e2a" (UID: "a072f732-97bd-4297-923d-beea0ac36e2a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.681994 4878 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.692363 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-config-data" (OuterVolumeSpecName: "config-data") pod "a072f732-97bd-4297-923d-beea0ac36e2a" (UID: "a072f732-97bd-4297-923d-beea0ac36e2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.739956 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-config-data" (OuterVolumeSpecName: "config-data") pod "2469eb4b-9de3-45f0-bc72-0a8add16fa57" (UID: "2469eb4b-9de3-45f0-bc72-0a8add16fa57"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.747300 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2469eb4b-9de3-45f0-bc72-0a8add16fa57" (UID: "2469eb4b-9de3-45f0-bc72-0a8add16fa57"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.755432 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2469eb4b-9de3-45f0-bc72-0a8add16fa57" (UID: "2469eb4b-9de3-45f0-bc72-0a8add16fa57"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.776117 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2469eb4b-9de3-45f0-bc72-0a8add16fa57" (UID: "2469eb4b-9de3-45f0-bc72-0a8add16fa57"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.785053 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a072f732-97bd-4297-923d-beea0ac36e2a-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.785091 4878 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.785104 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.785112 4878 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.785121 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2469eb4b-9de3-45f0-bc72-0a8add16fa57-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.916022 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/heat-api-6464b89f4f-f5xbb"] Dec 02 18:42:59 crc kubenswrapper[4878]: I1202 18:42:59.929884 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-6464b89f4f-f5xbb"] Dec 02 18:43:00 crc kubenswrapper[4878]: I1202 18:43:00.041345 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6b48666bb6-kckkf"] Dec 02 18:43:00 crc kubenswrapper[4878]: I1202 18:43:00.059540 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6b48666bb6-kckkf"] Dec 02 18:43:00 crc kubenswrapper[4878]: I1202 18:43:00.551840 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" event={"ID":"8097a01b-4fab-4bac-839d-a1f937120beb","Type":"ContainerStarted","Data":"371068dc5e79d0575afd10a9e11ef1007d3895d3c6fe81c2a0563889ee8b8277"} Dec 02 18:43:00 crc kubenswrapper[4878]: I1202 18:43:00.578208 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" podStartSLOduration=4.869033957 podStartE2EDuration="17.57818246s" podCreationTimestamp="2025-12-02 18:42:43 +0000 UTC" firstStartedPulling="2025-12-02 18:42:46.01167756 +0000 UTC m=+1675.701296451" lastFinishedPulling="2025-12-02 18:42:58.720826073 +0000 UTC m=+1688.410444954" observedRunningTime="2025-12-02 18:43:00.569107047 +0000 UTC m=+1690.258725938" watchObservedRunningTime="2025-12-02 18:43:00.57818246 +0000 UTC m=+1690.267801341" Dec 02 18:43:00 crc kubenswrapper[4878]: I1202 18:43:00.961198 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2469eb4b-9de3-45f0-bc72-0a8add16fa57" path="/var/lib/kubelet/pods/2469eb4b-9de3-45f0-bc72-0a8add16fa57/volumes" Dec 02 18:43:00 crc kubenswrapper[4878]: I1202 18:43:00.961808 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a072f732-97bd-4297-923d-beea0ac36e2a" 
path="/var/lib/kubelet/pods/a072f732-97bd-4297-923d-beea0ac36e2a/volumes" Dec 02 18:43:01 crc kubenswrapper[4878]: I1202 18:43:01.915562 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-757798f686-n6x4p" Dec 02 18:43:01 crc kubenswrapper[4878]: I1202 18:43:01.977964 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-78bfbd4977-mcqqx"] Dec 02 18:43:01 crc kubenswrapper[4878]: I1202 18:43:01.978465 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-78bfbd4977-mcqqx" podUID="242dcc1d-40de-4f68-8d3f-3dbe4b813506" containerName="heat-engine" containerID="cri-o://4679fe50d42693310dc07fb5b64f2f0e635bbcc0d252859bd49012429ebd6319" gracePeriod=60 Dec 02 18:43:09 crc kubenswrapper[4878]: E1202 18:43:09.161708 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4679fe50d42693310dc07fb5b64f2f0e635bbcc0d252859bd49012429ebd6319" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 02 18:43:09 crc kubenswrapper[4878]: E1202 18:43:09.163820 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4679fe50d42693310dc07fb5b64f2f0e635bbcc0d252859bd49012429ebd6319" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 02 18:43:09 crc kubenswrapper[4878]: E1202 18:43:09.165663 4878 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4679fe50d42693310dc07fb5b64f2f0e635bbcc0d252859bd49012429ebd6319" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 02 18:43:09 crc kubenswrapper[4878]: E1202 
18:43:09.165724 4878 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-78bfbd4977-mcqqx" podUID="242dcc1d-40de-4f68-8d3f-3dbe4b813506" containerName="heat-engine" Dec 02 18:43:10 crc kubenswrapper[4878]: I1202 18:43:10.717303 4878 generic.go:334] "Generic (PLEG): container finished" podID="8097a01b-4fab-4bac-839d-a1f937120beb" containerID="371068dc5e79d0575afd10a9e11ef1007d3895d3c6fe81c2a0563889ee8b8277" exitCode=0 Dec 02 18:43:10 crc kubenswrapper[4878]: I1202 18:43:10.718610 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" event={"ID":"8097a01b-4fab-4bac-839d-a1f937120beb","Type":"ContainerDied","Data":"371068dc5e79d0575afd10a9e11ef1007d3895d3c6fe81c2a0563889ee8b8277"} Dec 02 18:43:10 crc kubenswrapper[4878]: I1202 18:43:10.955380 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 02 18:43:10 crc kubenswrapper[4878]: I1202 18:43:10.983443 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-dfqz2"] Dec 02 18:43:10 crc kubenswrapper[4878]: I1202 18:43:10.997257 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-dfqz2"] Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.238327 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-8jpmg"] Dec 02 18:43:11 crc kubenswrapper[4878]: E1202 18:43:11.239212 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2469eb4b-9de3-45f0-bc72-0a8add16fa57" containerName="heat-cfnapi" Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.239250 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="2469eb4b-9de3-45f0-bc72-0a8add16fa57" containerName="heat-cfnapi" Dec 02 18:43:11 crc kubenswrapper[4878]: E1202 
18:43:11.239291 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a072f732-97bd-4297-923d-beea0ac36e2a" containerName="heat-api" Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.239308 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="a072f732-97bd-4297-923d-beea0ac36e2a" containerName="heat-api" Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.239644 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="2469eb4b-9de3-45f0-bc72-0a8add16fa57" containerName="heat-cfnapi" Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.239686 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="a072f732-97bd-4297-923d-beea0ac36e2a" containerName="heat-api" Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.241078 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-8jpmg" Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.252047 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.268514 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-8jpmg"] Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.444597 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncm6v\" (UniqueName: \"kubernetes.io/projected/e658259f-368d-4571-a236-2c9bd3c3d9c6-kube-api-access-ncm6v\") pod \"aodh-db-sync-8jpmg\" (UID: \"e658259f-368d-4571-a236-2c9bd3c3d9c6\") " pod="openstack/aodh-db-sync-8jpmg" Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.444655 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e658259f-368d-4571-a236-2c9bd3c3d9c6-scripts\") pod \"aodh-db-sync-8jpmg\" (UID: \"e658259f-368d-4571-a236-2c9bd3c3d9c6\") " pod="openstack/aodh-db-sync-8jpmg" Dec 02 
18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.444694 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e658259f-368d-4571-a236-2c9bd3c3d9c6-config-data\") pod \"aodh-db-sync-8jpmg\" (UID: \"e658259f-368d-4571-a236-2c9bd3c3d9c6\") " pod="openstack/aodh-db-sync-8jpmg" Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.444759 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e658259f-368d-4571-a236-2c9bd3c3d9c6-combined-ca-bundle\") pod \"aodh-db-sync-8jpmg\" (UID: \"e658259f-368d-4571-a236-2c9bd3c3d9c6\") " pod="openstack/aodh-db-sync-8jpmg" Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.546804 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e658259f-368d-4571-a236-2c9bd3c3d9c6-combined-ca-bundle\") pod \"aodh-db-sync-8jpmg\" (UID: \"e658259f-368d-4571-a236-2c9bd3c3d9c6\") " pod="openstack/aodh-db-sync-8jpmg" Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.547424 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncm6v\" (UniqueName: \"kubernetes.io/projected/e658259f-368d-4571-a236-2c9bd3c3d9c6-kube-api-access-ncm6v\") pod \"aodh-db-sync-8jpmg\" (UID: \"e658259f-368d-4571-a236-2c9bd3c3d9c6\") " pod="openstack/aodh-db-sync-8jpmg" Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.547481 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e658259f-368d-4571-a236-2c9bd3c3d9c6-scripts\") pod \"aodh-db-sync-8jpmg\" (UID: \"e658259f-368d-4571-a236-2c9bd3c3d9c6\") " pod="openstack/aodh-db-sync-8jpmg" Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.547539 4878 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e658259f-368d-4571-a236-2c9bd3c3d9c6-config-data\") pod \"aodh-db-sync-8jpmg\" (UID: \"e658259f-368d-4571-a236-2c9bd3c3d9c6\") " pod="openstack/aodh-db-sync-8jpmg" Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.553824 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e658259f-368d-4571-a236-2c9bd3c3d9c6-combined-ca-bundle\") pod \"aodh-db-sync-8jpmg\" (UID: \"e658259f-368d-4571-a236-2c9bd3c3d9c6\") " pod="openstack/aodh-db-sync-8jpmg" Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.554273 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e658259f-368d-4571-a236-2c9bd3c3d9c6-config-data\") pod \"aodh-db-sync-8jpmg\" (UID: \"e658259f-368d-4571-a236-2c9bd3c3d9c6\") " pod="openstack/aodh-db-sync-8jpmg" Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.557809 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e658259f-368d-4571-a236-2c9bd3c3d9c6-scripts\") pod \"aodh-db-sync-8jpmg\" (UID: \"e658259f-368d-4571-a236-2c9bd3c3d9c6\") " pod="openstack/aodh-db-sync-8jpmg" Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.584371 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncm6v\" (UniqueName: \"kubernetes.io/projected/e658259f-368d-4571-a236-2c9bd3c3d9c6-kube-api-access-ncm6v\") pod \"aodh-db-sync-8jpmg\" (UID: \"e658259f-368d-4571-a236-2c9bd3c3d9c6\") " pod="openstack/aodh-db-sync-8jpmg" Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.599577 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-8jpmg" Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.878806 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 02 18:43:11 crc kubenswrapper[4878]: I1202 18:43:11.940251 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:43:11 crc kubenswrapper[4878]: E1202 18:43:11.940484 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.457908 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.622779 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8097a01b-4fab-4bac-839d-a1f937120beb-repo-setup-combined-ca-bundle\") pod \"8097a01b-4fab-4bac-839d-a1f937120beb\" (UID: \"8097a01b-4fab-4bac-839d-a1f937120beb\") " Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.622874 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8097a01b-4fab-4bac-839d-a1f937120beb-ssh-key\") pod \"8097a01b-4fab-4bac-839d-a1f937120beb\" (UID: \"8097a01b-4fab-4bac-839d-a1f937120beb\") " Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.622945 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxqw4\" (UniqueName: \"kubernetes.io/projected/8097a01b-4fab-4bac-839d-a1f937120beb-kube-api-access-wxqw4\") pod \"8097a01b-4fab-4bac-839d-a1f937120beb\" (UID: \"8097a01b-4fab-4bac-839d-a1f937120beb\") " Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.623041 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8097a01b-4fab-4bac-839d-a1f937120beb-inventory\") pod \"8097a01b-4fab-4bac-839d-a1f937120beb\" (UID: \"8097a01b-4fab-4bac-839d-a1f937120beb\") " Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.641112 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8097a01b-4fab-4bac-839d-a1f937120beb-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "8097a01b-4fab-4bac-839d-a1f937120beb" (UID: "8097a01b-4fab-4bac-839d-a1f937120beb"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.641363 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8097a01b-4fab-4bac-839d-a1f937120beb-kube-api-access-wxqw4" (OuterVolumeSpecName: "kube-api-access-wxqw4") pod "8097a01b-4fab-4bac-839d-a1f937120beb" (UID: "8097a01b-4fab-4bac-839d-a1f937120beb"). InnerVolumeSpecName "kube-api-access-wxqw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.687339 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8097a01b-4fab-4bac-839d-a1f937120beb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8097a01b-4fab-4bac-839d-a1f937120beb" (UID: "8097a01b-4fab-4bac-839d-a1f937120beb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.690489 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8097a01b-4fab-4bac-839d-a1f937120beb-inventory" (OuterVolumeSpecName: "inventory") pod "8097a01b-4fab-4bac-839d-a1f937120beb" (UID: "8097a01b-4fab-4bac-839d-a1f937120beb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:43:12 crc kubenswrapper[4878]: W1202 18:43:12.704931 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode658259f_368d_4571_a236_2c9bd3c3d9c6.slice/crio-ec28629f7a5d8ff3545d688a3c2b2e10d3ba5d9a11d39bf76f02e6bd6f193af8 WatchSource:0}: Error finding container ec28629f7a5d8ff3545d688a3c2b2e10d3ba5d9a11d39bf76f02e6bd6f193af8: Status 404 returned error can't find the container with id ec28629f7a5d8ff3545d688a3c2b2e10d3ba5d9a11d39bf76f02e6bd6f193af8 Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.721781 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-8jpmg"] Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.728781 4878 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8097a01b-4fab-4bac-839d-a1f937120beb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.728817 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxqw4\" (UniqueName: \"kubernetes.io/projected/8097a01b-4fab-4bac-839d-a1f937120beb-kube-api-access-wxqw4\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.728831 4878 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8097a01b-4fab-4bac-839d-a1f937120beb-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.728869 4878 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8097a01b-4fab-4bac-839d-a1f937120beb-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.752974 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8jpmg" 
event={"ID":"e658259f-368d-4571-a236-2c9bd3c3d9c6","Type":"ContainerStarted","Data":"ec28629f7a5d8ff3545d688a3c2b2e10d3ba5d9a11d39bf76f02e6bd6f193af8"} Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.755288 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" event={"ID":"8097a01b-4fab-4bac-839d-a1f937120beb","Type":"ContainerDied","Data":"0286484f7c46a55aff2e133d6f26987d460496563613719d29673a7f8714b7ed"} Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.755402 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0286484f7c46a55aff2e133d6f26987d460496563613719d29673a7f8714b7ed" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.755526 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.831323 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n"] Dec 02 18:43:12 crc kubenswrapper[4878]: E1202 18:43:12.832264 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8097a01b-4fab-4bac-839d-a1f937120beb" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.832286 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8097a01b-4fab-4bac-839d-a1f937120beb" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.832654 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="8097a01b-4fab-4bac-839d-a1f937120beb" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.833845 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.841746 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.842404 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.842411 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szvc8" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.842521 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.890876 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n"] Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.937909 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c58063d-9ff4-43dd-9dec-17d14a541013-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgm9n\" (UID: \"4c58063d-9ff4-43dd-9dec-17d14a541013\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.938032 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c58063d-9ff4-43dd-9dec-17d14a541013-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgm9n\" (UID: \"4c58063d-9ff4-43dd-9dec-17d14a541013\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.939035 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpvj2\" (UniqueName: \"kubernetes.io/projected/4c58063d-9ff4-43dd-9dec-17d14a541013-kube-api-access-vpvj2\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgm9n\" (UID: \"4c58063d-9ff4-43dd-9dec-17d14a541013\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n" Dec 02 18:43:12 crc kubenswrapper[4878]: I1202 18:43:12.962756 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e54a432-26a0-46eb-b4b2-8f9cc141f4ff" path="/var/lib/kubelet/pods/7e54a432-26a0-46eb-b4b2-8f9cc141f4ff/volumes" Dec 02 18:43:13 crc kubenswrapper[4878]: I1202 18:43:13.057036 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c58063d-9ff4-43dd-9dec-17d14a541013-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgm9n\" (UID: \"4c58063d-9ff4-43dd-9dec-17d14a541013\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n" Dec 02 18:43:13 crc kubenswrapper[4878]: I1202 18:43:13.057371 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c58063d-9ff4-43dd-9dec-17d14a541013-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgm9n\" (UID: \"4c58063d-9ff4-43dd-9dec-17d14a541013\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n" Dec 02 18:43:13 crc kubenswrapper[4878]: I1202 18:43:13.057473 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpvj2\" (UniqueName: \"kubernetes.io/projected/4c58063d-9ff4-43dd-9dec-17d14a541013-kube-api-access-vpvj2\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgm9n\" (UID: \"4c58063d-9ff4-43dd-9dec-17d14a541013\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n" Dec 02 18:43:13 crc kubenswrapper[4878]: I1202 18:43:13.075677 4878 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c58063d-9ff4-43dd-9dec-17d14a541013-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgm9n\" (UID: \"4c58063d-9ff4-43dd-9dec-17d14a541013\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n" Dec 02 18:43:13 crc kubenswrapper[4878]: I1202 18:43:13.076327 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c58063d-9ff4-43dd-9dec-17d14a541013-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgm9n\" (UID: \"4c58063d-9ff4-43dd-9dec-17d14a541013\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n" Dec 02 18:43:13 crc kubenswrapper[4878]: I1202 18:43:13.091353 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpvj2\" (UniqueName: \"kubernetes.io/projected/4c58063d-9ff4-43dd-9dec-17d14a541013-kube-api-access-vpvj2\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgm9n\" (UID: \"4c58063d-9ff4-43dd-9dec-17d14a541013\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n" Dec 02 18:43:13 crc kubenswrapper[4878]: I1202 18:43:13.165349 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n" Dec 02 18:43:13 crc kubenswrapper[4878]: I1202 18:43:13.870954 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n"] Dec 02 18:43:14 crc kubenswrapper[4878]: I1202 18:43:14.798631 4878 generic.go:334] "Generic (PLEG): container finished" podID="242dcc1d-40de-4f68-8d3f-3dbe4b813506" containerID="4679fe50d42693310dc07fb5b64f2f0e635bbcc0d252859bd49012429ebd6319" exitCode=0 Dec 02 18:43:14 crc kubenswrapper[4878]: I1202 18:43:14.798854 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-78bfbd4977-mcqqx" event={"ID":"242dcc1d-40de-4f68-8d3f-3dbe4b813506","Type":"ContainerDied","Data":"4679fe50d42693310dc07fb5b64f2f0e635bbcc0d252859bd49012429ebd6319"} Dec 02 18:43:14 crc kubenswrapper[4878]: I1202 18:43:14.798912 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-78bfbd4977-mcqqx" event={"ID":"242dcc1d-40de-4f68-8d3f-3dbe4b813506","Type":"ContainerDied","Data":"045c4f2230e46d222127c554ff566b5386a7f0451399a0fb5520dbc2517c6497"} Dec 02 18:43:14 crc kubenswrapper[4878]: I1202 18:43:14.798930 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="045c4f2230e46d222127c554ff566b5386a7f0451399a0fb5520dbc2517c6497" Dec 02 18:43:14 crc kubenswrapper[4878]: I1202 18:43:14.801456 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n" event={"ID":"4c58063d-9ff4-43dd-9dec-17d14a541013","Type":"ContainerStarted","Data":"55279706e3f11530edc4b8d4b56742040ca11eafc432a376770984990fb391bb"} Dec 02 18:43:14 crc kubenswrapper[4878]: I1202 18:43:14.801497 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n" 
event={"ID":"4c58063d-9ff4-43dd-9dec-17d14a541013","Type":"ContainerStarted","Data":"fe392cfe6bcc713bb2c243e257014faab71952c4dfc3764ba99f2ef25493b0fa"} Dec 02 18:43:14 crc kubenswrapper[4878]: I1202 18:43:14.830374 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n" podStartSLOduration=2.385844206 podStartE2EDuration="2.830349687s" podCreationTimestamp="2025-12-02 18:43:12 +0000 UTC" firstStartedPulling="2025-12-02 18:43:13.886628145 +0000 UTC m=+1703.576247026" lastFinishedPulling="2025-12-02 18:43:14.331133626 +0000 UTC m=+1704.020752507" observedRunningTime="2025-12-02 18:43:14.823134823 +0000 UTC m=+1704.512753704" watchObservedRunningTime="2025-12-02 18:43:14.830349687 +0000 UTC m=+1704.519968568" Dec 02 18:43:14 crc kubenswrapper[4878]: I1202 18:43:14.911741 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-78bfbd4977-mcqqx" Dec 02 18:43:15 crc kubenswrapper[4878]: I1202 18:43:15.015571 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/242dcc1d-40de-4f68-8d3f-3dbe4b813506-config-data-custom\") pod \"242dcc1d-40de-4f68-8d3f-3dbe4b813506\" (UID: \"242dcc1d-40de-4f68-8d3f-3dbe4b813506\") " Dec 02 18:43:15 crc kubenswrapper[4878]: I1202 18:43:15.017450 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/242dcc1d-40de-4f68-8d3f-3dbe4b813506-combined-ca-bundle\") pod \"242dcc1d-40de-4f68-8d3f-3dbe4b813506\" (UID: \"242dcc1d-40de-4f68-8d3f-3dbe4b813506\") " Dec 02 18:43:15 crc kubenswrapper[4878]: I1202 18:43:15.049403 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/242dcc1d-40de-4f68-8d3f-3dbe4b813506-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod 
"242dcc1d-40de-4f68-8d3f-3dbe4b813506" (UID: "242dcc1d-40de-4f68-8d3f-3dbe4b813506"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:43:15 crc kubenswrapper[4878]: I1202 18:43:15.079777 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/242dcc1d-40de-4f68-8d3f-3dbe4b813506-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "242dcc1d-40de-4f68-8d3f-3dbe4b813506" (UID: "242dcc1d-40de-4f68-8d3f-3dbe4b813506"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:43:15 crc kubenswrapper[4878]: I1202 18:43:15.120005 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/242dcc1d-40de-4f68-8d3f-3dbe4b813506-config-data\") pod \"242dcc1d-40de-4f68-8d3f-3dbe4b813506\" (UID: \"242dcc1d-40de-4f68-8d3f-3dbe4b813506\") " Dec 02 18:43:15 crc kubenswrapper[4878]: I1202 18:43:15.120058 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxfc4\" (UniqueName: \"kubernetes.io/projected/242dcc1d-40de-4f68-8d3f-3dbe4b813506-kube-api-access-qxfc4\") pod \"242dcc1d-40de-4f68-8d3f-3dbe4b813506\" (UID: \"242dcc1d-40de-4f68-8d3f-3dbe4b813506\") " Dec 02 18:43:15 crc kubenswrapper[4878]: I1202 18:43:15.121149 4878 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/242dcc1d-40de-4f68-8d3f-3dbe4b813506-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:15 crc kubenswrapper[4878]: I1202 18:43:15.121195 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/242dcc1d-40de-4f68-8d3f-3dbe4b813506-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:15 crc kubenswrapper[4878]: I1202 18:43:15.124857 4878 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/242dcc1d-40de-4f68-8d3f-3dbe4b813506-kube-api-access-qxfc4" (OuterVolumeSpecName: "kube-api-access-qxfc4") pod "242dcc1d-40de-4f68-8d3f-3dbe4b813506" (UID: "242dcc1d-40de-4f68-8d3f-3dbe4b813506"). InnerVolumeSpecName "kube-api-access-qxfc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:43:15 crc kubenswrapper[4878]: I1202 18:43:15.191587 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/242dcc1d-40de-4f68-8d3f-3dbe4b813506-config-data" (OuterVolumeSpecName: "config-data") pod "242dcc1d-40de-4f68-8d3f-3dbe4b813506" (UID: "242dcc1d-40de-4f68-8d3f-3dbe4b813506"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:43:15 crc kubenswrapper[4878]: I1202 18:43:15.223676 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/242dcc1d-40de-4f68-8d3f-3dbe4b813506-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:15 crc kubenswrapper[4878]: I1202 18:43:15.223715 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxfc4\" (UniqueName: \"kubernetes.io/projected/242dcc1d-40de-4f68-8d3f-3dbe4b813506-kube-api-access-qxfc4\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:15 crc kubenswrapper[4878]: I1202 18:43:15.832264 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-78bfbd4977-mcqqx" Dec 02 18:43:15 crc kubenswrapper[4878]: I1202 18:43:15.876422 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-78bfbd4977-mcqqx"] Dec 02 18:43:15 crc kubenswrapper[4878]: I1202 18:43:15.894127 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-78bfbd4977-mcqqx"] Dec 02 18:43:16 crc kubenswrapper[4878]: I1202 18:43:16.954951 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="242dcc1d-40de-4f68-8d3f-3dbe4b813506" path="/var/lib/kubelet/pods/242dcc1d-40de-4f68-8d3f-3dbe4b813506/volumes" Dec 02 18:43:17 crc kubenswrapper[4878]: I1202 18:43:17.871881 4878 generic.go:334] "Generic (PLEG): container finished" podID="4c58063d-9ff4-43dd-9dec-17d14a541013" containerID="55279706e3f11530edc4b8d4b56742040ca11eafc432a376770984990fb391bb" exitCode=0 Dec 02 18:43:17 crc kubenswrapper[4878]: I1202 18:43:17.871968 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n" event={"ID":"4c58063d-9ff4-43dd-9dec-17d14a541013","Type":"ContainerDied","Data":"55279706e3f11530edc4b8d4b56742040ca11eafc432a376770984990fb391bb"} Dec 02 18:43:18 crc kubenswrapper[4878]: I1202 18:43:18.890333 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8jpmg" event={"ID":"e658259f-368d-4571-a236-2c9bd3c3d9c6","Type":"ContainerStarted","Data":"94a15ab3becf2b9eded108dfd0e7e355cceb6b5b42271f8118e7a77886aaa08e"} Dec 02 18:43:18 crc kubenswrapper[4878]: I1202 18:43:18.922269 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-8jpmg" podStartSLOduration=2.437407899 podStartE2EDuration="7.922247729s" podCreationTimestamp="2025-12-02 18:43:11 +0000 UTC" firstStartedPulling="2025-12-02 18:43:12.707561881 +0000 UTC m=+1702.397180762" lastFinishedPulling="2025-12-02 18:43:18.192401711 +0000 UTC m=+1707.882020592" 
observedRunningTime="2025-12-02 18:43:18.912568957 +0000 UTC m=+1708.602187838" watchObservedRunningTime="2025-12-02 18:43:18.922247729 +0000 UTC m=+1708.611866610" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.445331 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.582011 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpvj2\" (UniqueName: \"kubernetes.io/projected/4c58063d-9ff4-43dd-9dec-17d14a541013-kube-api-access-vpvj2\") pod \"4c58063d-9ff4-43dd-9dec-17d14a541013\" (UID: \"4c58063d-9ff4-43dd-9dec-17d14a541013\") " Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.582461 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c58063d-9ff4-43dd-9dec-17d14a541013-ssh-key\") pod \"4c58063d-9ff4-43dd-9dec-17d14a541013\" (UID: \"4c58063d-9ff4-43dd-9dec-17d14a541013\") " Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.582688 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c58063d-9ff4-43dd-9dec-17d14a541013-inventory\") pod \"4c58063d-9ff4-43dd-9dec-17d14a541013\" (UID: \"4c58063d-9ff4-43dd-9dec-17d14a541013\") " Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.594717 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c58063d-9ff4-43dd-9dec-17d14a541013-kube-api-access-vpvj2" (OuterVolumeSpecName: "kube-api-access-vpvj2") pod "4c58063d-9ff4-43dd-9dec-17d14a541013" (UID: "4c58063d-9ff4-43dd-9dec-17d14a541013"). InnerVolumeSpecName "kube-api-access-vpvj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.637209 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c58063d-9ff4-43dd-9dec-17d14a541013-inventory" (OuterVolumeSpecName: "inventory") pod "4c58063d-9ff4-43dd-9dec-17d14a541013" (UID: "4c58063d-9ff4-43dd-9dec-17d14a541013"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.645549 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c58063d-9ff4-43dd-9dec-17d14a541013-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4c58063d-9ff4-43dd-9dec-17d14a541013" (UID: "4c58063d-9ff4-43dd-9dec-17d14a541013"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.686755 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpvj2\" (UniqueName: \"kubernetes.io/projected/4c58063d-9ff4-43dd-9dec-17d14a541013-kube-api-access-vpvj2\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.686808 4878 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c58063d-9ff4-43dd-9dec-17d14a541013-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.686821 4878 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c58063d-9ff4-43dd-9dec-17d14a541013-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.903704 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.904670 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgm9n" event={"ID":"4c58063d-9ff4-43dd-9dec-17d14a541013","Type":"ContainerDied","Data":"fe392cfe6bcc713bb2c243e257014faab71952c4dfc3764ba99f2ef25493b0fa"} Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.904991 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe392cfe6bcc713bb2c243e257014faab71952c4dfc3764ba99f2ef25493b0fa" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.983929 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj"] Dec 02 18:43:19 crc kubenswrapper[4878]: E1202 18:43:19.984936 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c58063d-9ff4-43dd-9dec-17d14a541013" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.984964 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c58063d-9ff4-43dd-9dec-17d14a541013" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 02 18:43:19 crc kubenswrapper[4878]: E1202 18:43:19.984981 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="242dcc1d-40de-4f68-8d3f-3dbe4b813506" containerName="heat-engine" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.984988 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="242dcc1d-40de-4f68-8d3f-3dbe4b813506" containerName="heat-engine" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.985335 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c58063d-9ff4-43dd-9dec-17d14a541013" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.985364 4878 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="242dcc1d-40de-4f68-8d3f-3dbe4b813506" containerName="heat-engine" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.986374 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.989452 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.989695 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szvc8" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.989899 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.990107 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.997977 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52939763-97c2-42f6-9aa4-56e99153e87f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj\" (UID: \"52939763-97c2-42f6-9aa4-56e99153e87f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.998109 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j756s\" (UniqueName: \"kubernetes.io/projected/52939763-97c2-42f6-9aa4-56e99153e87f-kube-api-access-j756s\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj\" (UID: \"52939763-97c2-42f6-9aa4-56e99153e87f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 
18:43:19.998297 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52939763-97c2-42f6-9aa4-56e99153e87f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj\" (UID: \"52939763-97c2-42f6-9aa4-56e99153e87f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" Dec 02 18:43:19 crc kubenswrapper[4878]: I1202 18:43:19.998885 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52939763-97c2-42f6-9aa4-56e99153e87f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj\" (UID: \"52939763-97c2-42f6-9aa4-56e99153e87f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" Dec 02 18:43:20 crc kubenswrapper[4878]: I1202 18:43:20.000222 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj"] Dec 02 18:43:20 crc kubenswrapper[4878]: I1202 18:43:20.100539 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52939763-97c2-42f6-9aa4-56e99153e87f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj\" (UID: \"52939763-97c2-42f6-9aa4-56e99153e87f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" Dec 02 18:43:20 crc kubenswrapper[4878]: I1202 18:43:20.100624 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52939763-97c2-42f6-9aa4-56e99153e87f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj\" (UID: \"52939763-97c2-42f6-9aa4-56e99153e87f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" Dec 02 18:43:20 crc kubenswrapper[4878]: I1202 18:43:20.100670 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j756s\" (UniqueName: \"kubernetes.io/projected/52939763-97c2-42f6-9aa4-56e99153e87f-kube-api-access-j756s\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj\" (UID: \"52939763-97c2-42f6-9aa4-56e99153e87f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" Dec 02 18:43:20 crc kubenswrapper[4878]: I1202 18:43:20.100714 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52939763-97c2-42f6-9aa4-56e99153e87f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj\" (UID: \"52939763-97c2-42f6-9aa4-56e99153e87f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" Dec 02 18:43:20 crc kubenswrapper[4878]: I1202 18:43:20.106011 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52939763-97c2-42f6-9aa4-56e99153e87f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj\" (UID: \"52939763-97c2-42f6-9aa4-56e99153e87f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" Dec 02 18:43:20 crc kubenswrapper[4878]: I1202 18:43:20.106512 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52939763-97c2-42f6-9aa4-56e99153e87f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj\" (UID: \"52939763-97c2-42f6-9aa4-56e99153e87f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" Dec 02 18:43:20 crc kubenswrapper[4878]: I1202 18:43:20.118945 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52939763-97c2-42f6-9aa4-56e99153e87f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj\" (UID: 
\"52939763-97c2-42f6-9aa4-56e99153e87f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" Dec 02 18:43:20 crc kubenswrapper[4878]: I1202 18:43:20.122078 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j756s\" (UniqueName: \"kubernetes.io/projected/52939763-97c2-42f6-9aa4-56e99153e87f-kube-api-access-j756s\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj\" (UID: \"52939763-97c2-42f6-9aa4-56e99153e87f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" Dec 02 18:43:20 crc kubenswrapper[4878]: I1202 18:43:20.310014 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" Dec 02 18:43:20 crc kubenswrapper[4878]: W1202 18:43:20.979134 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52939763_97c2_42f6_9aa4_56e99153e87f.slice/crio-d41d3310ff72494ad3e0318920806d3cbac55acf18ce15f13af78793e6d09ae8 WatchSource:0}: Error finding container d41d3310ff72494ad3e0318920806d3cbac55acf18ce15f13af78793e6d09ae8: Status 404 returned error can't find the container with id d41d3310ff72494ad3e0318920806d3cbac55acf18ce15f13af78793e6d09ae8 Dec 02 18:43:20 crc kubenswrapper[4878]: I1202 18:43:20.982586 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj"] Dec 02 18:43:21 crc kubenswrapper[4878]: I1202 18:43:21.946115 4878 generic.go:334] "Generic (PLEG): container finished" podID="e658259f-368d-4571-a236-2c9bd3c3d9c6" containerID="94a15ab3becf2b9eded108dfd0e7e355cceb6b5b42271f8118e7a77886aaa08e" exitCode=0 Dec 02 18:43:21 crc kubenswrapper[4878]: I1202 18:43:21.946214 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8jpmg" 
event={"ID":"e658259f-368d-4571-a236-2c9bd3c3d9c6","Type":"ContainerDied","Data":"94a15ab3becf2b9eded108dfd0e7e355cceb6b5b42271f8118e7a77886aaa08e"} Dec 02 18:43:21 crc kubenswrapper[4878]: I1202 18:43:21.948458 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" event={"ID":"52939763-97c2-42f6-9aa4-56e99153e87f","Type":"ContainerStarted","Data":"8cc5ff5bbd93908887375b06ecfebdee81dffc0206e3fe0ca9647613e27f1cbc"} Dec 02 18:43:21 crc kubenswrapper[4878]: I1202 18:43:21.948503 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" event={"ID":"52939763-97c2-42f6-9aa4-56e99153e87f","Type":"ContainerStarted","Data":"d41d3310ff72494ad3e0318920806d3cbac55acf18ce15f13af78793e6d09ae8"} Dec 02 18:43:22 crc kubenswrapper[4878]: I1202 18:43:22.016188 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" podStartSLOduration=2.611986995 podStartE2EDuration="3.016159569s" podCreationTimestamp="2025-12-02 18:43:19 +0000 UTC" firstStartedPulling="2025-12-02 18:43:20.983049587 +0000 UTC m=+1710.672668468" lastFinishedPulling="2025-12-02 18:43:21.387222121 +0000 UTC m=+1711.076841042" observedRunningTime="2025-12-02 18:43:21.984651079 +0000 UTC m=+1711.674269980" watchObservedRunningTime="2025-12-02 18:43:22.016159569 +0000 UTC m=+1711.705778450" Dec 02 18:43:23 crc kubenswrapper[4878]: I1202 18:43:23.452644 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-8jpmg" Dec 02 18:43:23 crc kubenswrapper[4878]: I1202 18:43:23.602822 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e658259f-368d-4571-a236-2c9bd3c3d9c6-config-data\") pod \"e658259f-368d-4571-a236-2c9bd3c3d9c6\" (UID: \"e658259f-368d-4571-a236-2c9bd3c3d9c6\") " Dec 02 18:43:23 crc kubenswrapper[4878]: I1202 18:43:23.603050 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e658259f-368d-4571-a236-2c9bd3c3d9c6-scripts\") pod \"e658259f-368d-4571-a236-2c9bd3c3d9c6\" (UID: \"e658259f-368d-4571-a236-2c9bd3c3d9c6\") " Dec 02 18:43:23 crc kubenswrapper[4878]: I1202 18:43:23.603128 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e658259f-368d-4571-a236-2c9bd3c3d9c6-combined-ca-bundle\") pod \"e658259f-368d-4571-a236-2c9bd3c3d9c6\" (UID: \"e658259f-368d-4571-a236-2c9bd3c3d9c6\") " Dec 02 18:43:23 crc kubenswrapper[4878]: I1202 18:43:23.603372 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncm6v\" (UniqueName: \"kubernetes.io/projected/e658259f-368d-4571-a236-2c9bd3c3d9c6-kube-api-access-ncm6v\") pod \"e658259f-368d-4571-a236-2c9bd3c3d9c6\" (UID: \"e658259f-368d-4571-a236-2c9bd3c3d9c6\") " Dec 02 18:43:23 crc kubenswrapper[4878]: I1202 18:43:23.610603 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e658259f-368d-4571-a236-2c9bd3c3d9c6-kube-api-access-ncm6v" (OuterVolumeSpecName: "kube-api-access-ncm6v") pod "e658259f-368d-4571-a236-2c9bd3c3d9c6" (UID: "e658259f-368d-4571-a236-2c9bd3c3d9c6"). InnerVolumeSpecName "kube-api-access-ncm6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:43:23 crc kubenswrapper[4878]: I1202 18:43:23.611631 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e658259f-368d-4571-a236-2c9bd3c3d9c6-scripts" (OuterVolumeSpecName: "scripts") pod "e658259f-368d-4571-a236-2c9bd3c3d9c6" (UID: "e658259f-368d-4571-a236-2c9bd3c3d9c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:43:23 crc kubenswrapper[4878]: I1202 18:43:23.657669 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e658259f-368d-4571-a236-2c9bd3c3d9c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e658259f-368d-4571-a236-2c9bd3c3d9c6" (UID: "e658259f-368d-4571-a236-2c9bd3c3d9c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:43:23 crc kubenswrapper[4878]: I1202 18:43:23.659508 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e658259f-368d-4571-a236-2c9bd3c3d9c6-config-data" (OuterVolumeSpecName: "config-data") pod "e658259f-368d-4571-a236-2c9bd3c3d9c6" (UID: "e658259f-368d-4571-a236-2c9bd3c3d9c6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:43:23 crc kubenswrapper[4878]: I1202 18:43:23.706571 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncm6v\" (UniqueName: \"kubernetes.io/projected/e658259f-368d-4571-a236-2c9bd3c3d9c6-kube-api-access-ncm6v\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:23 crc kubenswrapper[4878]: I1202 18:43:23.706598 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e658259f-368d-4571-a236-2c9bd3c3d9c6-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:23 crc kubenswrapper[4878]: I1202 18:43:23.706608 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e658259f-368d-4571-a236-2c9bd3c3d9c6-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:23 crc kubenswrapper[4878]: I1202 18:43:23.706617 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e658259f-368d-4571-a236-2c9bd3c3d9c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:23 crc kubenswrapper[4878]: I1202 18:43:23.979277 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8jpmg" event={"ID":"e658259f-368d-4571-a236-2c9bd3c3d9c6","Type":"ContainerDied","Data":"ec28629f7a5d8ff3545d688a3c2b2e10d3ba5d9a11d39bf76f02e6bd6f193af8"} Dec 02 18:43:23 crc kubenswrapper[4878]: I1202 18:43:23.979339 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec28629f7a5d8ff3545d688a3c2b2e10d3ba5d9a11d39bf76f02e6bd6f193af8" Dec 02 18:43:23 crc kubenswrapper[4878]: I1202 18:43:23.979382 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-8jpmg" Dec 02 18:43:24 crc kubenswrapper[4878]: I1202 18:43:24.937527 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:43:24 crc kubenswrapper[4878]: E1202 18:43:24.937980 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:43:26 crc kubenswrapper[4878]: I1202 18:43:26.227571 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 02 18:43:26 crc kubenswrapper[4878]: I1202 18:43:26.228336 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ea2cae47-1c1a-408f-b391-3641eae02402" containerName="aodh-api" containerID="cri-o://eb3c639a21dbe6bd8568cabf49415e365cb5dbcd4fbbbc27cb68186ba16f1fc3" gracePeriod=30 Dec 02 18:43:26 crc kubenswrapper[4878]: I1202 18:43:26.228454 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ea2cae47-1c1a-408f-b391-3641eae02402" containerName="aodh-listener" containerID="cri-o://b8114f01b1b2e1e08f3a2b037fb46c96a55a24484d3bdd39991c68eef518f8db" gracePeriod=30 Dec 02 18:43:26 crc kubenswrapper[4878]: I1202 18:43:26.228486 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ea2cae47-1c1a-408f-b391-3641eae02402" containerName="aodh-evaluator" containerID="cri-o://2173816ea965e2df9673508cf3b7318be20d81bcfaacff2d38b83bb175088dcf" gracePeriod=30 Dec 02 18:43:26 crc kubenswrapper[4878]: I1202 18:43:26.228506 4878 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/aodh-0" podUID="ea2cae47-1c1a-408f-b391-3641eae02402" containerName="aodh-notifier" containerID="cri-o://b56f8908b788f3b4107f06976ae35147a9be3aa0d598a261f4a3013513d7fc18" gracePeriod=30 Dec 02 18:43:27 crc kubenswrapper[4878]: I1202 18:43:27.021781 4878 generic.go:334] "Generic (PLEG): container finished" podID="ea2cae47-1c1a-408f-b391-3641eae02402" containerID="2173816ea965e2df9673508cf3b7318be20d81bcfaacff2d38b83bb175088dcf" exitCode=0 Dec 02 18:43:27 crc kubenswrapper[4878]: I1202 18:43:27.022002 4878 generic.go:334] "Generic (PLEG): container finished" podID="ea2cae47-1c1a-408f-b391-3641eae02402" containerID="eb3c639a21dbe6bd8568cabf49415e365cb5dbcd4fbbbc27cb68186ba16f1fc3" exitCode=0 Dec 02 18:43:27 crc kubenswrapper[4878]: I1202 18:43:27.021855 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ea2cae47-1c1a-408f-b391-3641eae02402","Type":"ContainerDied","Data":"2173816ea965e2df9673508cf3b7318be20d81bcfaacff2d38b83bb175088dcf"} Dec 02 18:43:27 crc kubenswrapper[4878]: I1202 18:43:27.022044 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ea2cae47-1c1a-408f-b391-3641eae02402","Type":"ContainerDied","Data":"eb3c639a21dbe6bd8568cabf49415e365cb5dbcd4fbbbc27cb68186ba16f1fc3"} Dec 02 18:43:31 crc kubenswrapper[4878]: E1202 18:43:31.858351 4878 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea2cae47_1c1a_408f_b391_3641eae02402.slice/crio-conmon-b56f8908b788f3b4107f06976ae35147a9be3aa0d598a261f4a3013513d7fc18.scope\": RecentStats: unable to find data in memory cache]" Dec 02 18:43:31 crc kubenswrapper[4878]: E1202 18:43:31.858752 4878 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea2cae47_1c1a_408f_b391_3641eae02402.slice/crio-conmon-b56f8908b788f3b4107f06976ae35147a9be3aa0d598a261f4a3013513d7fc18.scope\": RecentStats: unable to find data in memory cache]" Dec 02 18:43:32 crc kubenswrapper[4878]: I1202 18:43:32.105495 4878 generic.go:334] "Generic (PLEG): container finished" podID="ea2cae47-1c1a-408f-b391-3641eae02402" containerID="b8114f01b1b2e1e08f3a2b037fb46c96a55a24484d3bdd39991c68eef518f8db" exitCode=0 Dec 02 18:43:32 crc kubenswrapper[4878]: I1202 18:43:32.105844 4878 generic.go:334] "Generic (PLEG): container finished" podID="ea2cae47-1c1a-408f-b391-3641eae02402" containerID="b56f8908b788f3b4107f06976ae35147a9be3aa0d598a261f4a3013513d7fc18" exitCode=0 Dec 02 18:43:32 crc kubenswrapper[4878]: I1202 18:43:32.105561 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ea2cae47-1c1a-408f-b391-3641eae02402","Type":"ContainerDied","Data":"b8114f01b1b2e1e08f3a2b037fb46c96a55a24484d3bdd39991c68eef518f8db"} Dec 02 18:43:32 crc kubenswrapper[4878]: I1202 18:43:32.105888 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ea2cae47-1c1a-408f-b391-3641eae02402","Type":"ContainerDied","Data":"b56f8908b788f3b4107f06976ae35147a9be3aa0d598a261f4a3013513d7fc18"} Dec 02 18:43:32 crc kubenswrapper[4878]: I1202 18:43:32.637399 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 02 18:43:32 crc kubenswrapper[4878]: I1202 18:43:32.818313 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-public-tls-certs\") pod \"ea2cae47-1c1a-408f-b391-3641eae02402\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " Dec 02 18:43:32 crc kubenswrapper[4878]: I1202 18:43:32.818416 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfbxk\" (UniqueName: \"kubernetes.io/projected/ea2cae47-1c1a-408f-b391-3641eae02402-kube-api-access-xfbxk\") pod \"ea2cae47-1c1a-408f-b391-3641eae02402\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " Dec 02 18:43:32 crc kubenswrapper[4878]: I1202 18:43:32.818603 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-combined-ca-bundle\") pod \"ea2cae47-1c1a-408f-b391-3641eae02402\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " Dec 02 18:43:32 crc kubenswrapper[4878]: I1202 18:43:32.818720 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-config-data\") pod \"ea2cae47-1c1a-408f-b391-3641eae02402\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " Dec 02 18:43:32 crc kubenswrapper[4878]: I1202 18:43:32.818792 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-internal-tls-certs\") pod \"ea2cae47-1c1a-408f-b391-3641eae02402\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " Dec 02 18:43:32 crc kubenswrapper[4878]: I1202 18:43:32.818891 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-scripts\") pod \"ea2cae47-1c1a-408f-b391-3641eae02402\" (UID: \"ea2cae47-1c1a-408f-b391-3641eae02402\") " Dec 02 18:43:32 crc kubenswrapper[4878]: I1202 18:43:32.835638 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-scripts" (OuterVolumeSpecName: "scripts") pod "ea2cae47-1c1a-408f-b391-3641eae02402" (UID: "ea2cae47-1c1a-408f-b391-3641eae02402"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:43:32 crc kubenswrapper[4878]: I1202 18:43:32.840752 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea2cae47-1c1a-408f-b391-3641eae02402-kube-api-access-xfbxk" (OuterVolumeSpecName: "kube-api-access-xfbxk") pod "ea2cae47-1c1a-408f-b391-3641eae02402" (UID: "ea2cae47-1c1a-408f-b391-3641eae02402"). InnerVolumeSpecName "kube-api-access-xfbxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:43:32 crc kubenswrapper[4878]: I1202 18:43:32.897187 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ea2cae47-1c1a-408f-b391-3641eae02402" (UID: "ea2cae47-1c1a-408f-b391-3641eae02402"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:43:32 crc kubenswrapper[4878]: I1202 18:43:32.922677 4878 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:32 crc kubenswrapper[4878]: I1202 18:43:32.923007 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfbxk\" (UniqueName: \"kubernetes.io/projected/ea2cae47-1c1a-408f-b391-3641eae02402-kube-api-access-xfbxk\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:32 crc kubenswrapper[4878]: I1202 18:43:32.923164 4878 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:32 crc kubenswrapper[4878]: I1202 18:43:32.933012 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ea2cae47-1c1a-408f-b391-3641eae02402" (UID: "ea2cae47-1c1a-408f-b391-3641eae02402"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:43:32 crc kubenswrapper[4878]: I1202 18:43:32.962173 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-config-data" (OuterVolumeSpecName: "config-data") pod "ea2cae47-1c1a-408f-b391-3641eae02402" (UID: "ea2cae47-1c1a-408f-b391-3641eae02402"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:43:32 crc kubenswrapper[4878]: I1202 18:43:32.998473 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea2cae47-1c1a-408f-b391-3641eae02402" (UID: "ea2cae47-1c1a-408f-b391-3641eae02402"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.025346 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.025546 4878 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.025561 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2cae47-1c1a-408f-b391-3641eae02402-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.123943 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ea2cae47-1c1a-408f-b391-3641eae02402","Type":"ContainerDied","Data":"15a37a81b0defc0ef48fb47014118e14b5981d42a04b9c61610c825ffcd9d221"} Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.124013 4878 scope.go:117] "RemoveContainer" containerID="b8114f01b1b2e1e08f3a2b037fb46c96a55a24484d3bdd39991c68eef518f8db" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.124338 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.164703 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.171112 4878 scope.go:117] "RemoveContainer" containerID="b56f8908b788f3b4107f06976ae35147a9be3aa0d598a261f4a3013513d7fc18" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.186898 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.211415 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 02 18:43:33 crc kubenswrapper[4878]: E1202 18:43:33.212698 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea2cae47-1c1a-408f-b391-3641eae02402" containerName="aodh-evaluator" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.212868 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea2cae47-1c1a-408f-b391-3641eae02402" containerName="aodh-evaluator" Dec 02 18:43:33 crc kubenswrapper[4878]: E1202 18:43:33.213057 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea2cae47-1c1a-408f-b391-3641eae02402" containerName="aodh-notifier" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.213208 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea2cae47-1c1a-408f-b391-3641eae02402" containerName="aodh-notifier" Dec 02 18:43:33 crc kubenswrapper[4878]: E1202 18:43:33.213306 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea2cae47-1c1a-408f-b391-3641eae02402" containerName="aodh-api" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.213380 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea2cae47-1c1a-408f-b391-3641eae02402" containerName="aodh-api" Dec 02 18:43:33 crc kubenswrapper[4878]: E1202 18:43:33.213476 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea2cae47-1c1a-408f-b391-3641eae02402" 
containerName="aodh-listener" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.213560 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea2cae47-1c1a-408f-b391-3641eae02402" containerName="aodh-listener" Dec 02 18:43:33 crc kubenswrapper[4878]: E1202 18:43:33.213629 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e658259f-368d-4571-a236-2c9bd3c3d9c6" containerName="aodh-db-sync" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.213681 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="e658259f-368d-4571-a236-2c9bd3c3d9c6" containerName="aodh-db-sync" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.213973 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea2cae47-1c1a-408f-b391-3641eae02402" containerName="aodh-listener" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.214048 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea2cae47-1c1a-408f-b391-3641eae02402" containerName="aodh-evaluator" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.214110 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="e658259f-368d-4571-a236-2c9bd3c3d9c6" containerName="aodh-db-sync" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.214185 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea2cae47-1c1a-408f-b391-3641eae02402" containerName="aodh-api" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.214275 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea2cae47-1c1a-408f-b391-3641eae02402" containerName="aodh-notifier" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.216649 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.218568 4878 scope.go:117] "RemoveContainer" containerID="2173816ea965e2df9673508cf3b7318be20d81bcfaacff2d38b83bb175088dcf" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.220341 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.220470 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-f79q6" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.220655 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.220868 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.231019 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.233029 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.277502 4878 scope.go:117] "RemoveContainer" containerID="eb3c639a21dbe6bd8568cabf49415e365cb5dbcd4fbbbc27cb68186ba16f1fc3" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.333227 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09c21be8-b654-42b3-b5da-59d1afb0054b-public-tls-certs\") pod \"aodh-0\" (UID: \"09c21be8-b654-42b3-b5da-59d1afb0054b\") " pod="openstack/aodh-0" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.333326 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/09c21be8-b654-42b3-b5da-59d1afb0054b-scripts\") pod \"aodh-0\" (UID: \"09c21be8-b654-42b3-b5da-59d1afb0054b\") " pod="openstack/aodh-0" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.333349 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09c21be8-b654-42b3-b5da-59d1afb0054b-config-data\") pod \"aodh-0\" (UID: \"09c21be8-b654-42b3-b5da-59d1afb0054b\") " pod="openstack/aodh-0" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.333390 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs6kl\" (UniqueName: \"kubernetes.io/projected/09c21be8-b654-42b3-b5da-59d1afb0054b-kube-api-access-bs6kl\") pod \"aodh-0\" (UID: \"09c21be8-b654-42b3-b5da-59d1afb0054b\") " pod="openstack/aodh-0" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.333512 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09c21be8-b654-42b3-b5da-59d1afb0054b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"09c21be8-b654-42b3-b5da-59d1afb0054b\") " pod="openstack/aodh-0" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.333565 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/09c21be8-b654-42b3-b5da-59d1afb0054b-internal-tls-certs\") pod \"aodh-0\" (UID: \"09c21be8-b654-42b3-b5da-59d1afb0054b\") " pod="openstack/aodh-0" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.436499 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09c21be8-b654-42b3-b5da-59d1afb0054b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"09c21be8-b654-42b3-b5da-59d1afb0054b\") " pod="openstack/aodh-0" Dec 02 18:43:33 crc 
kubenswrapper[4878]: I1202 18:43:33.436556 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/09c21be8-b654-42b3-b5da-59d1afb0054b-internal-tls-certs\") pod \"aodh-0\" (UID: \"09c21be8-b654-42b3-b5da-59d1afb0054b\") " pod="openstack/aodh-0" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.436644 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09c21be8-b654-42b3-b5da-59d1afb0054b-public-tls-certs\") pod \"aodh-0\" (UID: \"09c21be8-b654-42b3-b5da-59d1afb0054b\") " pod="openstack/aodh-0" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.436680 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09c21be8-b654-42b3-b5da-59d1afb0054b-scripts\") pod \"aodh-0\" (UID: \"09c21be8-b654-42b3-b5da-59d1afb0054b\") " pod="openstack/aodh-0" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.436699 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09c21be8-b654-42b3-b5da-59d1afb0054b-config-data\") pod \"aodh-0\" (UID: \"09c21be8-b654-42b3-b5da-59d1afb0054b\") " pod="openstack/aodh-0" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.436736 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs6kl\" (UniqueName: \"kubernetes.io/projected/09c21be8-b654-42b3-b5da-59d1afb0054b-kube-api-access-bs6kl\") pod \"aodh-0\" (UID: \"09c21be8-b654-42b3-b5da-59d1afb0054b\") " pod="openstack/aodh-0" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.441008 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09c21be8-b654-42b3-b5da-59d1afb0054b-combined-ca-bundle\") pod \"aodh-0\" (UID: 
\"09c21be8-b654-42b3-b5da-59d1afb0054b\") " pod="openstack/aodh-0" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.447745 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/09c21be8-b654-42b3-b5da-59d1afb0054b-internal-tls-certs\") pod \"aodh-0\" (UID: \"09c21be8-b654-42b3-b5da-59d1afb0054b\") " pod="openstack/aodh-0" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.447926 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09c21be8-b654-42b3-b5da-59d1afb0054b-scripts\") pod \"aodh-0\" (UID: \"09c21be8-b654-42b3-b5da-59d1afb0054b\") " pod="openstack/aodh-0" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.448253 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09c21be8-b654-42b3-b5da-59d1afb0054b-config-data\") pod \"aodh-0\" (UID: \"09c21be8-b654-42b3-b5da-59d1afb0054b\") " pod="openstack/aodh-0" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.448788 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09c21be8-b654-42b3-b5da-59d1afb0054b-public-tls-certs\") pod \"aodh-0\" (UID: \"09c21be8-b654-42b3-b5da-59d1afb0054b\") " pod="openstack/aodh-0" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.459019 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs6kl\" (UniqueName: \"kubernetes.io/projected/09c21be8-b654-42b3-b5da-59d1afb0054b-kube-api-access-bs6kl\") pod \"aodh-0\" (UID: \"09c21be8-b654-42b3-b5da-59d1afb0054b\") " pod="openstack/aodh-0" Dec 02 18:43:33 crc kubenswrapper[4878]: I1202 18:43:33.563854 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 02 18:43:34 crc kubenswrapper[4878]: I1202 18:43:34.053512 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 02 18:43:34 crc kubenswrapper[4878]: I1202 18:43:34.142282 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"09c21be8-b654-42b3-b5da-59d1afb0054b","Type":"ContainerStarted","Data":"6322b5b4000bd312926581469a5f3c21bc8591a4dd7b1ee94ebbe84383b99776"} Dec 02 18:43:34 crc kubenswrapper[4878]: I1202 18:43:34.955657 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea2cae47-1c1a-408f-b391-3641eae02402" path="/var/lib/kubelet/pods/ea2cae47-1c1a-408f-b391-3641eae02402/volumes" Dec 02 18:43:35 crc kubenswrapper[4878]: I1202 18:43:35.158102 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"09c21be8-b654-42b3-b5da-59d1afb0054b","Type":"ContainerStarted","Data":"d89aedc29344e50afaa47151a062ad46b70f0810b0fdaf53f66719d2f7260cf7"} Dec 02 18:43:36 crc kubenswrapper[4878]: I1202 18:43:36.183134 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"09c21be8-b654-42b3-b5da-59d1afb0054b","Type":"ContainerStarted","Data":"333647c6cab29e241b991fc41c2d450d23c48855e910e8708a2e6e4728f374ad"} Dec 02 18:43:37 crc kubenswrapper[4878]: I1202 18:43:37.199839 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"09c21be8-b654-42b3-b5da-59d1afb0054b","Type":"ContainerStarted","Data":"fd32871677234a0dd33656c7a569a34431899ccf25b7346c598a6486839aad16"} Dec 02 18:43:37 crc kubenswrapper[4878]: I1202 18:43:37.938355 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:43:37 crc kubenswrapper[4878]: E1202 18:43:37.939092 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:43:38 crc kubenswrapper[4878]: I1202 18:43:38.221414 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"09c21be8-b654-42b3-b5da-59d1afb0054b","Type":"ContainerStarted","Data":"69e9d02db46d31eed76f2f99301023d4b1bfd7b8e27e423e67f2f9f47ba56b4f"} Dec 02 18:43:38 crc kubenswrapper[4878]: I1202 18:43:38.272199 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.541046105 podStartE2EDuration="5.272169082s" podCreationTimestamp="2025-12-02 18:43:33 +0000 UTC" firstStartedPulling="2025-12-02 18:43:34.061527056 +0000 UTC m=+1723.751145937" lastFinishedPulling="2025-12-02 18:43:37.792650043 +0000 UTC m=+1727.482268914" observedRunningTime="2025-12-02 18:43:38.249828557 +0000 UTC m=+1727.939447448" watchObservedRunningTime="2025-12-02 18:43:38.272169082 +0000 UTC m=+1727.961787983" Dec 02 18:43:51 crc kubenswrapper[4878]: I1202 18:43:51.938523 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:43:51 crc kubenswrapper[4878]: E1202 18:43:51.939401 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:43:59 crc kubenswrapper[4878]: I1202 18:43:59.714134 4878 scope.go:117] "RemoveContainer" 
containerID="2d8668b3bbd2fa38c99cd63d64d7d8f0f8e78babae8c5fa00092354dc02cb0ef" Dec 02 18:43:59 crc kubenswrapper[4878]: I1202 18:43:59.775681 4878 scope.go:117] "RemoveContainer" containerID="97376565e713d1ae53c4c2732e707a792e928bf838763d84edd735934089b938" Dec 02 18:44:06 crc kubenswrapper[4878]: I1202 18:44:06.939603 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:44:06 crc kubenswrapper[4878]: E1202 18:44:06.941861 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:44:19 crc kubenswrapper[4878]: I1202 18:44:19.938629 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:44:19 crc kubenswrapper[4878]: E1202 18:44:19.939999 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:44:33 crc kubenswrapper[4878]: I1202 18:44:33.938411 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:44:33 crc kubenswrapper[4878]: E1202 18:44:33.940588 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:44:45 crc kubenswrapper[4878]: I1202 18:44:45.938840 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:44:45 crc kubenswrapper[4878]: E1202 18:44:45.941501 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:44:57 crc kubenswrapper[4878]: I1202 18:44:57.938657 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:44:57 crc kubenswrapper[4878]: E1202 18:44:57.939983 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:44:59 crc kubenswrapper[4878]: I1202 18:44:59.955109 4878 scope.go:117] "RemoveContainer" containerID="9fa544bb0c693b2383d85f2095276439c45d6bd225440b4e7400cd6de265e2dc" Dec 02 18:44:59 crc kubenswrapper[4878]: I1202 18:44:59.994286 4878 scope.go:117] "RemoveContainer" containerID="a05ef375f6e7ba1695de0b850ff826d5df295ad131a855ac7caeb2ef972fb5b1" Dec 02 18:45:00 crc 
kubenswrapper[4878]: I1202 18:45:00.033251 4878 scope.go:117] "RemoveContainer" containerID="28035db2e638b03df0839122250e1679c0fb85eb591e3c7426c594ac0ba68a3a" Dec 02 18:45:00 crc kubenswrapper[4878]: I1202 18:45:00.083757 4878 scope.go:117] "RemoveContainer" containerID="d41dad766c59745a147f4023eefa4a896098b8792ec0eb5d76efa27e7f4db6e3" Dec 02 18:45:00 crc kubenswrapper[4878]: I1202 18:45:00.113728 4878 scope.go:117] "RemoveContainer" containerID="0e484753605b4722b782d98e2b810d1219c36e2dc5f8eff75ce4626630c57bb5" Dec 02 18:45:00 crc kubenswrapper[4878]: I1202 18:45:00.169937 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq"] Dec 02 18:45:00 crc kubenswrapper[4878]: I1202 18:45:00.173568 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq" Dec 02 18:45:00 crc kubenswrapper[4878]: I1202 18:45:00.177742 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 18:45:00 crc kubenswrapper[4878]: I1202 18:45:00.179359 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 18:45:00 crc kubenswrapper[4878]: I1202 18:45:00.193925 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq"] Dec 02 18:45:00 crc kubenswrapper[4878]: I1202 18:45:00.219650 4878 scope.go:117] "RemoveContainer" containerID="1a5c679d7c4b99ed2be9d4c7b8d5ce4065411b807e1154c73d224619da6c4f36" Dec 02 18:45:00 crc kubenswrapper[4878]: I1202 18:45:00.250318 4878 scope.go:117] "RemoveContainer" containerID="3719d30fd4b727fd8781ab06800fe1878b068f02080b2ae83116f881d227dc6d" Dec 02 18:45:00 crc kubenswrapper[4878]: I1202 18:45:00.262809 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/166b1c46-c3d9-42b9-a1f3-6925d96a6a03-config-volume\") pod \"collect-profiles-29411685-jtggq\" (UID: \"166b1c46-c3d9-42b9-a1f3-6925d96a6a03\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq" Dec 02 18:45:00 crc kubenswrapper[4878]: I1202 18:45:00.263073 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/166b1c46-c3d9-42b9-a1f3-6925d96a6a03-secret-volume\") pod \"collect-profiles-29411685-jtggq\" (UID: \"166b1c46-c3d9-42b9-a1f3-6925d96a6a03\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq" Dec 02 18:45:00 crc kubenswrapper[4878]: I1202 18:45:00.263128 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw85l\" (UniqueName: \"kubernetes.io/projected/166b1c46-c3d9-42b9-a1f3-6925d96a6a03-kube-api-access-jw85l\") pod \"collect-profiles-29411685-jtggq\" (UID: \"166b1c46-c3d9-42b9-a1f3-6925d96a6a03\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq" Dec 02 18:45:00 crc kubenswrapper[4878]: I1202 18:45:00.366127 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/166b1c46-c3d9-42b9-a1f3-6925d96a6a03-secret-volume\") pod \"collect-profiles-29411685-jtggq\" (UID: \"166b1c46-c3d9-42b9-a1f3-6925d96a6a03\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq" Dec 02 18:45:00 crc kubenswrapper[4878]: I1202 18:45:00.366186 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw85l\" (UniqueName: \"kubernetes.io/projected/166b1c46-c3d9-42b9-a1f3-6925d96a6a03-kube-api-access-jw85l\") pod \"collect-profiles-29411685-jtggq\" (UID: 
\"166b1c46-c3d9-42b9-a1f3-6925d96a6a03\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq" Dec 02 18:45:00 crc kubenswrapper[4878]: I1202 18:45:00.366259 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/166b1c46-c3d9-42b9-a1f3-6925d96a6a03-config-volume\") pod \"collect-profiles-29411685-jtggq\" (UID: \"166b1c46-c3d9-42b9-a1f3-6925d96a6a03\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq" Dec 02 18:45:00 crc kubenswrapper[4878]: I1202 18:45:00.367551 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/166b1c46-c3d9-42b9-a1f3-6925d96a6a03-config-volume\") pod \"collect-profiles-29411685-jtggq\" (UID: \"166b1c46-c3d9-42b9-a1f3-6925d96a6a03\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq" Dec 02 18:45:00 crc kubenswrapper[4878]: I1202 18:45:00.372759 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/166b1c46-c3d9-42b9-a1f3-6925d96a6a03-secret-volume\") pod \"collect-profiles-29411685-jtggq\" (UID: \"166b1c46-c3d9-42b9-a1f3-6925d96a6a03\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq" Dec 02 18:45:00 crc kubenswrapper[4878]: I1202 18:45:00.385502 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw85l\" (UniqueName: \"kubernetes.io/projected/166b1c46-c3d9-42b9-a1f3-6925d96a6a03-kube-api-access-jw85l\") pod \"collect-profiles-29411685-jtggq\" (UID: \"166b1c46-c3d9-42b9-a1f3-6925d96a6a03\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq" Dec 02 18:45:00 crc kubenswrapper[4878]: I1202 18:45:00.545772 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq" Dec 02 18:45:01 crc kubenswrapper[4878]: W1202 18:45:01.038512 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod166b1c46_c3d9_42b9_a1f3_6925d96a6a03.slice/crio-676f5592893facca1c58c455b40d7497427a3e0da3a2b6d6fc14cfa52dfae25c WatchSource:0}: Error finding container 676f5592893facca1c58c455b40d7497427a3e0da3a2b6d6fc14cfa52dfae25c: Status 404 returned error can't find the container with id 676f5592893facca1c58c455b40d7497427a3e0da3a2b6d6fc14cfa52dfae25c Dec 02 18:45:01 crc kubenswrapper[4878]: I1202 18:45:01.039023 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq"] Dec 02 18:45:01 crc kubenswrapper[4878]: I1202 18:45:01.544003 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq" event={"ID":"166b1c46-c3d9-42b9-a1f3-6925d96a6a03","Type":"ContainerStarted","Data":"9f9f3d63a16ce958842e1b771f17538e6a32ebe40dd005098b0f8d9a71db00d9"} Dec 02 18:45:01 crc kubenswrapper[4878]: I1202 18:45:01.544423 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq" event={"ID":"166b1c46-c3d9-42b9-a1f3-6925d96a6a03","Type":"ContainerStarted","Data":"676f5592893facca1c58c455b40d7497427a3e0da3a2b6d6fc14cfa52dfae25c"} Dec 02 18:45:01 crc kubenswrapper[4878]: I1202 18:45:01.600365 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq" podStartSLOduration=1.600342839 podStartE2EDuration="1.600342839s" podCreationTimestamp="2025-12-02 18:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 
18:45:01.582973148 +0000 UTC m=+1811.272592049" watchObservedRunningTime="2025-12-02 18:45:01.600342839 +0000 UTC m=+1811.289961720" Dec 02 18:45:01 crc kubenswrapper[4878]: E1202 18:45:01.958523 4878 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod166b1c46_c3d9_42b9_a1f3_6925d96a6a03.slice/crio-9f9f3d63a16ce958842e1b771f17538e6a32ebe40dd005098b0f8d9a71db00d9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod166b1c46_c3d9_42b9_a1f3_6925d96a6a03.slice/crio-conmon-9f9f3d63a16ce958842e1b771f17538e6a32ebe40dd005098b0f8d9a71db00d9.scope\": RecentStats: unable to find data in memory cache]" Dec 02 18:45:02 crc kubenswrapper[4878]: I1202 18:45:02.558666 4878 generic.go:334] "Generic (PLEG): container finished" podID="166b1c46-c3d9-42b9-a1f3-6925d96a6a03" containerID="9f9f3d63a16ce958842e1b771f17538e6a32ebe40dd005098b0f8d9a71db00d9" exitCode=0 Dec 02 18:45:02 crc kubenswrapper[4878]: I1202 18:45:02.558720 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq" event={"ID":"166b1c46-c3d9-42b9-a1f3-6925d96a6a03","Type":"ContainerDied","Data":"9f9f3d63a16ce958842e1b771f17538e6a32ebe40dd005098b0f8d9a71db00d9"} Dec 02 18:45:04 crc kubenswrapper[4878]: I1202 18:45:04.039271 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq" Dec 02 18:45:04 crc kubenswrapper[4878]: I1202 18:45:04.178629 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/166b1c46-c3d9-42b9-a1f3-6925d96a6a03-config-volume\") pod \"166b1c46-c3d9-42b9-a1f3-6925d96a6a03\" (UID: \"166b1c46-c3d9-42b9-a1f3-6925d96a6a03\") " Dec 02 18:45:04 crc kubenswrapper[4878]: I1202 18:45:04.178771 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw85l\" (UniqueName: \"kubernetes.io/projected/166b1c46-c3d9-42b9-a1f3-6925d96a6a03-kube-api-access-jw85l\") pod \"166b1c46-c3d9-42b9-a1f3-6925d96a6a03\" (UID: \"166b1c46-c3d9-42b9-a1f3-6925d96a6a03\") " Dec 02 18:45:04 crc kubenswrapper[4878]: I1202 18:45:04.178987 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/166b1c46-c3d9-42b9-a1f3-6925d96a6a03-secret-volume\") pod \"166b1c46-c3d9-42b9-a1f3-6925d96a6a03\" (UID: \"166b1c46-c3d9-42b9-a1f3-6925d96a6a03\") " Dec 02 18:45:04 crc kubenswrapper[4878]: I1202 18:45:04.179345 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/166b1c46-c3d9-42b9-a1f3-6925d96a6a03-config-volume" (OuterVolumeSpecName: "config-volume") pod "166b1c46-c3d9-42b9-a1f3-6925d96a6a03" (UID: "166b1c46-c3d9-42b9-a1f3-6925d96a6a03"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:45:04 crc kubenswrapper[4878]: I1202 18:45:04.179920 4878 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/166b1c46-c3d9-42b9-a1f3-6925d96a6a03-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 18:45:04 crc kubenswrapper[4878]: I1202 18:45:04.185279 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/166b1c46-c3d9-42b9-a1f3-6925d96a6a03-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "166b1c46-c3d9-42b9-a1f3-6925d96a6a03" (UID: "166b1c46-c3d9-42b9-a1f3-6925d96a6a03"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:45:04 crc kubenswrapper[4878]: I1202 18:45:04.187549 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/166b1c46-c3d9-42b9-a1f3-6925d96a6a03-kube-api-access-jw85l" (OuterVolumeSpecName: "kube-api-access-jw85l") pod "166b1c46-c3d9-42b9-a1f3-6925d96a6a03" (UID: "166b1c46-c3d9-42b9-a1f3-6925d96a6a03"). InnerVolumeSpecName "kube-api-access-jw85l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:45:04 crc kubenswrapper[4878]: I1202 18:45:04.283079 4878 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/166b1c46-c3d9-42b9-a1f3-6925d96a6a03-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 18:45:04 crc kubenswrapper[4878]: I1202 18:45:04.283117 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw85l\" (UniqueName: \"kubernetes.io/projected/166b1c46-c3d9-42b9-a1f3-6925d96a6a03-kube-api-access-jw85l\") on node \"crc\" DevicePath \"\"" Dec 02 18:45:04 crc kubenswrapper[4878]: I1202 18:45:04.586374 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq" event={"ID":"166b1c46-c3d9-42b9-a1f3-6925d96a6a03","Type":"ContainerDied","Data":"676f5592893facca1c58c455b40d7497427a3e0da3a2b6d6fc14cfa52dfae25c"} Dec 02 18:45:04 crc kubenswrapper[4878]: I1202 18:45:04.586424 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="676f5592893facca1c58c455b40d7497427a3e0da3a2b6d6fc14cfa52dfae25c" Dec 02 18:45:04 crc kubenswrapper[4878]: I1202 18:45:04.586500 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq" Dec 02 18:45:11 crc kubenswrapper[4878]: I1202 18:45:11.938793 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:45:11 crc kubenswrapper[4878]: E1202 18:45:11.941463 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:45:26 crc kubenswrapper[4878]: I1202 18:45:26.937975 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:45:26 crc kubenswrapper[4878]: E1202 18:45:26.938767 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:45:40 crc kubenswrapper[4878]: I1202 18:45:40.947523 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:45:40 crc kubenswrapper[4878]: E1202 18:45:40.948409 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:45:53 crc kubenswrapper[4878]: I1202 18:45:53.937924 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:45:53 crc kubenswrapper[4878]: E1202 18:45:53.938675 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:46:00 crc kubenswrapper[4878]: I1202 18:46:00.348478 4878 scope.go:117] "RemoveContainer" containerID="fee7ab1681302243f395cd41adf57ea38404f7fed28ee9617864ad77e3739d16" Dec 02 18:46:00 crc kubenswrapper[4878]: I1202 18:46:00.411422 4878 scope.go:117] "RemoveContainer" containerID="911e632cfb7f741304ab68ad63b33cc66ce688ea853af4754667e1002526973d" Dec 02 18:46:00 crc kubenswrapper[4878]: I1202 18:46:00.472088 4878 scope.go:117] "RemoveContainer" containerID="a0491fee9a6ebce2288eb3b26e3f45bae1b1c0d466575cf3f6ff734a15620cb5" Dec 02 18:46:00 crc kubenswrapper[4878]: I1202 18:46:00.498422 4878 scope.go:117] "RemoveContainer" containerID="4679fe50d42693310dc07fb5b64f2f0e635bbcc0d252859bd49012429ebd6319" Dec 02 18:46:00 crc kubenswrapper[4878]: I1202 18:46:00.542364 4878 scope.go:117] "RemoveContainer" containerID="953bf45cca88eb554756200cbac15a3697e217fb23f916b13a9f08c6d0642291" Dec 02 18:46:00 crc kubenswrapper[4878]: I1202 18:46:00.578661 4878 scope.go:117] "RemoveContainer" containerID="106302f7bf29364375d8a0797c569137b44908aebecd14827d3d15ab5f762391" Dec 02 18:46:04 crc 
kubenswrapper[4878]: I1202 18:46:04.938756 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:46:04 crc kubenswrapper[4878]: E1202 18:46:04.939741 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:46:17 crc kubenswrapper[4878]: I1202 18:46:17.938694 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:46:17 crc kubenswrapper[4878]: E1202 18:46:17.939698 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:46:24 crc kubenswrapper[4878]: I1202 18:46:24.046480 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-xzscw"] Dec 02 18:46:24 crc kubenswrapper[4878]: I1202 18:46:24.061714 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-95c8-account-create-update-szl82"] Dec 02 18:46:24 crc kubenswrapper[4878]: I1202 18:46:24.073738 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-xzscw"] Dec 02 18:46:24 crc kubenswrapper[4878]: I1202 18:46:24.086705 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-95c8-account-create-update-szl82"] Dec 02 
18:46:24 crc kubenswrapper[4878]: I1202 18:46:24.971274 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41728d08-e232-4713-ba93-162894335f4c" path="/var/lib/kubelet/pods/41728d08-e232-4713-ba93-162894335f4c/volumes" Dec 02 18:46:24 crc kubenswrapper[4878]: I1202 18:46:24.972550 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73844c1f-2d85-4d85-889f-90392030c02d" path="/var/lib/kubelet/pods/73844c1f-2d85-4d85-889f-90392030c02d/volumes" Dec 02 18:46:28 crc kubenswrapper[4878]: I1202 18:46:28.057494 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-9766-account-create-update-cs8kz"] Dec 02 18:46:28 crc kubenswrapper[4878]: I1202 18:46:28.072658 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-9766-account-create-update-cs8kz"] Dec 02 18:46:28 crc kubenswrapper[4878]: I1202 18:46:28.966302 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84a2e351-13b3-47a9-b084-b0aa69b245ca" path="/var/lib/kubelet/pods/84a2e351-13b3-47a9-b084-b0aa69b245ca/volumes" Dec 02 18:46:29 crc kubenswrapper[4878]: I1202 18:46:29.066961 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-qswsn"] Dec 02 18:46:29 crc kubenswrapper[4878]: I1202 18:46:29.089213 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-3140-account-create-update-z8jqc"] Dec 02 18:46:29 crc kubenswrapper[4878]: I1202 18:46:29.106843 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-dkzxj"] Dec 02 18:46:29 crc kubenswrapper[4878]: I1202 18:46:29.121979 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-qswsn"] Dec 02 18:46:29 crc kubenswrapper[4878]: I1202 18:46:29.135551 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-3140-account-create-update-z8jqc"] Dec 02 18:46:29 crc kubenswrapper[4878]: I1202 18:46:29.148384 
4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-dkzxj"] Dec 02 18:46:29 crc kubenswrapper[4878]: I1202 18:46:29.938038 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:46:29 crc kubenswrapper[4878]: E1202 18:46:29.938694 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:46:30 crc kubenswrapper[4878]: I1202 18:46:30.066571 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-p6mn2"] Dec 02 18:46:30 crc kubenswrapper[4878]: I1202 18:46:30.080796 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-p6mn2"] Dec 02 18:46:30 crc kubenswrapper[4878]: I1202 18:46:30.962201 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18318a2e-0ad8-49cc-bb0b-30a5fbc7604c" path="/var/lib/kubelet/pods/18318a2e-0ad8-49cc-bb0b-30a5fbc7604c/volumes" Dec 02 18:46:30 crc kubenswrapper[4878]: I1202 18:46:30.965823 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c4a31c7-8336-4fc3-b5f9-7614e75b3c65" path="/var/lib/kubelet/pods/1c4a31c7-8336-4fc3-b5f9-7614e75b3c65/volumes" Dec 02 18:46:30 crc kubenswrapper[4878]: I1202 18:46:30.969261 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae" path="/var/lib/kubelet/pods/aed21e54-0f8b-45f1-b2c8-42ba3a6b78ae/volumes" Dec 02 18:46:30 crc kubenswrapper[4878]: I1202 18:46:30.971059 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="f3ed7121-1f6d-4f72-81a7-a25c9f98304b" path="/var/lib/kubelet/pods/f3ed7121-1f6d-4f72-81a7-a25c9f98304b/volumes" Dec 02 18:46:31 crc kubenswrapper[4878]: I1202 18:46:31.072180 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-397e-account-create-update-88k7b"] Dec 02 18:46:31 crc kubenswrapper[4878]: I1202 18:46:31.092278 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-397e-account-create-update-88k7b"] Dec 02 18:46:32 crc kubenswrapper[4878]: I1202 18:46:32.956674 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8" path="/var/lib/kubelet/pods/f6d2a85e-fae0-4d6c-b2c3-068cbb5ab6b8/volumes" Dec 02 18:46:42 crc kubenswrapper[4878]: I1202 18:46:42.993661 4878 generic.go:334] "Generic (PLEG): container finished" podID="52939763-97c2-42f6-9aa4-56e99153e87f" containerID="8cc5ff5bbd93908887375b06ecfebdee81dffc0206e3fe0ca9647613e27f1cbc" exitCode=0 Dec 02 18:46:42 crc kubenswrapper[4878]: I1202 18:46:42.993781 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" event={"ID":"52939763-97c2-42f6-9aa4-56e99153e87f","Type":"ContainerDied","Data":"8cc5ff5bbd93908887375b06ecfebdee81dffc0206e3fe0ca9647613e27f1cbc"} Dec 02 18:46:44 crc kubenswrapper[4878]: I1202 18:46:44.045475 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-ll544"] Dec 02 18:46:44 crc kubenswrapper[4878]: I1202 18:46:44.063423 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-e07d-account-create-update-7dnbk"] Dec 02 18:46:44 crc kubenswrapper[4878]: I1202 18:46:44.076410 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-ll544"] Dec 02 18:46:44 crc kubenswrapper[4878]: I1202 18:46:44.089472 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/mysqld-exporter-e07d-account-create-update-7dnbk"] Dec 02 18:46:44 crc kubenswrapper[4878]: I1202 18:46:44.594871 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" Dec 02 18:46:44 crc kubenswrapper[4878]: I1202 18:46:44.717466 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j756s\" (UniqueName: \"kubernetes.io/projected/52939763-97c2-42f6-9aa4-56e99153e87f-kube-api-access-j756s\") pod \"52939763-97c2-42f6-9aa4-56e99153e87f\" (UID: \"52939763-97c2-42f6-9aa4-56e99153e87f\") " Dec 02 18:46:44 crc kubenswrapper[4878]: I1202 18:46:44.717664 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52939763-97c2-42f6-9aa4-56e99153e87f-inventory\") pod \"52939763-97c2-42f6-9aa4-56e99153e87f\" (UID: \"52939763-97c2-42f6-9aa4-56e99153e87f\") " Dec 02 18:46:44 crc kubenswrapper[4878]: I1202 18:46:44.717990 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52939763-97c2-42f6-9aa4-56e99153e87f-ssh-key\") pod \"52939763-97c2-42f6-9aa4-56e99153e87f\" (UID: \"52939763-97c2-42f6-9aa4-56e99153e87f\") " Dec 02 18:46:44 crc kubenswrapper[4878]: I1202 18:46:44.718066 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52939763-97c2-42f6-9aa4-56e99153e87f-bootstrap-combined-ca-bundle\") pod \"52939763-97c2-42f6-9aa4-56e99153e87f\" (UID: \"52939763-97c2-42f6-9aa4-56e99153e87f\") " Dec 02 18:46:44 crc kubenswrapper[4878]: I1202 18:46:44.724903 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52939763-97c2-42f6-9aa4-56e99153e87f-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod 
"52939763-97c2-42f6-9aa4-56e99153e87f" (UID: "52939763-97c2-42f6-9aa4-56e99153e87f"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:46:44 crc kubenswrapper[4878]: I1202 18:46:44.727284 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52939763-97c2-42f6-9aa4-56e99153e87f-kube-api-access-j756s" (OuterVolumeSpecName: "kube-api-access-j756s") pod "52939763-97c2-42f6-9aa4-56e99153e87f" (UID: "52939763-97c2-42f6-9aa4-56e99153e87f"). InnerVolumeSpecName "kube-api-access-j756s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:46:44 crc kubenswrapper[4878]: I1202 18:46:44.758519 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52939763-97c2-42f6-9aa4-56e99153e87f-inventory" (OuterVolumeSpecName: "inventory") pod "52939763-97c2-42f6-9aa4-56e99153e87f" (UID: "52939763-97c2-42f6-9aa4-56e99153e87f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:46:44 crc kubenswrapper[4878]: I1202 18:46:44.760952 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52939763-97c2-42f6-9aa4-56e99153e87f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "52939763-97c2-42f6-9aa4-56e99153e87f" (UID: "52939763-97c2-42f6-9aa4-56e99153e87f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:46:44 crc kubenswrapper[4878]: I1202 18:46:44.821331 4878 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52939763-97c2-42f6-9aa4-56e99153e87f-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:46:44 crc kubenswrapper[4878]: I1202 18:46:44.821362 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j756s\" (UniqueName: \"kubernetes.io/projected/52939763-97c2-42f6-9aa4-56e99153e87f-kube-api-access-j756s\") on node \"crc\" DevicePath \"\"" Dec 02 18:46:44 crc kubenswrapper[4878]: I1202 18:46:44.821402 4878 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52939763-97c2-42f6-9aa4-56e99153e87f-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 18:46:44 crc kubenswrapper[4878]: I1202 18:46:44.821411 4878 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52939763-97c2-42f6-9aa4-56e99153e87f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 18:46:44 crc kubenswrapper[4878]: I1202 18:46:44.937681 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:46:44 crc kubenswrapper[4878]: E1202 18:46:44.938424 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:46:44 crc kubenswrapper[4878]: I1202 18:46:44.954354 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7d3c516-e91a-49fc-b75a-f425cd4ccd5d" 
path="/var/lib/kubelet/pods/b7d3c516-e91a-49fc-b75a-f425cd4ccd5d/volumes" Dec 02 18:46:44 crc kubenswrapper[4878]: I1202 18:46:44.955307 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9feb9b1-2723-4414-bf7f-747c0295cb66" path="/var/lib/kubelet/pods/b9feb9b1-2723-4414-bf7f-747c0295cb66/volumes" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.028201 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" event={"ID":"52939763-97c2-42f6-9aa4-56e99153e87f","Type":"ContainerDied","Data":"d41d3310ff72494ad3e0318920806d3cbac55acf18ce15f13af78793e6d09ae8"} Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.028344 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d41d3310ff72494ad3e0318920806d3cbac55acf18ce15f13af78793e6d09ae8" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.028342 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.125301 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj"] Dec 02 18:46:45 crc kubenswrapper[4878]: E1202 18:46:45.126379 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166b1c46-c3d9-42b9-a1f3-6925d96a6a03" containerName="collect-profiles" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.126405 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="166b1c46-c3d9-42b9-a1f3-6925d96a6a03" containerName="collect-profiles" Dec 02 18:46:45 crc kubenswrapper[4878]: E1202 18:46:45.126470 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52939763-97c2-42f6-9aa4-56e99153e87f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.126486 4878 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="52939763-97c2-42f6-9aa4-56e99153e87f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.126868 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="52939763-97c2-42f6-9aa4-56e99153e87f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.126908 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="166b1c46-c3d9-42b9-a1f3-6925d96a6a03" containerName="collect-profiles" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.128147 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.131907 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.132359 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szvc8" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.132569 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.133108 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.150336 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj"] Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.236529 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b2e71d7-3e0b-4bfa-835c-374dcf03dd86-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-l59tj\" (UID: \"7b2e71d7-3e0b-4bfa-835c-374dcf03dd86\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.236720 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whvwc\" (UniqueName: \"kubernetes.io/projected/7b2e71d7-3e0b-4bfa-835c-374dcf03dd86-kube-api-access-whvwc\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l59tj\" (UID: \"7b2e71d7-3e0b-4bfa-835c-374dcf03dd86\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.236941 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b2e71d7-3e0b-4bfa-835c-374dcf03dd86-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l59tj\" (UID: \"7b2e71d7-3e0b-4bfa-835c-374dcf03dd86\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.340754 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b2e71d7-3e0b-4bfa-835c-374dcf03dd86-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l59tj\" (UID: \"7b2e71d7-3e0b-4bfa-835c-374dcf03dd86\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.340893 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whvwc\" (UniqueName: \"kubernetes.io/projected/7b2e71d7-3e0b-4bfa-835c-374dcf03dd86-kube-api-access-whvwc\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l59tj\" (UID: \"7b2e71d7-3e0b-4bfa-835c-374dcf03dd86\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj" 
Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.341090 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b2e71d7-3e0b-4bfa-835c-374dcf03dd86-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l59tj\" (UID: \"7b2e71d7-3e0b-4bfa-835c-374dcf03dd86\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.348435 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b2e71d7-3e0b-4bfa-835c-374dcf03dd86-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l59tj\" (UID: \"7b2e71d7-3e0b-4bfa-835c-374dcf03dd86\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.362855 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b2e71d7-3e0b-4bfa-835c-374dcf03dd86-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l59tj\" (UID: \"7b2e71d7-3e0b-4bfa-835c-374dcf03dd86\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.377324 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whvwc\" (UniqueName: \"kubernetes.io/projected/7b2e71d7-3e0b-4bfa-835c-374dcf03dd86-kube-api-access-whvwc\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-l59tj\" (UID: \"7b2e71d7-3e0b-4bfa-835c-374dcf03dd86\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj" Dec 02 18:46:45 crc kubenswrapper[4878]: I1202 18:46:45.456810 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj" Dec 02 18:46:46 crc kubenswrapper[4878]: I1202 18:46:46.087262 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj"] Dec 02 18:46:47 crc kubenswrapper[4878]: I1202 18:46:47.057401 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj" event={"ID":"7b2e71d7-3e0b-4bfa-835c-374dcf03dd86","Type":"ContainerStarted","Data":"ba1b30d1fff2ab4330994337d9c61128a286262c48d5df81190e5761bf3f88d2"} Dec 02 18:46:47 crc kubenswrapper[4878]: I1202 18:46:47.057810 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj" event={"ID":"7b2e71d7-3e0b-4bfa-835c-374dcf03dd86","Type":"ContainerStarted","Data":"01fb9ab8429048443a164055df0e98111a88e29e2cd3602ec39fb7355405308b"} Dec 02 18:46:47 crc kubenswrapper[4878]: I1202 18:46:47.091947 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj" podStartSLOduration=1.6176759 podStartE2EDuration="2.091877975s" podCreationTimestamp="2025-12-02 18:46:45 +0000 UTC" firstStartedPulling="2025-12-02 18:46:46.088476825 +0000 UTC m=+1915.778095706" lastFinishedPulling="2025-12-02 18:46:46.5626789 +0000 UTC m=+1916.252297781" observedRunningTime="2025-12-02 18:46:47.076726273 +0000 UTC m=+1916.766345154" watchObservedRunningTime="2025-12-02 18:46:47.091877975 +0000 UTC m=+1916.781496916" Dec 02 18:46:55 crc kubenswrapper[4878]: I1202 18:46:55.940035 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:46:55 crc kubenswrapper[4878]: E1202 18:46:55.941201 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:46:58 crc kubenswrapper[4878]: I1202 18:46:58.052306 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a601-account-create-update-2kkrr"] Dec 02 18:46:58 crc kubenswrapper[4878]: I1202 18:46:58.066708 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a601-account-create-update-2kkrr"] Dec 02 18:46:58 crc kubenswrapper[4878]: I1202 18:46:58.956890 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90576306-b86d-4017-84fb-19c00f9630a0" path="/var/lib/kubelet/pods/90576306-b86d-4017-84fb-19c00f9630a0/volumes" Dec 02 18:47:00 crc kubenswrapper[4878]: I1202 18:47:00.714488 4878 scope.go:117] "RemoveContainer" containerID="15ec9813594ccb589e47e88428083dd7f55bc204c8b67348edf9736bd1a00151" Dec 02 18:47:00 crc kubenswrapper[4878]: I1202 18:47:00.745986 4878 scope.go:117] "RemoveContainer" containerID="9fd93089944b1a43ccbdceaa1e6df6f656fccf71c48ad56acd5cd46040692b7a" Dec 02 18:47:00 crc kubenswrapper[4878]: I1202 18:47:00.783891 4878 scope.go:117] "RemoveContainer" containerID="121a8d3c775386afc0556a9010b5d70a8e787bf9cf368395e979c3b2e5265e02" Dec 02 18:47:00 crc kubenswrapper[4878]: I1202 18:47:00.843251 4878 scope.go:117] "RemoveContainer" containerID="3a652324c75508125ebaf6e7c282eba862f0782adf1b28f4adfaa3d2c2bf15bc" Dec 02 18:47:00 crc kubenswrapper[4878]: I1202 18:47:00.893732 4878 scope.go:117] "RemoveContainer" containerID="d1b26df3c986f36a878e7d14ccf9008789835efafa93c0da17db2336fbe9f0e9" Dec 02 18:47:00 crc kubenswrapper[4878]: I1202 18:47:00.973301 4878 scope.go:117] "RemoveContainer" containerID="acddf47d8dc17fbf73f5066ed0ee03e768da7e323d63ee095239fea44de8860b" Dec 02 18:47:01 crc 
kubenswrapper[4878]: I1202 18:47:01.026690 4878 scope.go:117] "RemoveContainer" containerID="5b75f46b42bf9e76396fc3715b5ecce3c4dc5aff981504b479d7ee504e022a2a" Dec 02 18:47:01 crc kubenswrapper[4878]: I1202 18:47:01.085363 4878 scope.go:117] "RemoveContainer" containerID="57d8e0fbcc18800318007afa266f816efa2832719ca31b15548a9e0773166b82" Dec 02 18:47:01 crc kubenswrapper[4878]: I1202 18:47:01.122190 4878 scope.go:117] "RemoveContainer" containerID="b078473c94ef910d7a256927181d9df5d9ce59f39541b2a61d3dd35020ce14e8" Dec 02 18:47:01 crc kubenswrapper[4878]: I1202 18:47:01.150759 4878 scope.go:117] "RemoveContainer" containerID="8b63053f335504bdf190c08d78bc636147acbca4ef391d17ac424c355cf31238" Dec 02 18:47:01 crc kubenswrapper[4878]: I1202 18:47:01.176576 4878 scope.go:117] "RemoveContainer" containerID="14f7cece6b5697d747ca8cf15ee1cb65a6386b9d78e0beecd6e48a6f7138bb4c" Dec 02 18:47:01 crc kubenswrapper[4878]: I1202 18:47:01.216559 4878 scope.go:117] "RemoveContainer" containerID="d0ea1247fbbf818dd4a6371bf20b1a42dece706dd27e4cbde88b3300ed50a9ca" Dec 02 18:47:01 crc kubenswrapper[4878]: I1202 18:47:01.243951 4878 scope.go:117] "RemoveContainer" containerID="e45a554f8a6aaa8a881fbafd9cf2ba4ebe921a5881acad4d3b5a899d9422ce75" Dec 02 18:47:01 crc kubenswrapper[4878]: I1202 18:47:01.277849 4878 scope.go:117] "RemoveContainer" containerID="c791e1705ad64a67a903f613f98c4720627ce60d7238181c34c86578c4fca8e8" Dec 02 18:47:01 crc kubenswrapper[4878]: I1202 18:47:01.303896 4878 scope.go:117] "RemoveContainer" containerID="67c773fa1b2388f25773245fceb5fa2df463d68bd0d343a40c4c7a4214c8fed8" Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.071945 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-b7e2-account-create-update-5hfz5"] Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.089850 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-6dj4k"] Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.106426 4878 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-srxml"] Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.122720 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-hldwm"] Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.138635 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-7e48-account-create-update-d5z4q"] Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.152836 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-b7e2-account-create-update-5hfz5"] Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.163883 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-7e48-account-create-update-d5z4q"] Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.175318 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-6dj4k"] Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.186356 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-srxml"] Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.196918 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-hldwm"] Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.207791 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-ppjq4"] Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.220635 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b33b-account-create-update-5f77r"] Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.232469 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b33b-account-create-update-5f77r"] Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.244558 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-ppjq4"] Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.952646 4878 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36c2cda1-016a-4eca-9a7f-d4c2e43ccff1" path="/var/lib/kubelet/pods/36c2cda1-016a-4eca-9a7f-d4c2e43ccff1/volumes" Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.954845 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="391eccae-595f-4859-b88b-933307305613" path="/var/lib/kubelet/pods/391eccae-595f-4859-b88b-933307305613/volumes" Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.956941 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56f602d3-e640-4248-b53e-201b3556aa6f" path="/var/lib/kubelet/pods/56f602d3-e640-4248-b53e-201b3556aa6f/volumes" Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.959042 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="628e53e1-08c1-49e9-8fec-b6e0547c9e6d" path="/var/lib/kubelet/pods/628e53e1-08c1-49e9-8fec-b6e0547c9e6d/volumes" Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.961740 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9669f740-c9d3-4c55-866d-a44471c3aa1c" path="/var/lib/kubelet/pods/9669f740-c9d3-4c55-866d-a44471c3aa1c/volumes" Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.963335 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae0102d-4eba-4ca3-915b-0156d65616fa" path="/var/lib/kubelet/pods/cae0102d-4eba-4ca3-915b-0156d65616fa/volumes" Dec 02 18:47:04 crc kubenswrapper[4878]: I1202 18:47:04.965021 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cefde4cb-7350-4664-8f73-fabfecc591eb" path="/var/lib/kubelet/pods/cefde4cb-7350-4664-8f73-fabfecc591eb/volumes" Dec 02 18:47:09 crc kubenswrapper[4878]: I1202 18:47:09.939427 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:47:09 crc kubenswrapper[4878]: E1202 18:47:09.940724 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:47:14 crc kubenswrapper[4878]: I1202 18:47:14.048083 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-92nsw"] Dec 02 18:47:14 crc kubenswrapper[4878]: I1202 18:47:14.060571 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-92nsw"] Dec 02 18:47:14 crc kubenswrapper[4878]: I1202 18:47:14.963972 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9672009-e2c7-4540-91ea-737ef2418ac1" path="/var/lib/kubelet/pods/e9672009-e2c7-4540-91ea-737ef2418ac1/volumes" Dec 02 18:47:20 crc kubenswrapper[4878]: I1202 18:47:20.957715 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:47:20 crc kubenswrapper[4878]: E1202 18:47:20.958779 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:47:26 crc kubenswrapper[4878]: I1202 18:47:26.048555 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-wkmpm"] Dec 02 18:47:26 crc kubenswrapper[4878]: I1202 18:47:26.064632 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-wkmpm"] Dec 02 18:47:26 crc kubenswrapper[4878]: I1202 18:47:26.982747 4878 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c0ade7f-399e-4b2f-a250-5f6a47e90baf" path="/var/lib/kubelet/pods/9c0ade7f-399e-4b2f-a250-5f6a47e90baf/volumes" Dec 02 18:47:32 crc kubenswrapper[4878]: I1202 18:47:32.938390 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:47:32 crc kubenswrapper[4878]: E1202 18:47:32.939340 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:47:45 crc kubenswrapper[4878]: I1202 18:47:45.939021 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:47:45 crc kubenswrapper[4878]: E1202 18:47:45.940280 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:47:48 crc kubenswrapper[4878]: I1202 18:47:48.052936 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-4czgq"] Dec 02 18:47:48 crc kubenswrapper[4878]: I1202 18:47:48.066182 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-4czgq"] Dec 02 18:47:48 crc kubenswrapper[4878]: I1202 18:47:48.955202 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="727a4f99-6a27-4d95-a73f-92e9fb4b0500" path="/var/lib/kubelet/pods/727a4f99-6a27-4d95-a73f-92e9fb4b0500/volumes" Dec 02 18:47:59 crc kubenswrapper[4878]: I1202 18:47:59.940989 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:48:01 crc kubenswrapper[4878]: I1202 18:48:01.153798 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"5975f864ead5b21b2dbb178e245c083aa3ed8f628605531725f83f26b592f531"} Dec 02 18:48:01 crc kubenswrapper[4878]: I1202 18:48:01.622988 4878 scope.go:117] "RemoveContainer" containerID="b10aee55cb1ebb6a6d6c82477341c1f3266fbcb29b4fe83eaa5e2bd23d527617" Dec 02 18:48:01 crc kubenswrapper[4878]: I1202 18:48:01.648511 4878 scope.go:117] "RemoveContainer" containerID="1ad976fc2bc596411fc3bcabc782d489cd58255538025830e853c1bd52c4a5de" Dec 02 18:48:01 crc kubenswrapper[4878]: I1202 18:48:01.671137 4878 scope.go:117] "RemoveContainer" containerID="781f7a301f7b90e4eea9fe07539b0aaa3c285bbebd9ba34ffdd8b7262ede5a49" Dec 02 18:48:01 crc kubenswrapper[4878]: I1202 18:48:01.735033 4878 scope.go:117] "RemoveContainer" containerID="a8a5801bbe92f5b112d99995fa026ee15743a7ee2b42de8f736e4a371b40e7fb" Dec 02 18:48:01 crc kubenswrapper[4878]: I1202 18:48:01.769071 4878 scope.go:117] "RemoveContainer" containerID="a17340f3089776901697d2a759a35338cb54fa2cfc6960e53b1881784621ce1c" Dec 02 18:48:01 crc kubenswrapper[4878]: I1202 18:48:01.798718 4878 scope.go:117] "RemoveContainer" containerID="2904b18252f988f050784a0e760f3d41816725358db0b549960fecf3d8318ff0" Dec 02 18:48:01 crc kubenswrapper[4878]: I1202 18:48:01.849969 4878 scope.go:117] "RemoveContainer" containerID="8bb5e914d72d79ab66da3660027d8f7b63ca795c94b88fe0dfc94f55126e1ffe" Dec 02 18:48:01 crc kubenswrapper[4878]: I1202 18:48:01.924199 4878 scope.go:117] "RemoveContainer" 
containerID="42691b44e01ae15ba0eccbd2f9676f2adc242b5c2b983dddaecb6272ce1fb285" Dec 02 18:48:01 crc kubenswrapper[4878]: I1202 18:48:01.978328 4878 scope.go:117] "RemoveContainer" containerID="0d570307ab468142cf95b07cf60a044645f60c831a219c7023be724b3072c59c" Dec 02 18:48:02 crc kubenswrapper[4878]: I1202 18:48:02.061495 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-6t8ds"] Dec 02 18:48:02 crc kubenswrapper[4878]: I1202 18:48:02.061755 4878 scope.go:117] "RemoveContainer" containerID="cfb044178cd269dfa607bc6435f583099465f52b83abe503abfd8ff6081b84b4" Dec 02 18:48:02 crc kubenswrapper[4878]: I1202 18:48:02.083978 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-h6lds"] Dec 02 18:48:02 crc kubenswrapper[4878]: I1202 18:48:02.091470 4878 scope.go:117] "RemoveContainer" containerID="c744fc7a4745cb90252fa158ae2d36fed75660311b4454064242dd13616b386c" Dec 02 18:48:02 crc kubenswrapper[4878]: I1202 18:48:02.097009 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-x42mc"] Dec 02 18:48:02 crc kubenswrapper[4878]: I1202 18:48:02.108628 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-6t8ds"] Dec 02 18:48:02 crc kubenswrapper[4878]: I1202 18:48:02.113676 4878 scope.go:117] "RemoveContainer" containerID="0c18d7b52deca67ac54e1322a747c8b8a715b13364819888fb0266810ff89f44" Dec 02 18:48:02 crc kubenswrapper[4878]: I1202 18:48:02.123302 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-h6lds"] Dec 02 18:48:02 crc kubenswrapper[4878]: I1202 18:48:02.137525 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-x42mc"] Dec 02 18:48:02 crc kubenswrapper[4878]: I1202 18:48:02.151389 4878 scope.go:117] "RemoveContainer" containerID="2c35cc5e2f37dc948eefb0fac2bc88ed1108661dde6f89516e1c5449e16ade97" Dec 02 18:48:02 crc kubenswrapper[4878]: I1202 18:48:02.187374 4878 
scope.go:117] "RemoveContainer" containerID="f024efb68a03b29cc1f42226e55c4ef21f0ca1a387ea330c153c02ca1080382b" Dec 02 18:48:02 crc kubenswrapper[4878]: I1202 18:48:02.952952 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="670fbeeb-ca87-4024-b8d5-ddd470241386" path="/var/lib/kubelet/pods/670fbeeb-ca87-4024-b8d5-ddd470241386/volumes" Dec 02 18:48:02 crc kubenswrapper[4878]: I1202 18:48:02.954293 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c702f83c-0fa3-4ba7-b525-5eddf84355a8" path="/var/lib/kubelet/pods/c702f83c-0fa3-4ba7-b525-5eddf84355a8/volumes" Dec 02 18:48:02 crc kubenswrapper[4878]: I1202 18:48:02.954980 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0bdf1c8-9578-4295-a835-432049516d07" path="/var/lib/kubelet/pods/d0bdf1c8-9578-4295-a835-432049516d07/volumes" Dec 02 18:48:29 crc kubenswrapper[4878]: I1202 18:48:29.046582 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-6qqjx"] Dec 02 18:48:29 crc kubenswrapper[4878]: I1202 18:48:29.059647 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-6qqjx"] Dec 02 18:48:29 crc kubenswrapper[4878]: I1202 18:48:29.275791 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r6wm4"] Dec 02 18:48:29 crc kubenswrapper[4878]: I1202 18:48:29.279887 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r6wm4" Dec 02 18:48:29 crc kubenswrapper[4878]: I1202 18:48:29.304403 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r6wm4"] Dec 02 18:48:29 crc kubenswrapper[4878]: I1202 18:48:29.352177 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hhj6\" (UniqueName: \"kubernetes.io/projected/a3cc2629-febb-4c13-adc2-e6fd87479ac6-kube-api-access-8hhj6\") pod \"redhat-operators-r6wm4\" (UID: \"a3cc2629-febb-4c13-adc2-e6fd87479ac6\") " pod="openshift-marketplace/redhat-operators-r6wm4" Dec 02 18:48:29 crc kubenswrapper[4878]: I1202 18:48:29.352705 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3cc2629-febb-4c13-adc2-e6fd87479ac6-utilities\") pod \"redhat-operators-r6wm4\" (UID: \"a3cc2629-febb-4c13-adc2-e6fd87479ac6\") " pod="openshift-marketplace/redhat-operators-r6wm4" Dec 02 18:48:29 crc kubenswrapper[4878]: I1202 18:48:29.352848 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3cc2629-febb-4c13-adc2-e6fd87479ac6-catalog-content\") pod \"redhat-operators-r6wm4\" (UID: \"a3cc2629-febb-4c13-adc2-e6fd87479ac6\") " pod="openshift-marketplace/redhat-operators-r6wm4" Dec 02 18:48:29 crc kubenswrapper[4878]: I1202 18:48:29.457191 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hhj6\" (UniqueName: \"kubernetes.io/projected/a3cc2629-febb-4c13-adc2-e6fd87479ac6-kube-api-access-8hhj6\") pod \"redhat-operators-r6wm4\" (UID: \"a3cc2629-febb-4c13-adc2-e6fd87479ac6\") " pod="openshift-marketplace/redhat-operators-r6wm4" Dec 02 18:48:29 crc kubenswrapper[4878]: I1202 18:48:29.457313 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3cc2629-febb-4c13-adc2-e6fd87479ac6-utilities\") pod \"redhat-operators-r6wm4\" (UID: \"a3cc2629-febb-4c13-adc2-e6fd87479ac6\") " pod="openshift-marketplace/redhat-operators-r6wm4" Dec 02 18:48:29 crc kubenswrapper[4878]: I1202 18:48:29.457416 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3cc2629-febb-4c13-adc2-e6fd87479ac6-catalog-content\") pod \"redhat-operators-r6wm4\" (UID: \"a3cc2629-febb-4c13-adc2-e6fd87479ac6\") " pod="openshift-marketplace/redhat-operators-r6wm4" Dec 02 18:48:29 crc kubenswrapper[4878]: I1202 18:48:29.457919 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3cc2629-febb-4c13-adc2-e6fd87479ac6-utilities\") pod \"redhat-operators-r6wm4\" (UID: \"a3cc2629-febb-4c13-adc2-e6fd87479ac6\") " pod="openshift-marketplace/redhat-operators-r6wm4" Dec 02 18:48:29 crc kubenswrapper[4878]: I1202 18:48:29.457999 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3cc2629-febb-4c13-adc2-e6fd87479ac6-catalog-content\") pod \"redhat-operators-r6wm4\" (UID: \"a3cc2629-febb-4c13-adc2-e6fd87479ac6\") " pod="openshift-marketplace/redhat-operators-r6wm4" Dec 02 18:48:29 crc kubenswrapper[4878]: I1202 18:48:29.475710 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hhj6\" (UniqueName: \"kubernetes.io/projected/a3cc2629-febb-4c13-adc2-e6fd87479ac6-kube-api-access-8hhj6\") pod \"redhat-operators-r6wm4\" (UID: \"a3cc2629-febb-4c13-adc2-e6fd87479ac6\") " pod="openshift-marketplace/redhat-operators-r6wm4" Dec 02 18:48:29 crc kubenswrapper[4878]: I1202 18:48:29.638299 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r6wm4" Dec 02 18:48:30 crc kubenswrapper[4878]: I1202 18:48:30.187692 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r6wm4"] Dec 02 18:48:30 crc kubenswrapper[4878]: I1202 18:48:30.591953 4878 generic.go:334] "Generic (PLEG): container finished" podID="a3cc2629-febb-4c13-adc2-e6fd87479ac6" containerID="374eca3c802d4ef4735ed9e485357cbd45d2d4fbd1fcb5758a7f974e84cb476b" exitCode=0 Dec 02 18:48:30 crc kubenswrapper[4878]: I1202 18:48:30.592022 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6wm4" event={"ID":"a3cc2629-febb-4c13-adc2-e6fd87479ac6","Type":"ContainerDied","Data":"374eca3c802d4ef4735ed9e485357cbd45d2d4fbd1fcb5758a7f974e84cb476b"} Dec 02 18:48:30 crc kubenswrapper[4878]: I1202 18:48:30.592263 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6wm4" event={"ID":"a3cc2629-febb-4c13-adc2-e6fd87479ac6","Type":"ContainerStarted","Data":"4c982258f099a7eb9b34678340e0e47f406fa5190714ca615b71fc6879c45f88"} Dec 02 18:48:30 crc kubenswrapper[4878]: I1202 18:48:30.593859 4878 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 18:48:30 crc kubenswrapper[4878]: I1202 18:48:30.950767 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7252936-ed87-47f7-b392-7d8fe8388279" path="/var/lib/kubelet/pods/d7252936-ed87-47f7-b392-7d8fe8388279/volumes" Dec 02 18:48:31 crc kubenswrapper[4878]: I1202 18:48:31.609028 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6wm4" event={"ID":"a3cc2629-febb-4c13-adc2-e6fd87479ac6","Type":"ContainerStarted","Data":"ed211aee7d4f8f248ebca07f740d1dc6fdf6f1af6212fe0876e58d57f075a641"} Dec 02 18:48:36 crc kubenswrapper[4878]: I1202 18:48:36.173866 4878 generic.go:334] "Generic (PLEG): container finished" 
podID="a3cc2629-febb-4c13-adc2-e6fd87479ac6" containerID="ed211aee7d4f8f248ebca07f740d1dc6fdf6f1af6212fe0876e58d57f075a641" exitCode=0 Dec 02 18:48:36 crc kubenswrapper[4878]: I1202 18:48:36.173910 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6wm4" event={"ID":"a3cc2629-febb-4c13-adc2-e6fd87479ac6","Type":"ContainerDied","Data":"ed211aee7d4f8f248ebca07f740d1dc6fdf6f1af6212fe0876e58d57f075a641"} Dec 02 18:48:38 crc kubenswrapper[4878]: I1202 18:48:38.204420 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6wm4" event={"ID":"a3cc2629-febb-4c13-adc2-e6fd87479ac6","Type":"ContainerStarted","Data":"938b9a48c9ce9eae3414ee6dbed2b07f721bca97228a0500faec781b823e03ac"} Dec 02 18:48:38 crc kubenswrapper[4878]: I1202 18:48:38.239631 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r6wm4" podStartSLOduration=3.080065739 podStartE2EDuration="9.239609151s" podCreationTimestamp="2025-12-02 18:48:29 +0000 UTC" firstStartedPulling="2025-12-02 18:48:30.59359657 +0000 UTC m=+2020.283215451" lastFinishedPulling="2025-12-02 18:48:36.753139952 +0000 UTC m=+2026.442758863" observedRunningTime="2025-12-02 18:48:38.22866775 +0000 UTC m=+2027.918286641" watchObservedRunningTime="2025-12-02 18:48:38.239609151 +0000 UTC m=+2027.929228032" Dec 02 18:48:39 crc kubenswrapper[4878]: I1202 18:48:39.638940 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r6wm4" Dec 02 18:48:39 crc kubenswrapper[4878]: I1202 18:48:39.639227 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r6wm4" Dec 02 18:48:40 crc kubenswrapper[4878]: I1202 18:48:40.715548 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r6wm4" podUID="a3cc2629-febb-4c13-adc2-e6fd87479ac6" 
containerName="registry-server" probeResult="failure" output=< Dec 02 18:48:40 crc kubenswrapper[4878]: timeout: failed to connect service ":50051" within 1s Dec 02 18:48:40 crc kubenswrapper[4878]: > Dec 02 18:48:49 crc kubenswrapper[4878]: I1202 18:48:49.358592 4878 generic.go:334] "Generic (PLEG): container finished" podID="7b2e71d7-3e0b-4bfa-835c-374dcf03dd86" containerID="ba1b30d1fff2ab4330994337d9c61128a286262c48d5df81190e5761bf3f88d2" exitCode=0 Dec 02 18:48:49 crc kubenswrapper[4878]: I1202 18:48:49.359104 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj" event={"ID":"7b2e71d7-3e0b-4bfa-835c-374dcf03dd86","Type":"ContainerDied","Data":"ba1b30d1fff2ab4330994337d9c61128a286262c48d5df81190e5761bf3f88d2"} Dec 02 18:48:49 crc kubenswrapper[4878]: I1202 18:48:49.686083 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r6wm4" Dec 02 18:48:49 crc kubenswrapper[4878]: I1202 18:48:49.790618 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r6wm4" Dec 02 18:48:49 crc kubenswrapper[4878]: I1202 18:48:49.927891 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r6wm4"] Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.117034 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.310195 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b2e71d7-3e0b-4bfa-835c-374dcf03dd86-ssh-key\") pod \"7b2e71d7-3e0b-4bfa-835c-374dcf03dd86\" (UID: \"7b2e71d7-3e0b-4bfa-835c-374dcf03dd86\") " Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.310632 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b2e71d7-3e0b-4bfa-835c-374dcf03dd86-inventory\") pod \"7b2e71d7-3e0b-4bfa-835c-374dcf03dd86\" (UID: \"7b2e71d7-3e0b-4bfa-835c-374dcf03dd86\") " Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.310767 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whvwc\" (UniqueName: \"kubernetes.io/projected/7b2e71d7-3e0b-4bfa-835c-374dcf03dd86-kube-api-access-whvwc\") pod \"7b2e71d7-3e0b-4bfa-835c-374dcf03dd86\" (UID: \"7b2e71d7-3e0b-4bfa-835c-374dcf03dd86\") " Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.322118 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b2e71d7-3e0b-4bfa-835c-374dcf03dd86-kube-api-access-whvwc" (OuterVolumeSpecName: "kube-api-access-whvwc") pod "7b2e71d7-3e0b-4bfa-835c-374dcf03dd86" (UID: "7b2e71d7-3e0b-4bfa-835c-374dcf03dd86"). InnerVolumeSpecName "kube-api-access-whvwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.346906 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b2e71d7-3e0b-4bfa-835c-374dcf03dd86-inventory" (OuterVolumeSpecName: "inventory") pod "7b2e71d7-3e0b-4bfa-835c-374dcf03dd86" (UID: "7b2e71d7-3e0b-4bfa-835c-374dcf03dd86"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.356860 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b2e71d7-3e0b-4bfa-835c-374dcf03dd86-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7b2e71d7-3e0b-4bfa-835c-374dcf03dd86" (UID: "7b2e71d7-3e0b-4bfa-835c-374dcf03dd86"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.386142 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj" event={"ID":"7b2e71d7-3e0b-4bfa-835c-374dcf03dd86","Type":"ContainerDied","Data":"01fb9ab8429048443a164055df0e98111a88e29e2cd3602ec39fb7355405308b"} Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.386210 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01fb9ab8429048443a164055df0e98111a88e29e2cd3602ec39fb7355405308b" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.386166 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-l59tj" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.386356 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r6wm4" podUID="a3cc2629-febb-4c13-adc2-e6fd87479ac6" containerName="registry-server" containerID="cri-o://938b9a48c9ce9eae3414ee6dbed2b07f721bca97228a0500faec781b823e03ac" gracePeriod=2 Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.414048 4878 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b2e71d7-3e0b-4bfa-835c-374dcf03dd86-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.414103 4878 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b2e71d7-3e0b-4bfa-835c-374dcf03dd86-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.414119 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whvwc\" (UniqueName: \"kubernetes.io/projected/7b2e71d7-3e0b-4bfa-835c-374dcf03dd86-kube-api-access-whvwc\") on node \"crc\" DevicePath \"\"" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.475539 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z"] Dec 02 18:48:51 crc kubenswrapper[4878]: E1202 18:48:51.476199 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b2e71d7-3e0b-4bfa-835c-374dcf03dd86" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.476218 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b2e71d7-3e0b-4bfa-835c-374dcf03dd86" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.476566 4878 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7b2e71d7-3e0b-4bfa-835c-374dcf03dd86" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.477576 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.480699 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.481651 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.481783 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.481891 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szvc8" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.500337 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z"] Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.524670 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f8b1f89-0eef-426b-8bb0-8700c42ede2e-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cf87z\" (UID: \"2f8b1f89-0eef-426b-8bb0-8700c42ede2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.525216 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjrnj\" (UniqueName: 
\"kubernetes.io/projected/2f8b1f89-0eef-426b-8bb0-8700c42ede2e-kube-api-access-hjrnj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cf87z\" (UID: \"2f8b1f89-0eef-426b-8bb0-8700c42ede2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.525409 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f8b1f89-0eef-426b-8bb0-8700c42ede2e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cf87z\" (UID: \"2f8b1f89-0eef-426b-8bb0-8700c42ede2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.626900 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjrnj\" (UniqueName: \"kubernetes.io/projected/2f8b1f89-0eef-426b-8bb0-8700c42ede2e-kube-api-access-hjrnj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cf87z\" (UID: \"2f8b1f89-0eef-426b-8bb0-8700c42ede2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.626968 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f8b1f89-0eef-426b-8bb0-8700c42ede2e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cf87z\" (UID: \"2f8b1f89-0eef-426b-8bb0-8700c42ede2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.627093 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f8b1f89-0eef-426b-8bb0-8700c42ede2e-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cf87z\" (UID: \"2f8b1f89-0eef-426b-8bb0-8700c42ede2e\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.632902 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f8b1f89-0eef-426b-8bb0-8700c42ede2e-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cf87z\" (UID: \"2f8b1f89-0eef-426b-8bb0-8700c42ede2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.638101 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f8b1f89-0eef-426b-8bb0-8700c42ede2e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cf87z\" (UID: \"2f8b1f89-0eef-426b-8bb0-8700c42ede2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.652080 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjrnj\" (UniqueName: \"kubernetes.io/projected/2f8b1f89-0eef-426b-8bb0-8700c42ede2e-kube-api-access-hjrnj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cf87z\" (UID: \"2f8b1f89-0eef-426b-8bb0-8700c42ede2e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.897116 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r6wm4" Dec 02 18:48:51 crc kubenswrapper[4878]: I1202 18:48:51.900692 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z" Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.037205 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hhj6\" (UniqueName: \"kubernetes.io/projected/a3cc2629-febb-4c13-adc2-e6fd87479ac6-kube-api-access-8hhj6\") pod \"a3cc2629-febb-4c13-adc2-e6fd87479ac6\" (UID: \"a3cc2629-febb-4c13-adc2-e6fd87479ac6\") " Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.038349 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3cc2629-febb-4c13-adc2-e6fd87479ac6-catalog-content\") pod \"a3cc2629-febb-4c13-adc2-e6fd87479ac6\" (UID: \"a3cc2629-febb-4c13-adc2-e6fd87479ac6\") " Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.038487 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3cc2629-febb-4c13-adc2-e6fd87479ac6-utilities\") pod \"a3cc2629-febb-4c13-adc2-e6fd87479ac6\" (UID: \"a3cc2629-febb-4c13-adc2-e6fd87479ac6\") " Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.040200 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3cc2629-febb-4c13-adc2-e6fd87479ac6-utilities" (OuterVolumeSpecName: "utilities") pod "a3cc2629-febb-4c13-adc2-e6fd87479ac6" (UID: "a3cc2629-febb-4c13-adc2-e6fd87479ac6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.040953 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3cc2629-febb-4c13-adc2-e6fd87479ac6-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.061481 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3cc2629-febb-4c13-adc2-e6fd87479ac6-kube-api-access-8hhj6" (OuterVolumeSpecName: "kube-api-access-8hhj6") pod "a3cc2629-febb-4c13-adc2-e6fd87479ac6" (UID: "a3cc2629-febb-4c13-adc2-e6fd87479ac6"). InnerVolumeSpecName "kube-api-access-8hhj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.143218 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hhj6\" (UniqueName: \"kubernetes.io/projected/a3cc2629-febb-4c13-adc2-e6fd87479ac6-kube-api-access-8hhj6\") on node \"crc\" DevicePath \"\"" Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.177815 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3cc2629-febb-4c13-adc2-e6fd87479ac6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3cc2629-febb-4c13-adc2-e6fd87479ac6" (UID: "a3cc2629-febb-4c13-adc2-e6fd87479ac6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.247157 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3cc2629-febb-4c13-adc2-e6fd87479ac6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.405121 4878 generic.go:334] "Generic (PLEG): container finished" podID="a3cc2629-febb-4c13-adc2-e6fd87479ac6" containerID="938b9a48c9ce9eae3414ee6dbed2b07f721bca97228a0500faec781b823e03ac" exitCode=0 Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.405185 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6wm4" event={"ID":"a3cc2629-febb-4c13-adc2-e6fd87479ac6","Type":"ContainerDied","Data":"938b9a48c9ce9eae3414ee6dbed2b07f721bca97228a0500faec781b823e03ac"} Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.405277 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6wm4" event={"ID":"a3cc2629-febb-4c13-adc2-e6fd87479ac6","Type":"ContainerDied","Data":"4c982258f099a7eb9b34678340e0e47f406fa5190714ca615b71fc6879c45f88"} Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.405309 4878 scope.go:117] "RemoveContainer" containerID="938b9a48c9ce9eae3414ee6dbed2b07f721bca97228a0500faec781b823e03ac" Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.405554 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r6wm4" Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.461431 4878 scope.go:117] "RemoveContainer" containerID="ed211aee7d4f8f248ebca07f740d1dc6fdf6f1af6212fe0876e58d57f075a641" Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.471850 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r6wm4"] Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.495147 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r6wm4"] Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.496013 4878 scope.go:117] "RemoveContainer" containerID="374eca3c802d4ef4735ed9e485357cbd45d2d4fbd1fcb5758a7f974e84cb476b" Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.551477 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z"] Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.568785 4878 scope.go:117] "RemoveContainer" containerID="938b9a48c9ce9eae3414ee6dbed2b07f721bca97228a0500faec781b823e03ac" Dec 02 18:48:52 crc kubenswrapper[4878]: E1202 18:48:52.569198 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"938b9a48c9ce9eae3414ee6dbed2b07f721bca97228a0500faec781b823e03ac\": container with ID starting with 938b9a48c9ce9eae3414ee6dbed2b07f721bca97228a0500faec781b823e03ac not found: ID does not exist" containerID="938b9a48c9ce9eae3414ee6dbed2b07f721bca97228a0500faec781b823e03ac" Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.569272 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"938b9a48c9ce9eae3414ee6dbed2b07f721bca97228a0500faec781b823e03ac"} err="failed to get container status \"938b9a48c9ce9eae3414ee6dbed2b07f721bca97228a0500faec781b823e03ac\": rpc error: code = NotFound desc = could not find 
container \"938b9a48c9ce9eae3414ee6dbed2b07f721bca97228a0500faec781b823e03ac\": container with ID starting with 938b9a48c9ce9eae3414ee6dbed2b07f721bca97228a0500faec781b823e03ac not found: ID does not exist" Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.569307 4878 scope.go:117] "RemoveContainer" containerID="ed211aee7d4f8f248ebca07f740d1dc6fdf6f1af6212fe0876e58d57f075a641" Dec 02 18:48:52 crc kubenswrapper[4878]: E1202 18:48:52.569751 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed211aee7d4f8f248ebca07f740d1dc6fdf6f1af6212fe0876e58d57f075a641\": container with ID starting with ed211aee7d4f8f248ebca07f740d1dc6fdf6f1af6212fe0876e58d57f075a641 not found: ID does not exist" containerID="ed211aee7d4f8f248ebca07f740d1dc6fdf6f1af6212fe0876e58d57f075a641" Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.569814 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed211aee7d4f8f248ebca07f740d1dc6fdf6f1af6212fe0876e58d57f075a641"} err="failed to get container status \"ed211aee7d4f8f248ebca07f740d1dc6fdf6f1af6212fe0876e58d57f075a641\": rpc error: code = NotFound desc = could not find container \"ed211aee7d4f8f248ebca07f740d1dc6fdf6f1af6212fe0876e58d57f075a641\": container with ID starting with ed211aee7d4f8f248ebca07f740d1dc6fdf6f1af6212fe0876e58d57f075a641 not found: ID does not exist" Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.569865 4878 scope.go:117] "RemoveContainer" containerID="374eca3c802d4ef4735ed9e485357cbd45d2d4fbd1fcb5758a7f974e84cb476b" Dec 02 18:48:52 crc kubenswrapper[4878]: E1202 18:48:52.570305 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"374eca3c802d4ef4735ed9e485357cbd45d2d4fbd1fcb5758a7f974e84cb476b\": container with ID starting with 374eca3c802d4ef4735ed9e485357cbd45d2d4fbd1fcb5758a7f974e84cb476b not found: ID does 
not exist" containerID="374eca3c802d4ef4735ed9e485357cbd45d2d4fbd1fcb5758a7f974e84cb476b" Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.570348 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"374eca3c802d4ef4735ed9e485357cbd45d2d4fbd1fcb5758a7f974e84cb476b"} err="failed to get container status \"374eca3c802d4ef4735ed9e485357cbd45d2d4fbd1fcb5758a7f974e84cb476b\": rpc error: code = NotFound desc = could not find container \"374eca3c802d4ef4735ed9e485357cbd45d2d4fbd1fcb5758a7f974e84cb476b\": container with ID starting with 374eca3c802d4ef4735ed9e485357cbd45d2d4fbd1fcb5758a7f974e84cb476b not found: ID does not exist" Dec 02 18:48:52 crc kubenswrapper[4878]: I1202 18:48:52.957102 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3cc2629-febb-4c13-adc2-e6fd87479ac6" path="/var/lib/kubelet/pods/a3cc2629-febb-4c13-adc2-e6fd87479ac6/volumes" Dec 02 18:48:53 crc kubenswrapper[4878]: I1202 18:48:53.421798 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z" event={"ID":"2f8b1f89-0eef-426b-8bb0-8700c42ede2e","Type":"ContainerStarted","Data":"001d6c7345a42927c7288257db26330b9607bf0d4f64393c45cc614da0328845"} Dec 02 18:48:54 crc kubenswrapper[4878]: I1202 18:48:54.472102 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z" event={"ID":"2f8b1f89-0eef-426b-8bb0-8700c42ede2e","Type":"ContainerStarted","Data":"8f8dc6852007bda7a3c5b49ae167d64b6b88e63b3b94d08c27996bc459f50c54"} Dec 02 18:48:54 crc kubenswrapper[4878]: I1202 18:48:54.506558 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z" podStartSLOduration=2.855267801 podStartE2EDuration="3.506527493s" podCreationTimestamp="2025-12-02 18:48:51 +0000 UTC" firstStartedPulling="2025-12-02 
18:48:52.569001421 +0000 UTC m=+2042.258620322" lastFinishedPulling="2025-12-02 18:48:53.220261123 +0000 UTC m=+2042.909880014" observedRunningTime="2025-12-02 18:48:54.49038456 +0000 UTC m=+2044.180003441" watchObservedRunningTime="2025-12-02 18:48:54.506527493 +0000 UTC m=+2044.196146384" Dec 02 18:49:02 crc kubenswrapper[4878]: I1202 18:49:02.444891 4878 scope.go:117] "RemoveContainer" containerID="0b5dc9ab2161e6b65edb6dc0055e1a8cb2f405a4236c70ece22a0f7aa3ac35c1" Dec 02 18:49:02 crc kubenswrapper[4878]: I1202 18:49:02.589397 4878 scope.go:117] "RemoveContainer" containerID="d13e1834896505007437ee3e8dcb1e0c0c9663ed054383ffe0ba2c61aa44ae40" Dec 02 18:49:02 crc kubenswrapper[4878]: I1202 18:49:02.646091 4878 scope.go:117] "RemoveContainer" containerID="bea3d4d9b0e638bd1c0e1ca7227f69f3d655d8f3beaeabc0bd0056f9a2690148" Dec 02 18:49:02 crc kubenswrapper[4878]: I1202 18:49:02.681495 4878 scope.go:117] "RemoveContainer" containerID="2b24a441c9b26e2ace023f0e8d27ed879b706b1209de208ec786997acbf8cbf5" Dec 02 18:49:12 crc kubenswrapper[4878]: I1202 18:49:12.123128 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9qmlf"] Dec 02 18:49:12 crc kubenswrapper[4878]: E1202 18:49:12.124394 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3cc2629-febb-4c13-adc2-e6fd87479ac6" containerName="extract-utilities" Dec 02 18:49:12 crc kubenswrapper[4878]: I1202 18:49:12.124414 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3cc2629-febb-4c13-adc2-e6fd87479ac6" containerName="extract-utilities" Dec 02 18:49:12 crc kubenswrapper[4878]: E1202 18:49:12.124461 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3cc2629-febb-4c13-adc2-e6fd87479ac6" containerName="registry-server" Dec 02 18:49:12 crc kubenswrapper[4878]: I1202 18:49:12.124469 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3cc2629-febb-4c13-adc2-e6fd87479ac6" containerName="registry-server" Dec 02 18:49:12 crc 
kubenswrapper[4878]: E1202 18:49:12.124506 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3cc2629-febb-4c13-adc2-e6fd87479ac6" containerName="extract-content" Dec 02 18:49:12 crc kubenswrapper[4878]: I1202 18:49:12.124513 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3cc2629-febb-4c13-adc2-e6fd87479ac6" containerName="extract-content" Dec 02 18:49:12 crc kubenswrapper[4878]: I1202 18:49:12.125280 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3cc2629-febb-4c13-adc2-e6fd87479ac6" containerName="registry-server" Dec 02 18:49:12 crc kubenswrapper[4878]: I1202 18:49:12.127793 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qmlf" Dec 02 18:49:12 crc kubenswrapper[4878]: I1202 18:49:12.144132 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9qmlf"] Dec 02 18:49:12 crc kubenswrapper[4878]: I1202 18:49:12.309999 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de7d216-e31c-4ccc-956b-0de1c1ae4d28-utilities\") pod \"certified-operators-9qmlf\" (UID: \"8de7d216-e31c-4ccc-956b-0de1c1ae4d28\") " pod="openshift-marketplace/certified-operators-9qmlf" Dec 02 18:49:12 crc kubenswrapper[4878]: I1202 18:49:12.310380 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de7d216-e31c-4ccc-956b-0de1c1ae4d28-catalog-content\") pod \"certified-operators-9qmlf\" (UID: \"8de7d216-e31c-4ccc-956b-0de1c1ae4d28\") " pod="openshift-marketplace/certified-operators-9qmlf" Dec 02 18:49:12 crc kubenswrapper[4878]: I1202 18:49:12.310409 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22nzw\" (UniqueName: 
\"kubernetes.io/projected/8de7d216-e31c-4ccc-956b-0de1c1ae4d28-kube-api-access-22nzw\") pod \"certified-operators-9qmlf\" (UID: \"8de7d216-e31c-4ccc-956b-0de1c1ae4d28\") " pod="openshift-marketplace/certified-operators-9qmlf" Dec 02 18:49:12 crc kubenswrapper[4878]: I1202 18:49:12.413427 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de7d216-e31c-4ccc-956b-0de1c1ae4d28-utilities\") pod \"certified-operators-9qmlf\" (UID: \"8de7d216-e31c-4ccc-956b-0de1c1ae4d28\") " pod="openshift-marketplace/certified-operators-9qmlf" Dec 02 18:49:12 crc kubenswrapper[4878]: I1202 18:49:12.413631 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de7d216-e31c-4ccc-956b-0de1c1ae4d28-catalog-content\") pod \"certified-operators-9qmlf\" (UID: \"8de7d216-e31c-4ccc-956b-0de1c1ae4d28\") " pod="openshift-marketplace/certified-operators-9qmlf" Dec 02 18:49:12 crc kubenswrapper[4878]: I1202 18:49:12.413674 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22nzw\" (UniqueName: \"kubernetes.io/projected/8de7d216-e31c-4ccc-956b-0de1c1ae4d28-kube-api-access-22nzw\") pod \"certified-operators-9qmlf\" (UID: \"8de7d216-e31c-4ccc-956b-0de1c1ae4d28\") " pod="openshift-marketplace/certified-operators-9qmlf" Dec 02 18:49:12 crc kubenswrapper[4878]: I1202 18:49:12.413945 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de7d216-e31c-4ccc-956b-0de1c1ae4d28-utilities\") pod \"certified-operators-9qmlf\" (UID: \"8de7d216-e31c-4ccc-956b-0de1c1ae4d28\") " pod="openshift-marketplace/certified-operators-9qmlf" Dec 02 18:49:12 crc kubenswrapper[4878]: I1202 18:49:12.414154 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8de7d216-e31c-4ccc-956b-0de1c1ae4d28-catalog-content\") pod \"certified-operators-9qmlf\" (UID: \"8de7d216-e31c-4ccc-956b-0de1c1ae4d28\") " pod="openshift-marketplace/certified-operators-9qmlf" Dec 02 18:49:12 crc kubenswrapper[4878]: I1202 18:49:12.445409 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22nzw\" (UniqueName: \"kubernetes.io/projected/8de7d216-e31c-4ccc-956b-0de1c1ae4d28-kube-api-access-22nzw\") pod \"certified-operators-9qmlf\" (UID: \"8de7d216-e31c-4ccc-956b-0de1c1ae4d28\") " pod="openshift-marketplace/certified-operators-9qmlf" Dec 02 18:49:12 crc kubenswrapper[4878]: I1202 18:49:12.458724 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qmlf" Dec 02 18:49:13 crc kubenswrapper[4878]: I1202 18:49:13.037132 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9qmlf"] Dec 02 18:49:13 crc kubenswrapper[4878]: I1202 18:49:13.801644 4878 generic.go:334] "Generic (PLEG): container finished" podID="8de7d216-e31c-4ccc-956b-0de1c1ae4d28" containerID="7e19a4f4f607ccdfd64b5a23addb0da5f9f643983db07b8f2c279b70d9cd8c47" exitCode=0 Dec 02 18:49:13 crc kubenswrapper[4878]: I1202 18:49:13.801847 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qmlf" event={"ID":"8de7d216-e31c-4ccc-956b-0de1c1ae4d28","Type":"ContainerDied","Data":"7e19a4f4f607ccdfd64b5a23addb0da5f9f643983db07b8f2c279b70d9cd8c47"} Dec 02 18:49:13 crc kubenswrapper[4878]: I1202 18:49:13.801933 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qmlf" event={"ID":"8de7d216-e31c-4ccc-956b-0de1c1ae4d28","Type":"ContainerStarted","Data":"fb6351faad54d25c36e6c12476e0fa15ad1e301c381bec22c3584e02f79c7954"} Dec 02 18:49:14 crc kubenswrapper[4878]: I1202 18:49:14.816598 4878 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-9qmlf" event={"ID":"8de7d216-e31c-4ccc-956b-0de1c1ae4d28","Type":"ContainerStarted","Data":"99e2728f84068946072e175227d144ce93b88e63c24e1cb1e3a1a07cb12bde05"} Dec 02 18:49:15 crc kubenswrapper[4878]: I1202 18:49:15.829112 4878 generic.go:334] "Generic (PLEG): container finished" podID="8de7d216-e31c-4ccc-956b-0de1c1ae4d28" containerID="99e2728f84068946072e175227d144ce93b88e63c24e1cb1e3a1a07cb12bde05" exitCode=0 Dec 02 18:49:15 crc kubenswrapper[4878]: I1202 18:49:15.829225 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qmlf" event={"ID":"8de7d216-e31c-4ccc-956b-0de1c1ae4d28","Type":"ContainerDied","Data":"99e2728f84068946072e175227d144ce93b88e63c24e1cb1e3a1a07cb12bde05"} Dec 02 18:49:16 crc kubenswrapper[4878]: I1202 18:49:16.862011 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qmlf" event={"ID":"8de7d216-e31c-4ccc-956b-0de1c1ae4d28","Type":"ContainerStarted","Data":"642c53da8eaca157a366fd05db9ff8df8b2033b6b4f534dd2eaf69e74ebd2534"} Dec 02 18:49:16 crc kubenswrapper[4878]: I1202 18:49:16.888020 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9qmlf" podStartSLOduration=2.426828291 podStartE2EDuration="4.887989587s" podCreationTimestamp="2025-12-02 18:49:12 +0000 UTC" firstStartedPulling="2025-12-02 18:49:13.804632404 +0000 UTC m=+2063.494251285" lastFinishedPulling="2025-12-02 18:49:16.2657937 +0000 UTC m=+2065.955412581" observedRunningTime="2025-12-02 18:49:16.884125476 +0000 UTC m=+2066.573744367" watchObservedRunningTime="2025-12-02 18:49:16.887989587 +0000 UTC m=+2066.577608468" Dec 02 18:49:20 crc kubenswrapper[4878]: I1202 18:49:20.649589 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sfd8n"] Dec 02 18:49:20 crc kubenswrapper[4878]: I1202 18:49:20.653981 4878 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfd8n" Dec 02 18:49:20 crc kubenswrapper[4878]: I1202 18:49:20.676730 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfd8n"] Dec 02 18:49:20 crc kubenswrapper[4878]: I1202 18:49:20.738501 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0c85529-f0f9-43b3-9ce8-0129b9da83a4-catalog-content\") pod \"redhat-marketplace-sfd8n\" (UID: \"d0c85529-f0f9-43b3-9ce8-0129b9da83a4\") " pod="openshift-marketplace/redhat-marketplace-sfd8n" Dec 02 18:49:20 crc kubenswrapper[4878]: I1202 18:49:20.739136 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnxs5\" (UniqueName: \"kubernetes.io/projected/d0c85529-f0f9-43b3-9ce8-0129b9da83a4-kube-api-access-lnxs5\") pod \"redhat-marketplace-sfd8n\" (UID: \"d0c85529-f0f9-43b3-9ce8-0129b9da83a4\") " pod="openshift-marketplace/redhat-marketplace-sfd8n" Dec 02 18:49:20 crc kubenswrapper[4878]: I1202 18:49:20.739274 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0c85529-f0f9-43b3-9ce8-0129b9da83a4-utilities\") pod \"redhat-marketplace-sfd8n\" (UID: \"d0c85529-f0f9-43b3-9ce8-0129b9da83a4\") " pod="openshift-marketplace/redhat-marketplace-sfd8n" Dec 02 18:49:20 crc kubenswrapper[4878]: I1202 18:49:20.842083 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnxs5\" (UniqueName: \"kubernetes.io/projected/d0c85529-f0f9-43b3-9ce8-0129b9da83a4-kube-api-access-lnxs5\") pod \"redhat-marketplace-sfd8n\" (UID: \"d0c85529-f0f9-43b3-9ce8-0129b9da83a4\") " pod="openshift-marketplace/redhat-marketplace-sfd8n" Dec 02 18:49:20 crc kubenswrapper[4878]: I1202 18:49:20.842221 
4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0c85529-f0f9-43b3-9ce8-0129b9da83a4-utilities\") pod \"redhat-marketplace-sfd8n\" (UID: \"d0c85529-f0f9-43b3-9ce8-0129b9da83a4\") " pod="openshift-marketplace/redhat-marketplace-sfd8n" Dec 02 18:49:20 crc kubenswrapper[4878]: I1202 18:49:20.842398 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0c85529-f0f9-43b3-9ce8-0129b9da83a4-catalog-content\") pod \"redhat-marketplace-sfd8n\" (UID: \"d0c85529-f0f9-43b3-9ce8-0129b9da83a4\") " pod="openshift-marketplace/redhat-marketplace-sfd8n" Dec 02 18:49:20 crc kubenswrapper[4878]: I1202 18:49:20.842805 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0c85529-f0f9-43b3-9ce8-0129b9da83a4-utilities\") pod \"redhat-marketplace-sfd8n\" (UID: \"d0c85529-f0f9-43b3-9ce8-0129b9da83a4\") " pod="openshift-marketplace/redhat-marketplace-sfd8n" Dec 02 18:49:20 crc kubenswrapper[4878]: I1202 18:49:20.842824 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0c85529-f0f9-43b3-9ce8-0129b9da83a4-catalog-content\") pod \"redhat-marketplace-sfd8n\" (UID: \"d0c85529-f0f9-43b3-9ce8-0129b9da83a4\") " pod="openshift-marketplace/redhat-marketplace-sfd8n" Dec 02 18:49:20 crc kubenswrapper[4878]: I1202 18:49:20.871300 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnxs5\" (UniqueName: \"kubernetes.io/projected/d0c85529-f0f9-43b3-9ce8-0129b9da83a4-kube-api-access-lnxs5\") pod \"redhat-marketplace-sfd8n\" (UID: \"d0c85529-f0f9-43b3-9ce8-0129b9da83a4\") " pod="openshift-marketplace/redhat-marketplace-sfd8n" Dec 02 18:49:21 crc kubenswrapper[4878]: I1202 18:49:21.001413 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfd8n" Dec 02 18:49:21 crc kubenswrapper[4878]: I1202 18:49:21.562296 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfd8n"] Dec 02 18:49:21 crc kubenswrapper[4878]: I1202 18:49:21.938475 4878 generic.go:334] "Generic (PLEG): container finished" podID="d0c85529-f0f9-43b3-9ce8-0129b9da83a4" containerID="00c25dbbaad491391b6db6cc4026330099d1da790274691845f3358565dc591e" exitCode=0 Dec 02 18:49:21 crc kubenswrapper[4878]: I1202 18:49:21.938532 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfd8n" event={"ID":"d0c85529-f0f9-43b3-9ce8-0129b9da83a4","Type":"ContainerDied","Data":"00c25dbbaad491391b6db6cc4026330099d1da790274691845f3358565dc591e"} Dec 02 18:49:21 crc kubenswrapper[4878]: I1202 18:49:21.938556 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfd8n" event={"ID":"d0c85529-f0f9-43b3-9ce8-0129b9da83a4","Type":"ContainerStarted","Data":"b50c2784b792e71f23519c0672d7008da7289c0a054cb3c254de747fb90e0076"} Dec 02 18:49:22 crc kubenswrapper[4878]: I1202 18:49:22.459840 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9qmlf" Dec 02 18:49:22 crc kubenswrapper[4878]: I1202 18:49:22.460290 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9qmlf" Dec 02 18:49:22 crc kubenswrapper[4878]: I1202 18:49:22.550509 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9qmlf" Dec 02 18:49:22 crc kubenswrapper[4878]: I1202 18:49:22.965595 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfd8n" 
event={"ID":"d0c85529-f0f9-43b3-9ce8-0129b9da83a4","Type":"ContainerStarted","Data":"8e80e96f08d494c8c755ae897ce063c26a944e774c34623ab90e0bbde3779eff"} Dec 02 18:49:23 crc kubenswrapper[4878]: I1202 18:49:23.039933 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9qmlf" Dec 02 18:49:23 crc kubenswrapper[4878]: I1202 18:49:23.971042 4878 generic.go:334] "Generic (PLEG): container finished" podID="d0c85529-f0f9-43b3-9ce8-0129b9da83a4" containerID="8e80e96f08d494c8c755ae897ce063c26a944e774c34623ab90e0bbde3779eff" exitCode=0 Dec 02 18:49:23 crc kubenswrapper[4878]: I1202 18:49:23.971173 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfd8n" event={"ID":"d0c85529-f0f9-43b3-9ce8-0129b9da83a4","Type":"ContainerDied","Data":"8e80e96f08d494c8c755ae897ce063c26a944e774c34623ab90e0bbde3779eff"} Dec 02 18:49:24 crc kubenswrapper[4878]: I1202 18:49:24.815021 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9qmlf"] Dec 02 18:49:24 crc kubenswrapper[4878]: I1202 18:49:24.987976 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfd8n" event={"ID":"d0c85529-f0f9-43b3-9ce8-0129b9da83a4","Type":"ContainerStarted","Data":"17f09279c6dd3ba4472962ee4aff05d3f10bae85fefb89b5480dd93d5aa3b535"} Dec 02 18:49:24 crc kubenswrapper[4878]: I1202 18:49:24.988132 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9qmlf" podUID="8de7d216-e31c-4ccc-956b-0de1c1ae4d28" containerName="registry-server" containerID="cri-o://642c53da8eaca157a366fd05db9ff8df8b2033b6b4f534dd2eaf69e74ebd2534" gracePeriod=2 Dec 02 18:49:25 crc kubenswrapper[4878]: I1202 18:49:25.028754 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sfd8n" 
podStartSLOduration=2.502579321 podStartE2EDuration="5.028688282s" podCreationTimestamp="2025-12-02 18:49:20 +0000 UTC" firstStartedPulling="2025-12-02 18:49:21.940596447 +0000 UTC m=+2071.630215328" lastFinishedPulling="2025-12-02 18:49:24.466705378 +0000 UTC m=+2074.156324289" observedRunningTime="2025-12-02 18:49:25.016958836 +0000 UTC m=+2074.706577727" watchObservedRunningTime="2025-12-02 18:49:25.028688282 +0000 UTC m=+2074.718307173" Dec 02 18:49:25 crc kubenswrapper[4878]: I1202 18:49:25.511846 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qmlf" Dec 02 18:49:25 crc kubenswrapper[4878]: I1202 18:49:25.682109 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de7d216-e31c-4ccc-956b-0de1c1ae4d28-utilities\") pod \"8de7d216-e31c-4ccc-956b-0de1c1ae4d28\" (UID: \"8de7d216-e31c-4ccc-956b-0de1c1ae4d28\") " Dec 02 18:49:25 crc kubenswrapper[4878]: I1202 18:49:25.682829 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de7d216-e31c-4ccc-956b-0de1c1ae4d28-catalog-content\") pod \"8de7d216-e31c-4ccc-956b-0de1c1ae4d28\" (UID: \"8de7d216-e31c-4ccc-956b-0de1c1ae4d28\") " Dec 02 18:49:25 crc kubenswrapper[4878]: I1202 18:49:25.683167 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22nzw\" (UniqueName: \"kubernetes.io/projected/8de7d216-e31c-4ccc-956b-0de1c1ae4d28-kube-api-access-22nzw\") pod \"8de7d216-e31c-4ccc-956b-0de1c1ae4d28\" (UID: \"8de7d216-e31c-4ccc-956b-0de1c1ae4d28\") " Dec 02 18:49:25 crc kubenswrapper[4878]: I1202 18:49:25.683324 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8de7d216-e31c-4ccc-956b-0de1c1ae4d28-utilities" (OuterVolumeSpecName: "utilities") pod 
"8de7d216-e31c-4ccc-956b-0de1c1ae4d28" (UID: "8de7d216-e31c-4ccc-956b-0de1c1ae4d28"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:49:25 crc kubenswrapper[4878]: I1202 18:49:25.684281 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de7d216-e31c-4ccc-956b-0de1c1ae4d28-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:49:25 crc kubenswrapper[4878]: I1202 18:49:25.693620 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de7d216-e31c-4ccc-956b-0de1c1ae4d28-kube-api-access-22nzw" (OuterVolumeSpecName: "kube-api-access-22nzw") pod "8de7d216-e31c-4ccc-956b-0de1c1ae4d28" (UID: "8de7d216-e31c-4ccc-956b-0de1c1ae4d28"). InnerVolumeSpecName "kube-api-access-22nzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:49:25 crc kubenswrapper[4878]: I1202 18:49:25.737838 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8de7d216-e31c-4ccc-956b-0de1c1ae4d28-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8de7d216-e31c-4ccc-956b-0de1c1ae4d28" (UID: "8de7d216-e31c-4ccc-956b-0de1c1ae4d28"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:49:25 crc kubenswrapper[4878]: I1202 18:49:25.786645 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de7d216-e31c-4ccc-956b-0de1c1ae4d28-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:49:25 crc kubenswrapper[4878]: I1202 18:49:25.786904 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22nzw\" (UniqueName: \"kubernetes.io/projected/8de7d216-e31c-4ccc-956b-0de1c1ae4d28-kube-api-access-22nzw\") on node \"crc\" DevicePath \"\"" Dec 02 18:49:26 crc kubenswrapper[4878]: I1202 18:49:26.002211 4878 generic.go:334] "Generic (PLEG): container finished" podID="8de7d216-e31c-4ccc-956b-0de1c1ae4d28" containerID="642c53da8eaca157a366fd05db9ff8df8b2033b6b4f534dd2eaf69e74ebd2534" exitCode=0 Dec 02 18:49:26 crc kubenswrapper[4878]: I1202 18:49:26.002322 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qmlf" Dec 02 18:49:26 crc kubenswrapper[4878]: I1202 18:49:26.002348 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qmlf" event={"ID":"8de7d216-e31c-4ccc-956b-0de1c1ae4d28","Type":"ContainerDied","Data":"642c53da8eaca157a366fd05db9ff8df8b2033b6b4f534dd2eaf69e74ebd2534"} Dec 02 18:49:26 crc kubenswrapper[4878]: I1202 18:49:26.003567 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qmlf" event={"ID":"8de7d216-e31c-4ccc-956b-0de1c1ae4d28","Type":"ContainerDied","Data":"fb6351faad54d25c36e6c12476e0fa15ad1e301c381bec22c3584e02f79c7954"} Dec 02 18:49:26 crc kubenswrapper[4878]: I1202 18:49:26.003592 4878 scope.go:117] "RemoveContainer" containerID="642c53da8eaca157a366fd05db9ff8df8b2033b6b4f534dd2eaf69e74ebd2534" Dec 02 18:49:26 crc kubenswrapper[4878]: I1202 18:49:26.029035 4878 scope.go:117] "RemoveContainer" 
containerID="99e2728f84068946072e175227d144ce93b88e63c24e1cb1e3a1a07cb12bde05" Dec 02 18:49:26 crc kubenswrapper[4878]: I1202 18:49:26.053132 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9qmlf"] Dec 02 18:49:26 crc kubenswrapper[4878]: I1202 18:49:26.072725 4878 scope.go:117] "RemoveContainer" containerID="7e19a4f4f607ccdfd64b5a23addb0da5f9f643983db07b8f2c279b70d9cd8c47" Dec 02 18:49:26 crc kubenswrapper[4878]: I1202 18:49:26.076439 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9qmlf"] Dec 02 18:49:26 crc kubenswrapper[4878]: I1202 18:49:26.120862 4878 scope.go:117] "RemoveContainer" containerID="642c53da8eaca157a366fd05db9ff8df8b2033b6b4f534dd2eaf69e74ebd2534" Dec 02 18:49:26 crc kubenswrapper[4878]: E1202 18:49:26.122250 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"642c53da8eaca157a366fd05db9ff8df8b2033b6b4f534dd2eaf69e74ebd2534\": container with ID starting with 642c53da8eaca157a366fd05db9ff8df8b2033b6b4f534dd2eaf69e74ebd2534 not found: ID does not exist" containerID="642c53da8eaca157a366fd05db9ff8df8b2033b6b4f534dd2eaf69e74ebd2534" Dec 02 18:49:26 crc kubenswrapper[4878]: I1202 18:49:26.122314 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"642c53da8eaca157a366fd05db9ff8df8b2033b6b4f534dd2eaf69e74ebd2534"} err="failed to get container status \"642c53da8eaca157a366fd05db9ff8df8b2033b6b4f534dd2eaf69e74ebd2534\": rpc error: code = NotFound desc = could not find container \"642c53da8eaca157a366fd05db9ff8df8b2033b6b4f534dd2eaf69e74ebd2534\": container with ID starting with 642c53da8eaca157a366fd05db9ff8df8b2033b6b4f534dd2eaf69e74ebd2534 not found: ID does not exist" Dec 02 18:49:26 crc kubenswrapper[4878]: I1202 18:49:26.122361 4878 scope.go:117] "RemoveContainer" 
containerID="99e2728f84068946072e175227d144ce93b88e63c24e1cb1e3a1a07cb12bde05" Dec 02 18:49:26 crc kubenswrapper[4878]: E1202 18:49:26.122906 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99e2728f84068946072e175227d144ce93b88e63c24e1cb1e3a1a07cb12bde05\": container with ID starting with 99e2728f84068946072e175227d144ce93b88e63c24e1cb1e3a1a07cb12bde05 not found: ID does not exist" containerID="99e2728f84068946072e175227d144ce93b88e63c24e1cb1e3a1a07cb12bde05" Dec 02 18:49:26 crc kubenswrapper[4878]: I1202 18:49:26.122948 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99e2728f84068946072e175227d144ce93b88e63c24e1cb1e3a1a07cb12bde05"} err="failed to get container status \"99e2728f84068946072e175227d144ce93b88e63c24e1cb1e3a1a07cb12bde05\": rpc error: code = NotFound desc = could not find container \"99e2728f84068946072e175227d144ce93b88e63c24e1cb1e3a1a07cb12bde05\": container with ID starting with 99e2728f84068946072e175227d144ce93b88e63c24e1cb1e3a1a07cb12bde05 not found: ID does not exist" Dec 02 18:49:26 crc kubenswrapper[4878]: I1202 18:49:26.122974 4878 scope.go:117] "RemoveContainer" containerID="7e19a4f4f607ccdfd64b5a23addb0da5f9f643983db07b8f2c279b70d9cd8c47" Dec 02 18:49:26 crc kubenswrapper[4878]: E1202 18:49:26.123224 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e19a4f4f607ccdfd64b5a23addb0da5f9f643983db07b8f2c279b70d9cd8c47\": container with ID starting with 7e19a4f4f607ccdfd64b5a23addb0da5f9f643983db07b8f2c279b70d9cd8c47 not found: ID does not exist" containerID="7e19a4f4f607ccdfd64b5a23addb0da5f9f643983db07b8f2c279b70d9cd8c47" Dec 02 18:49:26 crc kubenswrapper[4878]: I1202 18:49:26.123277 4878 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7e19a4f4f607ccdfd64b5a23addb0da5f9f643983db07b8f2c279b70d9cd8c47"} err="failed to get container status \"7e19a4f4f607ccdfd64b5a23addb0da5f9f643983db07b8f2c279b70d9cd8c47\": rpc error: code = NotFound desc = could not find container \"7e19a4f4f607ccdfd64b5a23addb0da5f9f643983db07b8f2c279b70d9cd8c47\": container with ID starting with 7e19a4f4f607ccdfd64b5a23addb0da5f9f643983db07b8f2c279b70d9cd8c47 not found: ID does not exist" Dec 02 18:49:26 crc kubenswrapper[4878]: I1202 18:49:26.957112 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8de7d216-e31c-4ccc-956b-0de1c1ae4d28" path="/var/lib/kubelet/pods/8de7d216-e31c-4ccc-956b-0de1c1ae4d28/volumes" Dec 02 18:49:27 crc kubenswrapper[4878]: I1202 18:49:27.069230 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-5vk6v"] Dec 02 18:49:27 crc kubenswrapper[4878]: I1202 18:49:27.095292 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-jfrhp"] Dec 02 18:49:27 crc kubenswrapper[4878]: I1202 18:49:27.114839 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-49dbg"] Dec 02 18:49:27 crc kubenswrapper[4878]: I1202 18:49:27.129606 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0faa-account-create-update-tdxld"] Dec 02 18:49:27 crc kubenswrapper[4878]: I1202 18:49:27.141926 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-d658-account-create-update-lrvbf"] Dec 02 18:49:27 crc kubenswrapper[4878]: I1202 18:49:27.157767 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-9c84-account-create-update-jft6n"] Dec 02 18:49:27 crc kubenswrapper[4878]: I1202 18:49:27.169689 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-49dbg"] Dec 02 18:49:27 crc kubenswrapper[4878]: I1202 18:49:27.182111 4878 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-cell1-d658-account-create-update-lrvbf"] Dec 02 18:49:27 crc kubenswrapper[4878]: I1202 18:49:27.191740 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-9c84-account-create-update-jft6n"] Dec 02 18:49:27 crc kubenswrapper[4878]: I1202 18:49:27.200214 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-5vk6v"] Dec 02 18:49:27 crc kubenswrapper[4878]: I1202 18:49:27.208665 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-jfrhp"] Dec 02 18:49:27 crc kubenswrapper[4878]: I1202 18:49:27.217453 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0faa-account-create-update-tdxld"] Dec 02 18:49:28 crc kubenswrapper[4878]: I1202 18:49:28.965348 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fae1bb0-8b82-44a5-871d-45252562e8a7" path="/var/lib/kubelet/pods/0fae1bb0-8b82-44a5-871d-45252562e8a7/volumes" Dec 02 18:49:28 crc kubenswrapper[4878]: I1202 18:49:28.967848 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51a22bf8-3e2b-4bd7-bbda-2cf7301065f4" path="/var/lib/kubelet/pods/51a22bf8-3e2b-4bd7-bbda-2cf7301065f4/volumes" Dec 02 18:49:28 crc kubenswrapper[4878]: I1202 18:49:28.969317 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b" path="/var/lib/kubelet/pods/b67ba6b8-6b1e-4480-a3a4-15d78bf30a7b/volumes" Dec 02 18:49:28 crc kubenswrapper[4878]: I1202 18:49:28.971012 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b863f92e-dc47-4a8e-b9ae-31baafb9ec79" path="/var/lib/kubelet/pods/b863f92e-dc47-4a8e-b9ae-31baafb9ec79/volumes" Dec 02 18:49:28 crc kubenswrapper[4878]: I1202 18:49:28.976021 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de5fe1e3-5ab4-40e9-a902-f1b444bd005f" 
path="/var/lib/kubelet/pods/de5fe1e3-5ab4-40e9-a902-f1b444bd005f/volumes" Dec 02 18:49:28 crc kubenswrapper[4878]: I1202 18:49:28.978478 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebe98b91-cced-4d28-b2cb-9e19d827a817" path="/var/lib/kubelet/pods/ebe98b91-cced-4d28-b2cb-9e19d827a817/volumes" Dec 02 18:49:31 crc kubenswrapper[4878]: I1202 18:49:31.001933 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sfd8n" Dec 02 18:49:31 crc kubenswrapper[4878]: I1202 18:49:31.005139 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sfd8n" Dec 02 18:49:31 crc kubenswrapper[4878]: I1202 18:49:31.076583 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sfd8n" Dec 02 18:49:31 crc kubenswrapper[4878]: I1202 18:49:31.138295 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sfd8n" Dec 02 18:49:31 crc kubenswrapper[4878]: I1202 18:49:31.325211 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfd8n"] Dec 02 18:49:33 crc kubenswrapper[4878]: I1202 18:49:33.102194 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sfd8n" podUID="d0c85529-f0f9-43b3-9ce8-0129b9da83a4" containerName="registry-server" containerID="cri-o://17f09279c6dd3ba4472962ee4aff05d3f10bae85fefb89b5480dd93d5aa3b535" gracePeriod=2 Dec 02 18:49:33 crc kubenswrapper[4878]: I1202 18:49:33.661641 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfd8n" Dec 02 18:49:33 crc kubenswrapper[4878]: I1202 18:49:33.826213 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0c85529-f0f9-43b3-9ce8-0129b9da83a4-catalog-content\") pod \"d0c85529-f0f9-43b3-9ce8-0129b9da83a4\" (UID: \"d0c85529-f0f9-43b3-9ce8-0129b9da83a4\") " Dec 02 18:49:33 crc kubenswrapper[4878]: I1202 18:49:33.826489 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnxs5\" (UniqueName: \"kubernetes.io/projected/d0c85529-f0f9-43b3-9ce8-0129b9da83a4-kube-api-access-lnxs5\") pod \"d0c85529-f0f9-43b3-9ce8-0129b9da83a4\" (UID: \"d0c85529-f0f9-43b3-9ce8-0129b9da83a4\") " Dec 02 18:49:33 crc kubenswrapper[4878]: I1202 18:49:33.826566 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0c85529-f0f9-43b3-9ce8-0129b9da83a4-utilities\") pod \"d0c85529-f0f9-43b3-9ce8-0129b9da83a4\" (UID: \"d0c85529-f0f9-43b3-9ce8-0129b9da83a4\") " Dec 02 18:49:33 crc kubenswrapper[4878]: I1202 18:49:33.827557 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0c85529-f0f9-43b3-9ce8-0129b9da83a4-utilities" (OuterVolumeSpecName: "utilities") pod "d0c85529-f0f9-43b3-9ce8-0129b9da83a4" (UID: "d0c85529-f0f9-43b3-9ce8-0129b9da83a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:49:33 crc kubenswrapper[4878]: I1202 18:49:33.840986 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0c85529-f0f9-43b3-9ce8-0129b9da83a4-kube-api-access-lnxs5" (OuterVolumeSpecName: "kube-api-access-lnxs5") pod "d0c85529-f0f9-43b3-9ce8-0129b9da83a4" (UID: "d0c85529-f0f9-43b3-9ce8-0129b9da83a4"). InnerVolumeSpecName "kube-api-access-lnxs5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:49:33 crc kubenswrapper[4878]: I1202 18:49:33.847118 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0c85529-f0f9-43b3-9ce8-0129b9da83a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0c85529-f0f9-43b3-9ce8-0129b9da83a4" (UID: "d0c85529-f0f9-43b3-9ce8-0129b9da83a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:49:33 crc kubenswrapper[4878]: I1202 18:49:33.936030 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0c85529-f0f9-43b3-9ce8-0129b9da83a4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:49:33 crc kubenswrapper[4878]: I1202 18:49:33.936081 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnxs5\" (UniqueName: \"kubernetes.io/projected/d0c85529-f0f9-43b3-9ce8-0129b9da83a4-kube-api-access-lnxs5\") on node \"crc\" DevicePath \"\"" Dec 02 18:49:33 crc kubenswrapper[4878]: I1202 18:49:33.936097 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0c85529-f0f9-43b3-9ce8-0129b9da83a4-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:49:34 crc kubenswrapper[4878]: I1202 18:49:34.112679 4878 generic.go:334] "Generic (PLEG): container finished" podID="d0c85529-f0f9-43b3-9ce8-0129b9da83a4" containerID="17f09279c6dd3ba4472962ee4aff05d3f10bae85fefb89b5480dd93d5aa3b535" exitCode=0 Dec 02 18:49:34 crc kubenswrapper[4878]: I1202 18:49:34.112777 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfd8n" Dec 02 18:49:34 crc kubenswrapper[4878]: I1202 18:49:34.112748 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfd8n" event={"ID":"d0c85529-f0f9-43b3-9ce8-0129b9da83a4","Type":"ContainerDied","Data":"17f09279c6dd3ba4472962ee4aff05d3f10bae85fefb89b5480dd93d5aa3b535"} Dec 02 18:49:34 crc kubenswrapper[4878]: I1202 18:49:34.112914 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfd8n" event={"ID":"d0c85529-f0f9-43b3-9ce8-0129b9da83a4","Type":"ContainerDied","Data":"b50c2784b792e71f23519c0672d7008da7289c0a054cb3c254de747fb90e0076"} Dec 02 18:49:34 crc kubenswrapper[4878]: I1202 18:49:34.112959 4878 scope.go:117] "RemoveContainer" containerID="17f09279c6dd3ba4472962ee4aff05d3f10bae85fefb89b5480dd93d5aa3b535" Dec 02 18:49:34 crc kubenswrapper[4878]: I1202 18:49:34.146976 4878 scope.go:117] "RemoveContainer" containerID="8e80e96f08d494c8c755ae897ce063c26a944e774c34623ab90e0bbde3779eff" Dec 02 18:49:34 crc kubenswrapper[4878]: I1202 18:49:34.162733 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfd8n"] Dec 02 18:49:34 crc kubenswrapper[4878]: I1202 18:49:34.172858 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfd8n"] Dec 02 18:49:34 crc kubenswrapper[4878]: I1202 18:49:34.185776 4878 scope.go:117] "RemoveContainer" containerID="00c25dbbaad491391b6db6cc4026330099d1da790274691845f3358565dc591e" Dec 02 18:49:34 crc kubenswrapper[4878]: I1202 18:49:34.250425 4878 scope.go:117] "RemoveContainer" containerID="17f09279c6dd3ba4472962ee4aff05d3f10bae85fefb89b5480dd93d5aa3b535" Dec 02 18:49:34 crc kubenswrapper[4878]: E1202 18:49:34.253338 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"17f09279c6dd3ba4472962ee4aff05d3f10bae85fefb89b5480dd93d5aa3b535\": container with ID starting with 17f09279c6dd3ba4472962ee4aff05d3f10bae85fefb89b5480dd93d5aa3b535 not found: ID does not exist" containerID="17f09279c6dd3ba4472962ee4aff05d3f10bae85fefb89b5480dd93d5aa3b535" Dec 02 18:49:34 crc kubenswrapper[4878]: I1202 18:49:34.253399 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f09279c6dd3ba4472962ee4aff05d3f10bae85fefb89b5480dd93d5aa3b535"} err="failed to get container status \"17f09279c6dd3ba4472962ee4aff05d3f10bae85fefb89b5480dd93d5aa3b535\": rpc error: code = NotFound desc = could not find container \"17f09279c6dd3ba4472962ee4aff05d3f10bae85fefb89b5480dd93d5aa3b535\": container with ID starting with 17f09279c6dd3ba4472962ee4aff05d3f10bae85fefb89b5480dd93d5aa3b535 not found: ID does not exist" Dec 02 18:49:34 crc kubenswrapper[4878]: I1202 18:49:34.253444 4878 scope.go:117] "RemoveContainer" containerID="8e80e96f08d494c8c755ae897ce063c26a944e774c34623ab90e0bbde3779eff" Dec 02 18:49:34 crc kubenswrapper[4878]: E1202 18:49:34.254117 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e80e96f08d494c8c755ae897ce063c26a944e774c34623ab90e0bbde3779eff\": container with ID starting with 8e80e96f08d494c8c755ae897ce063c26a944e774c34623ab90e0bbde3779eff not found: ID does not exist" containerID="8e80e96f08d494c8c755ae897ce063c26a944e774c34623ab90e0bbde3779eff" Dec 02 18:49:34 crc kubenswrapper[4878]: I1202 18:49:34.254176 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e80e96f08d494c8c755ae897ce063c26a944e774c34623ab90e0bbde3779eff"} err="failed to get container status \"8e80e96f08d494c8c755ae897ce063c26a944e774c34623ab90e0bbde3779eff\": rpc error: code = NotFound desc = could not find container \"8e80e96f08d494c8c755ae897ce063c26a944e774c34623ab90e0bbde3779eff\": container with ID 
starting with 8e80e96f08d494c8c755ae897ce063c26a944e774c34623ab90e0bbde3779eff not found: ID does not exist" Dec 02 18:49:34 crc kubenswrapper[4878]: I1202 18:49:34.254213 4878 scope.go:117] "RemoveContainer" containerID="00c25dbbaad491391b6db6cc4026330099d1da790274691845f3358565dc591e" Dec 02 18:49:34 crc kubenswrapper[4878]: E1202 18:49:34.260093 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00c25dbbaad491391b6db6cc4026330099d1da790274691845f3358565dc591e\": container with ID starting with 00c25dbbaad491391b6db6cc4026330099d1da790274691845f3358565dc591e not found: ID does not exist" containerID="00c25dbbaad491391b6db6cc4026330099d1da790274691845f3358565dc591e" Dec 02 18:49:34 crc kubenswrapper[4878]: I1202 18:49:34.260138 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00c25dbbaad491391b6db6cc4026330099d1da790274691845f3358565dc591e"} err="failed to get container status \"00c25dbbaad491391b6db6cc4026330099d1da790274691845f3358565dc591e\": rpc error: code = NotFound desc = could not find container \"00c25dbbaad491391b6db6cc4026330099d1da790274691845f3358565dc591e\": container with ID starting with 00c25dbbaad491391b6db6cc4026330099d1da790274691845f3358565dc591e not found: ID does not exist" Dec 02 18:49:34 crc kubenswrapper[4878]: I1202 18:49:34.960720 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0c85529-f0f9-43b3-9ce8-0129b9da83a4" path="/var/lib/kubelet/pods/d0c85529-f0f9-43b3-9ce8-0129b9da83a4/volumes" Dec 02 18:49:49 crc kubenswrapper[4878]: I1202 18:49:49.043097 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-e087-account-create-update-26jph"] Dec 02 18:49:49 crc kubenswrapper[4878]: I1202 18:49:49.055599 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-5llm6"] Dec 02 18:49:49 crc kubenswrapper[4878]: I1202 18:49:49.065796 4878 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-5llm6"] Dec 02 18:49:49 crc kubenswrapper[4878]: I1202 18:49:49.076798 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-e087-account-create-update-26jph"] Dec 02 18:49:50 crc kubenswrapper[4878]: I1202 18:49:50.989163 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="781a418d-080c-4487-b8df-9f33d7e2caa8" path="/var/lib/kubelet/pods/781a418d-080c-4487-b8df-9f33d7e2caa8/volumes" Dec 02 18:49:50 crc kubenswrapper[4878]: I1202 18:49:50.990863 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcceb81c-7764-496d-8695-70e73d5a6ce9" path="/var/lib/kubelet/pods/bcceb81c-7764-496d-8695-70e73d5a6ce9/volumes" Dec 02 18:49:58 crc kubenswrapper[4878]: I1202 18:49:58.046936 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-l8xrz"] Dec 02 18:49:58 crc kubenswrapper[4878]: I1202 18:49:58.059516 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-l8xrz"] Dec 02 18:49:58 crc kubenswrapper[4878]: I1202 18:49:58.965609 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd" path="/var/lib/kubelet/pods/ed70a9c4-6719-4a2f-a7b2-1e6cdfb370bd/volumes" Dec 02 18:50:02 crc kubenswrapper[4878]: I1202 18:50:02.858585 4878 scope.go:117] "RemoveContainer" containerID="39af2f3b2e4cb81d0db802b1c60b2ba35f1c3aa0de46d79548564efc545cdb50" Dec 02 18:50:02 crc kubenswrapper[4878]: I1202 18:50:02.905640 4878 scope.go:117] "RemoveContainer" containerID="a6ded572eededc1ff51f73116a21e19978d27d32b12f5ab60352586d90251cf3" Dec 02 18:50:02 crc kubenswrapper[4878]: I1202 18:50:02.983668 4878 scope.go:117] "RemoveContainer" containerID="c902a8982e64a4d62748c1089097106615b250f89ab1e922cf98bb99f514e141" Dec 02 18:50:03 crc kubenswrapper[4878]: I1202 18:50:03.145081 4878 scope.go:117] "RemoveContainer" 
containerID="3ad46b90815ff249895d7a99e9f1cc105354805adaab707505502d3c335d3037" Dec 02 18:50:03 crc kubenswrapper[4878]: I1202 18:50:03.194772 4878 scope.go:117] "RemoveContainer" containerID="cbf268c1bad30d1042f263fc83d37c17103b2a5dce3b74e1f46d6044274aa11a" Dec 02 18:50:03 crc kubenswrapper[4878]: I1202 18:50:03.268820 4878 scope.go:117] "RemoveContainer" containerID="187259752d082987f31dc2461a6ee01d1dc982a58df2030a3772cf41396018d2" Dec 02 18:50:03 crc kubenswrapper[4878]: I1202 18:50:03.340633 4878 scope.go:117] "RemoveContainer" containerID="947f25a9149acfe163c64e8f3f3d5027fbf7f5140b7c4fd93eacf345ffdfb140" Dec 02 18:50:03 crc kubenswrapper[4878]: I1202 18:50:03.371981 4878 scope.go:117] "RemoveContainer" containerID="42d6183555157baf843267cc3d9359125e03c5ca496e039182e58bea08e3da02" Dec 02 18:50:03 crc kubenswrapper[4878]: I1202 18:50:03.399569 4878 scope.go:117] "RemoveContainer" containerID="73146bbc2ac059d567cfd3a3597b850aaae09a42e1da3f35e704ff5aafc41fb2" Dec 02 18:50:07 crc kubenswrapper[4878]: I1202 18:50:07.620338 4878 generic.go:334] "Generic (PLEG): container finished" podID="2f8b1f89-0eef-426b-8bb0-8700c42ede2e" containerID="8f8dc6852007bda7a3c5b49ae167d64b6b88e63b3b94d08c27996bc459f50c54" exitCode=0 Dec 02 18:50:07 crc kubenswrapper[4878]: I1202 18:50:07.620458 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z" event={"ID":"2f8b1f89-0eef-426b-8bb0-8700c42ede2e","Type":"ContainerDied","Data":"8f8dc6852007bda7a3c5b49ae167d64b6b88e63b3b94d08c27996bc459f50c54"} Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.255544 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.410447 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f8b1f89-0eef-426b-8bb0-8700c42ede2e-inventory\") pod \"2f8b1f89-0eef-426b-8bb0-8700c42ede2e\" (UID: \"2f8b1f89-0eef-426b-8bb0-8700c42ede2e\") " Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.410601 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f8b1f89-0eef-426b-8bb0-8700c42ede2e-ssh-key\") pod \"2f8b1f89-0eef-426b-8bb0-8700c42ede2e\" (UID: \"2f8b1f89-0eef-426b-8bb0-8700c42ede2e\") " Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.410904 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjrnj\" (UniqueName: \"kubernetes.io/projected/2f8b1f89-0eef-426b-8bb0-8700c42ede2e-kube-api-access-hjrnj\") pod \"2f8b1f89-0eef-426b-8bb0-8700c42ede2e\" (UID: \"2f8b1f89-0eef-426b-8bb0-8700c42ede2e\") " Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.420964 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f8b1f89-0eef-426b-8bb0-8700c42ede2e-kube-api-access-hjrnj" (OuterVolumeSpecName: "kube-api-access-hjrnj") pod "2f8b1f89-0eef-426b-8bb0-8700c42ede2e" (UID: "2f8b1f89-0eef-426b-8bb0-8700c42ede2e"). InnerVolumeSpecName "kube-api-access-hjrnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.449293 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f8b1f89-0eef-426b-8bb0-8700c42ede2e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2f8b1f89-0eef-426b-8bb0-8700c42ede2e" (UID: "2f8b1f89-0eef-426b-8bb0-8700c42ede2e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.452227 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f8b1f89-0eef-426b-8bb0-8700c42ede2e-inventory" (OuterVolumeSpecName: "inventory") pod "2f8b1f89-0eef-426b-8bb0-8700c42ede2e" (UID: "2f8b1f89-0eef-426b-8bb0-8700c42ede2e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.513573 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjrnj\" (UniqueName: \"kubernetes.io/projected/2f8b1f89-0eef-426b-8bb0-8700c42ede2e-kube-api-access-hjrnj\") on node \"crc\" DevicePath \"\"" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.513607 4878 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f8b1f89-0eef-426b-8bb0-8700c42ede2e-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.513616 4878 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f8b1f89-0eef-426b-8bb0-8700c42ede2e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.648930 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z" event={"ID":"2f8b1f89-0eef-426b-8bb0-8700c42ede2e","Type":"ContainerDied","Data":"001d6c7345a42927c7288257db26330b9607bf0d4f64393c45cc614da0328845"} Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.649159 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="001d6c7345a42927c7288257db26330b9607bf0d4f64393c45cc614da0328845" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.649034 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cf87z" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.738987 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj"] Dec 02 18:50:09 crc kubenswrapper[4878]: E1202 18:50:09.745561 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0c85529-f0f9-43b3-9ce8-0129b9da83a4" containerName="extract-content" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.745602 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c85529-f0f9-43b3-9ce8-0129b9da83a4" containerName="extract-content" Dec 02 18:50:09 crc kubenswrapper[4878]: E1202 18:50:09.745640 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de7d216-e31c-4ccc-956b-0de1c1ae4d28" containerName="extract-content" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.745650 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de7d216-e31c-4ccc-956b-0de1c1ae4d28" containerName="extract-content" Dec 02 18:50:09 crc kubenswrapper[4878]: E1202 18:50:09.745687 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de7d216-e31c-4ccc-956b-0de1c1ae4d28" containerName="registry-server" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.745698 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de7d216-e31c-4ccc-956b-0de1c1ae4d28" containerName="registry-server" Dec 02 18:50:09 crc kubenswrapper[4878]: E1202 18:50:09.745734 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0c85529-f0f9-43b3-9ce8-0129b9da83a4" containerName="extract-utilities" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.745743 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c85529-f0f9-43b3-9ce8-0129b9da83a4" containerName="extract-utilities" Dec 02 18:50:09 crc kubenswrapper[4878]: E1202 18:50:09.745766 4878 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="2f8b1f89-0eef-426b-8bb0-8700c42ede2e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.745776 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f8b1f89-0eef-426b-8bb0-8700c42ede2e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 18:50:09 crc kubenswrapper[4878]: E1202 18:50:09.745799 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de7d216-e31c-4ccc-956b-0de1c1ae4d28" containerName="extract-utilities" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.745808 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de7d216-e31c-4ccc-956b-0de1c1ae4d28" containerName="extract-utilities" Dec 02 18:50:09 crc kubenswrapper[4878]: E1202 18:50:09.745821 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0c85529-f0f9-43b3-9ce8-0129b9da83a4" containerName="registry-server" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.745830 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c85529-f0f9-43b3-9ce8-0129b9da83a4" containerName="registry-server" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.752968 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f8b1f89-0eef-426b-8bb0-8700c42ede2e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.753094 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0c85529-f0f9-43b3-9ce8-0129b9da83a4" containerName="registry-server" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.753134 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="8de7d216-e31c-4ccc-956b-0de1c1ae4d28" containerName="registry-server" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.754988 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.766410 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.766791 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.767545 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szvc8" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.767796 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.788927 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj"] Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.822671 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad5d3ca4-7255-4bd3-9976-0834ea7b94ee-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-46pqj\" (UID: \"ad5d3ca4-7255-4bd3-9976-0834ea7b94ee\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.823452 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrg58\" (UniqueName: \"kubernetes.io/projected/ad5d3ca4-7255-4bd3-9976-0834ea7b94ee-kube-api-access-wrg58\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-46pqj\" (UID: \"ad5d3ca4-7255-4bd3-9976-0834ea7b94ee\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 
18:50:09.823615 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad5d3ca4-7255-4bd3-9976-0834ea7b94ee-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-46pqj\" (UID: \"ad5d3ca4-7255-4bd3-9976-0834ea7b94ee\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.926778 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad5d3ca4-7255-4bd3-9976-0834ea7b94ee-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-46pqj\" (UID: \"ad5d3ca4-7255-4bd3-9976-0834ea7b94ee\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.926984 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrg58\" (UniqueName: \"kubernetes.io/projected/ad5d3ca4-7255-4bd3-9976-0834ea7b94ee-kube-api-access-wrg58\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-46pqj\" (UID: \"ad5d3ca4-7255-4bd3-9976-0834ea7b94ee\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.927020 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad5d3ca4-7255-4bd3-9976-0834ea7b94ee-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-46pqj\" (UID: \"ad5d3ca4-7255-4bd3-9976-0834ea7b94ee\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.936104 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad5d3ca4-7255-4bd3-9976-0834ea7b94ee-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-46pqj\" (UID: \"ad5d3ca4-7255-4bd3-9976-0834ea7b94ee\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.937722 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad5d3ca4-7255-4bd3-9976-0834ea7b94ee-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-46pqj\" (UID: \"ad5d3ca4-7255-4bd3-9976-0834ea7b94ee\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj" Dec 02 18:50:09 crc kubenswrapper[4878]: I1202 18:50:09.942458 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrg58\" (UniqueName: \"kubernetes.io/projected/ad5d3ca4-7255-4bd3-9976-0834ea7b94ee-kube-api-access-wrg58\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-46pqj\" (UID: \"ad5d3ca4-7255-4bd3-9976-0834ea7b94ee\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj" Dec 02 18:50:10 crc kubenswrapper[4878]: I1202 18:50:10.108982 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj" Dec 02 18:50:10 crc kubenswrapper[4878]: W1202 18:50:10.841372 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad5d3ca4_7255_4bd3_9976_0834ea7b94ee.slice/crio-3bc6f373c836510eb406c3ca24ae3a449e5499028f77e90b6d7c0a99af8def91 WatchSource:0}: Error finding container 3bc6f373c836510eb406c3ca24ae3a449e5499028f77e90b6d7c0a99af8def91: Status 404 returned error can't find the container with id 3bc6f373c836510eb406c3ca24ae3a449e5499028f77e90b6d7c0a99af8def91 Dec 02 18:50:10 crc kubenswrapper[4878]: I1202 18:50:10.858621 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj"] Dec 02 18:50:11 crc kubenswrapper[4878]: I1202 18:50:11.688996 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj" event={"ID":"ad5d3ca4-7255-4bd3-9976-0834ea7b94ee","Type":"ContainerStarted","Data":"3bc6f373c836510eb406c3ca24ae3a449e5499028f77e90b6d7c0a99af8def91"} Dec 02 18:50:12 crc kubenswrapper[4878]: I1202 18:50:12.705019 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj" event={"ID":"ad5d3ca4-7255-4bd3-9976-0834ea7b94ee","Type":"ContainerStarted","Data":"21c8ba5784ff4b4b12822b51f6b1384a5380bbd75725ddccdef6f502d373e60d"} Dec 02 18:50:12 crc kubenswrapper[4878]: I1202 18:50:12.731448 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj" podStartSLOduration=3.0206054 podStartE2EDuration="3.731221969s" podCreationTimestamp="2025-12-02 18:50:09 +0000 UTC" firstStartedPulling="2025-12-02 18:50:10.845046162 +0000 UTC m=+2120.534665053" lastFinishedPulling="2025-12-02 18:50:11.555662731 +0000 UTC 
m=+2121.245281622" observedRunningTime="2025-12-02 18:50:12.723641673 +0000 UTC m=+2122.413260574" watchObservedRunningTime="2025-12-02 18:50:12.731221969 +0000 UTC m=+2122.420840860" Dec 02 18:50:17 crc kubenswrapper[4878]: E1202 18:50:17.015359 4878 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad5d3ca4_7255_4bd3_9976_0834ea7b94ee.slice/crio-21c8ba5784ff4b4b12822b51f6b1384a5380bbd75725ddccdef6f502d373e60d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad5d3ca4_7255_4bd3_9976_0834ea7b94ee.slice/crio-conmon-21c8ba5784ff4b4b12822b51f6b1384a5380bbd75725ddccdef6f502d373e60d.scope\": RecentStats: unable to find data in memory cache]" Dec 02 18:50:17 crc kubenswrapper[4878]: I1202 18:50:17.786111 4878 generic.go:334] "Generic (PLEG): container finished" podID="ad5d3ca4-7255-4bd3-9976-0834ea7b94ee" containerID="21c8ba5784ff4b4b12822b51f6b1384a5380bbd75725ddccdef6f502d373e60d" exitCode=0 Dec 02 18:50:17 crc kubenswrapper[4878]: I1202 18:50:17.786217 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj" event={"ID":"ad5d3ca4-7255-4bd3-9976-0834ea7b94ee","Type":"ContainerDied","Data":"21c8ba5784ff4b4b12822b51f6b1384a5380bbd75725ddccdef6f502d373e60d"} Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.421102 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj" Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.487012 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad5d3ca4-7255-4bd3-9976-0834ea7b94ee-ssh-key\") pod \"ad5d3ca4-7255-4bd3-9976-0834ea7b94ee\" (UID: \"ad5d3ca4-7255-4bd3-9976-0834ea7b94ee\") " Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.487411 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrg58\" (UniqueName: \"kubernetes.io/projected/ad5d3ca4-7255-4bd3-9976-0834ea7b94ee-kube-api-access-wrg58\") pod \"ad5d3ca4-7255-4bd3-9976-0834ea7b94ee\" (UID: \"ad5d3ca4-7255-4bd3-9976-0834ea7b94ee\") " Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.487555 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad5d3ca4-7255-4bd3-9976-0834ea7b94ee-inventory\") pod \"ad5d3ca4-7255-4bd3-9976-0834ea7b94ee\" (UID: \"ad5d3ca4-7255-4bd3-9976-0834ea7b94ee\") " Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.494909 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad5d3ca4-7255-4bd3-9976-0834ea7b94ee-kube-api-access-wrg58" (OuterVolumeSpecName: "kube-api-access-wrg58") pod "ad5d3ca4-7255-4bd3-9976-0834ea7b94ee" (UID: "ad5d3ca4-7255-4bd3-9976-0834ea7b94ee"). InnerVolumeSpecName "kube-api-access-wrg58". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.526906 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad5d3ca4-7255-4bd3-9976-0834ea7b94ee-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ad5d3ca4-7255-4bd3-9976-0834ea7b94ee" (UID: "ad5d3ca4-7255-4bd3-9976-0834ea7b94ee"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.529695 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad5d3ca4-7255-4bd3-9976-0834ea7b94ee-inventory" (OuterVolumeSpecName: "inventory") pod "ad5d3ca4-7255-4bd3-9976-0834ea7b94ee" (UID: "ad5d3ca4-7255-4bd3-9976-0834ea7b94ee"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.590365 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrg58\" (UniqueName: \"kubernetes.io/projected/ad5d3ca4-7255-4bd3-9976-0834ea7b94ee-kube-api-access-wrg58\") on node \"crc\" DevicePath \"\"" Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.590421 4878 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad5d3ca4-7255-4bd3-9976-0834ea7b94ee-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.590432 4878 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad5d3ca4-7255-4bd3-9976-0834ea7b94ee-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.812365 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj" event={"ID":"ad5d3ca4-7255-4bd3-9976-0834ea7b94ee","Type":"ContainerDied","Data":"3bc6f373c836510eb406c3ca24ae3a449e5499028f77e90b6d7c0a99af8def91"} Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.812428 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bc6f373c836510eb406c3ca24ae3a449e5499028f77e90b6d7c0a99af8def91" Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.812441 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-46pqj" Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.917179 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts"] Dec 02 18:50:19 crc kubenswrapper[4878]: E1202 18:50:19.917725 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad5d3ca4-7255-4bd3-9976-0834ea7b94ee" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.917742 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5d3ca4-7255-4bd3-9976-0834ea7b94ee" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.917972 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad5d3ca4-7255-4bd3-9976-0834ea7b94ee" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.918831 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts" Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.924980 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.925327 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.925409 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.925518 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szvc8" Dec 02 18:50:19 crc kubenswrapper[4878]: I1202 18:50:19.937139 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts"] Dec 02 18:50:20 crc kubenswrapper[4878]: I1202 18:50:20.000448 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0112601b-e2a2-4547-bea0-5afad959f726-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jh9ts\" (UID: \"0112601b-e2a2-4547-bea0-5afad959f726\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts" Dec 02 18:50:20 crc kubenswrapper[4878]: I1202 18:50:20.000638 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0112601b-e2a2-4547-bea0-5afad959f726-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jh9ts\" (UID: \"0112601b-e2a2-4547-bea0-5afad959f726\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts" Dec 02 18:50:20 crc kubenswrapper[4878]: I1202 18:50:20.000707 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5m8l\" (UniqueName: \"kubernetes.io/projected/0112601b-e2a2-4547-bea0-5afad959f726-kube-api-access-v5m8l\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jh9ts\" (UID: \"0112601b-e2a2-4547-bea0-5afad959f726\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts" Dec 02 18:50:20 crc kubenswrapper[4878]: I1202 18:50:20.103689 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0112601b-e2a2-4547-bea0-5afad959f726-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jh9ts\" (UID: \"0112601b-e2a2-4547-bea0-5afad959f726\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts" Dec 02 18:50:20 crc kubenswrapper[4878]: I1202 18:50:20.103789 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5m8l\" (UniqueName: \"kubernetes.io/projected/0112601b-e2a2-4547-bea0-5afad959f726-kube-api-access-v5m8l\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jh9ts\" (UID: \"0112601b-e2a2-4547-bea0-5afad959f726\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts" Dec 02 18:50:20 crc kubenswrapper[4878]: I1202 18:50:20.103867 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0112601b-e2a2-4547-bea0-5afad959f726-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jh9ts\" (UID: \"0112601b-e2a2-4547-bea0-5afad959f726\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts" Dec 02 18:50:20 crc kubenswrapper[4878]: I1202 18:50:20.109009 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0112601b-e2a2-4547-bea0-5afad959f726-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jh9ts\" (UID: 
\"0112601b-e2a2-4547-bea0-5afad959f726\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts" Dec 02 18:50:20 crc kubenswrapper[4878]: I1202 18:50:20.110581 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0112601b-e2a2-4547-bea0-5afad959f726-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jh9ts\" (UID: \"0112601b-e2a2-4547-bea0-5afad959f726\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts" Dec 02 18:50:20 crc kubenswrapper[4878]: I1202 18:50:20.139642 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5m8l\" (UniqueName: \"kubernetes.io/projected/0112601b-e2a2-4547-bea0-5afad959f726-kube-api-access-v5m8l\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jh9ts\" (UID: \"0112601b-e2a2-4547-bea0-5afad959f726\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts" Dec 02 18:50:20 crc kubenswrapper[4878]: I1202 18:50:20.293214 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts" Dec 02 18:50:20 crc kubenswrapper[4878]: I1202 18:50:20.973978 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts"] Dec 02 18:50:21 crc kubenswrapper[4878]: I1202 18:50:21.878479 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts" event={"ID":"0112601b-e2a2-4547-bea0-5afad959f726","Type":"ContainerStarted","Data":"f8964fde3acde67375f80e894f3b2b1866335fe8ffd76da42eea286a6b3b7a92"} Dec 02 18:50:21 crc kubenswrapper[4878]: I1202 18:50:21.878810 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts" event={"ID":"0112601b-e2a2-4547-bea0-5afad959f726","Type":"ContainerStarted","Data":"1d1a8e4666df77578d04d5f20a8b7d4f404845717542c49e2baff339563dbfd2"} Dec 02 18:50:21 crc kubenswrapper[4878]: I1202 18:50:21.907436 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts" podStartSLOduration=2.453435509 podStartE2EDuration="2.907407996s" podCreationTimestamp="2025-12-02 18:50:19 +0000 UTC" firstStartedPulling="2025-12-02 18:50:20.957054301 +0000 UTC m=+2130.646673182" lastFinishedPulling="2025-12-02 18:50:21.411026768 +0000 UTC m=+2131.100645669" observedRunningTime="2025-12-02 18:50:21.898326814 +0000 UTC m=+2131.587945705" watchObservedRunningTime="2025-12-02 18:50:21.907407996 +0000 UTC m=+2131.597026897" Dec 02 18:50:23 crc kubenswrapper[4878]: I1202 18:50:23.742220 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:50:23 crc 
kubenswrapper[4878]: I1202 18:50:23.742559 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:50:29 crc kubenswrapper[4878]: I1202 18:50:29.080121 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-w76q8"] Dec 02 18:50:29 crc kubenswrapper[4878]: I1202 18:50:29.093527 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-w76q8"] Dec 02 18:50:30 crc kubenswrapper[4878]: I1202 18:50:30.040308 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-rj842"] Dec 02 18:50:30 crc kubenswrapper[4878]: I1202 18:50:30.057164 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-rj842"] Dec 02 18:50:30 crc kubenswrapper[4878]: I1202 18:50:30.962210 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dda2e4cb-b62a-40c0-a3d8-5a427b609472" path="/var/lib/kubelet/pods/dda2e4cb-b62a-40c0-a3d8-5a427b609472/volumes" Dec 02 18:50:30 crc kubenswrapper[4878]: I1202 18:50:30.965872 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1569008-ec65-4f71-bde2-2ee2ea8c2e7a" path="/var/lib/kubelet/pods/e1569008-ec65-4f71-bde2-2ee2ea8c2e7a/volumes" Dec 02 18:50:53 crc kubenswrapper[4878]: I1202 18:50:53.741959 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:50:53 crc kubenswrapper[4878]: I1202 18:50:53.742788 4878 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:51:03 crc kubenswrapper[4878]: I1202 18:51:03.688788 4878 scope.go:117] "RemoveContainer" containerID="9b5a8a7e2fd581e2c6f7082289e64b5fdc9d48505b6ef1a828d6d42788b56404" Dec 02 18:51:03 crc kubenswrapper[4878]: I1202 18:51:03.754175 4878 scope.go:117] "RemoveContainer" containerID="c71c2f290ea3c39e21bca870922144c336e0f472d32c341c42049e2f5691e7ec" Dec 02 18:51:04 crc kubenswrapper[4878]: I1202 18:51:04.499739 4878 generic.go:334] "Generic (PLEG): container finished" podID="0112601b-e2a2-4547-bea0-5afad959f726" containerID="f8964fde3acde67375f80e894f3b2b1866335fe8ffd76da42eea286a6b3b7a92" exitCode=0 Dec 02 18:51:04 crc kubenswrapper[4878]: I1202 18:51:04.499839 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts" event={"ID":"0112601b-e2a2-4547-bea0-5afad959f726","Type":"ContainerDied","Data":"f8964fde3acde67375f80e894f3b2b1866335fe8ffd76da42eea286a6b3b7a92"} Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.108799 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.198972 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5m8l\" (UniqueName: \"kubernetes.io/projected/0112601b-e2a2-4547-bea0-5afad959f726-kube-api-access-v5m8l\") pod \"0112601b-e2a2-4547-bea0-5afad959f726\" (UID: \"0112601b-e2a2-4547-bea0-5afad959f726\") " Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.200176 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0112601b-e2a2-4547-bea0-5afad959f726-ssh-key\") pod \"0112601b-e2a2-4547-bea0-5afad959f726\" (UID: \"0112601b-e2a2-4547-bea0-5afad959f726\") " Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.200509 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0112601b-e2a2-4547-bea0-5afad959f726-inventory\") pod \"0112601b-e2a2-4547-bea0-5afad959f726\" (UID: \"0112601b-e2a2-4547-bea0-5afad959f726\") " Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.206825 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0112601b-e2a2-4547-bea0-5afad959f726-kube-api-access-v5m8l" (OuterVolumeSpecName: "kube-api-access-v5m8l") pod "0112601b-e2a2-4547-bea0-5afad959f726" (UID: "0112601b-e2a2-4547-bea0-5afad959f726"). InnerVolumeSpecName "kube-api-access-v5m8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.236143 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0112601b-e2a2-4547-bea0-5afad959f726-inventory" (OuterVolumeSpecName: "inventory") pod "0112601b-e2a2-4547-bea0-5afad959f726" (UID: "0112601b-e2a2-4547-bea0-5afad959f726"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.261416 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0112601b-e2a2-4547-bea0-5afad959f726-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0112601b-e2a2-4547-bea0-5afad959f726" (UID: "0112601b-e2a2-4547-bea0-5afad959f726"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.309857 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5m8l\" (UniqueName: \"kubernetes.io/projected/0112601b-e2a2-4547-bea0-5afad959f726-kube-api-access-v5m8l\") on node \"crc\" DevicePath \"\"" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.310079 4878 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0112601b-e2a2-4547-bea0-5afad959f726-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.310264 4878 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0112601b-e2a2-4547-bea0-5afad959f726-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.534751 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts" event={"ID":"0112601b-e2a2-4547-bea0-5afad959f726","Type":"ContainerDied","Data":"1d1a8e4666df77578d04d5f20a8b7d4f404845717542c49e2baff339563dbfd2"} Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.534803 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d1a8e4666df77578d04d5f20a8b7d4f404845717542c49e2baff339563dbfd2" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.534904 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jh9ts" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.659021 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94"] Dec 02 18:51:06 crc kubenswrapper[4878]: E1202 18:51:06.660667 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0112601b-e2a2-4547-bea0-5afad959f726" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.660728 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="0112601b-e2a2-4547-bea0-5afad959f726" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.661418 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="0112601b-e2a2-4547-bea0-5afad959f726" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.663407 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.666639 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.666675 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.667046 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szvc8" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.667263 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.677549 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94"] Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.827224 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8714a5c4-b7c2-4e8a-a112-2530648da63b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7g94\" (UID: \"8714a5c4-b7c2-4e8a-a112-2530648da63b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.827305 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8714a5c4-b7c2-4e8a-a112-2530648da63b-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7g94\" (UID: \"8714a5c4-b7c2-4e8a-a112-2530648da63b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.827622 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdwcl\" (UniqueName: \"kubernetes.io/projected/8714a5c4-b7c2-4e8a-a112-2530648da63b-kube-api-access-tdwcl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7g94\" (UID: \"8714a5c4-b7c2-4e8a-a112-2530648da63b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.929948 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdwcl\" (UniqueName: \"kubernetes.io/projected/8714a5c4-b7c2-4e8a-a112-2530648da63b-kube-api-access-tdwcl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7g94\" (UID: \"8714a5c4-b7c2-4e8a-a112-2530648da63b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.930017 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8714a5c4-b7c2-4e8a-a112-2530648da63b-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7g94\" (UID: \"8714a5c4-b7c2-4e8a-a112-2530648da63b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.930047 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8714a5c4-b7c2-4e8a-a112-2530648da63b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7g94\" (UID: \"8714a5c4-b7c2-4e8a-a112-2530648da63b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.934985 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8714a5c4-b7c2-4e8a-a112-2530648da63b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7g94\" (UID: 
\"8714a5c4-b7c2-4e8a-a112-2530648da63b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.936144 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8714a5c4-b7c2-4e8a-a112-2530648da63b-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7g94\" (UID: \"8714a5c4-b7c2-4e8a-a112-2530648da63b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.946758 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdwcl\" (UniqueName: \"kubernetes.io/projected/8714a5c4-b7c2-4e8a-a112-2530648da63b-kube-api-access-tdwcl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-t7g94\" (UID: \"8714a5c4-b7c2-4e8a-a112-2530648da63b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94" Dec 02 18:51:06 crc kubenswrapper[4878]: I1202 18:51:06.985464 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94" Dec 02 18:51:07 crc kubenswrapper[4878]: I1202 18:51:07.617986 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94"] Dec 02 18:51:07 crc kubenswrapper[4878]: W1202 18:51:07.621642 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8714a5c4_b7c2_4e8a_a112_2530648da63b.slice/crio-c53f01060752bb4c4756fff475f36c91702200ff6b935846c820ca3f3fdb650f WatchSource:0}: Error finding container c53f01060752bb4c4756fff475f36c91702200ff6b935846c820ca3f3fdb650f: Status 404 returned error can't find the container with id c53f01060752bb4c4756fff475f36c91702200ff6b935846c820ca3f3fdb650f Dec 02 18:51:08 crc kubenswrapper[4878]: I1202 18:51:08.560108 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94" event={"ID":"8714a5c4-b7c2-4e8a-a112-2530648da63b","Type":"ContainerStarted","Data":"054a01b505e854d4a0fac67bef3e808e51f50632ec2fd1329b42c3bab7556096"} Dec 02 18:51:08 crc kubenswrapper[4878]: I1202 18:51:08.560530 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94" event={"ID":"8714a5c4-b7c2-4e8a-a112-2530648da63b","Type":"ContainerStarted","Data":"c53f01060752bb4c4756fff475f36c91702200ff6b935846c820ca3f3fdb650f"} Dec 02 18:51:08 crc kubenswrapper[4878]: I1202 18:51:08.578074 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94" podStartSLOduration=2.10581417 podStartE2EDuration="2.578049835s" podCreationTimestamp="2025-12-02 18:51:06 +0000 UTC" firstStartedPulling="2025-12-02 18:51:07.626359198 +0000 UTC m=+2177.315978089" lastFinishedPulling="2025-12-02 18:51:08.098594843 +0000 UTC m=+2177.788213754" 
observedRunningTime="2025-12-02 18:51:08.577033313 +0000 UTC m=+2178.266652204" watchObservedRunningTime="2025-12-02 18:51:08.578049835 +0000 UTC m=+2178.267668726" Dec 02 18:51:13 crc kubenswrapper[4878]: I1202 18:51:13.055749 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-6697v"] Dec 02 18:51:13 crc kubenswrapper[4878]: I1202 18:51:13.072934 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-6697v"] Dec 02 18:51:14 crc kubenswrapper[4878]: I1202 18:51:14.954291 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f78c0caa-65ba-4a70-a14d-067faf81a1fa" path="/var/lib/kubelet/pods/f78c0caa-65ba-4a70-a14d-067faf81a1fa/volumes" Dec 02 18:51:23 crc kubenswrapper[4878]: I1202 18:51:23.742015 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:51:23 crc kubenswrapper[4878]: I1202 18:51:23.744463 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:51:23 crc kubenswrapper[4878]: I1202 18:51:23.744588 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:51:23 crc kubenswrapper[4878]: I1202 18:51:23.746195 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5975f864ead5b21b2dbb178e245c083aa3ed8f628605531725f83f26b592f531"} 
pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 18:51:23 crc kubenswrapper[4878]: I1202 18:51:23.746367 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://5975f864ead5b21b2dbb178e245c083aa3ed8f628605531725f83f26b592f531" gracePeriod=600 Dec 02 18:51:24 crc kubenswrapper[4878]: I1202 18:51:24.815803 4878 generic.go:334] "Generic (PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" containerID="5975f864ead5b21b2dbb178e245c083aa3ed8f628605531725f83f26b592f531" exitCode=0 Dec 02 18:51:24 crc kubenswrapper[4878]: I1202 18:51:24.815874 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"5975f864ead5b21b2dbb178e245c083aa3ed8f628605531725f83f26b592f531"} Dec 02 18:51:24 crc kubenswrapper[4878]: I1202 18:51:24.816590 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a"} Dec 02 18:51:24 crc kubenswrapper[4878]: I1202 18:51:24.816623 4878 scope.go:117] "RemoveContainer" containerID="7a789fffafceb6b9f0c2d03d5d24631ab3ac41b71ef1586689be3e1e720fdc1f" Dec 02 18:51:43 crc kubenswrapper[4878]: I1202 18:51:43.760183 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6kvd2"] Dec 02 18:51:43 crc kubenswrapper[4878]: I1202 18:51:43.763881 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6kvd2" Dec 02 18:51:43 crc kubenswrapper[4878]: I1202 18:51:43.772954 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6kvd2"] Dec 02 18:51:43 crc kubenswrapper[4878]: I1202 18:51:43.871622 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90723afd-9aa0-4e3a-81dd-91aa0b34bc19-utilities\") pod \"community-operators-6kvd2\" (UID: \"90723afd-9aa0-4e3a-81dd-91aa0b34bc19\") " pod="openshift-marketplace/community-operators-6kvd2" Dec 02 18:51:43 crc kubenswrapper[4878]: I1202 18:51:43.871720 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfb42\" (UniqueName: \"kubernetes.io/projected/90723afd-9aa0-4e3a-81dd-91aa0b34bc19-kube-api-access-gfb42\") pod \"community-operators-6kvd2\" (UID: \"90723afd-9aa0-4e3a-81dd-91aa0b34bc19\") " pod="openshift-marketplace/community-operators-6kvd2" Dec 02 18:51:43 crc kubenswrapper[4878]: I1202 18:51:43.872378 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90723afd-9aa0-4e3a-81dd-91aa0b34bc19-catalog-content\") pod \"community-operators-6kvd2\" (UID: \"90723afd-9aa0-4e3a-81dd-91aa0b34bc19\") " pod="openshift-marketplace/community-operators-6kvd2" Dec 02 18:51:43 crc kubenswrapper[4878]: I1202 18:51:43.974737 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90723afd-9aa0-4e3a-81dd-91aa0b34bc19-utilities\") pod \"community-operators-6kvd2\" (UID: \"90723afd-9aa0-4e3a-81dd-91aa0b34bc19\") " pod="openshift-marketplace/community-operators-6kvd2" Dec 02 18:51:43 crc kubenswrapper[4878]: I1202 18:51:43.974827 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gfb42\" (UniqueName: \"kubernetes.io/projected/90723afd-9aa0-4e3a-81dd-91aa0b34bc19-kube-api-access-gfb42\") pod \"community-operators-6kvd2\" (UID: \"90723afd-9aa0-4e3a-81dd-91aa0b34bc19\") " pod="openshift-marketplace/community-operators-6kvd2" Dec 02 18:51:43 crc kubenswrapper[4878]: I1202 18:51:43.974965 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90723afd-9aa0-4e3a-81dd-91aa0b34bc19-catalog-content\") pod \"community-operators-6kvd2\" (UID: \"90723afd-9aa0-4e3a-81dd-91aa0b34bc19\") " pod="openshift-marketplace/community-operators-6kvd2" Dec 02 18:51:43 crc kubenswrapper[4878]: I1202 18:51:43.975251 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90723afd-9aa0-4e3a-81dd-91aa0b34bc19-utilities\") pod \"community-operators-6kvd2\" (UID: \"90723afd-9aa0-4e3a-81dd-91aa0b34bc19\") " pod="openshift-marketplace/community-operators-6kvd2" Dec 02 18:51:43 crc kubenswrapper[4878]: I1202 18:51:43.975449 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90723afd-9aa0-4e3a-81dd-91aa0b34bc19-catalog-content\") pod \"community-operators-6kvd2\" (UID: \"90723afd-9aa0-4e3a-81dd-91aa0b34bc19\") " pod="openshift-marketplace/community-operators-6kvd2" Dec 02 18:51:43 crc kubenswrapper[4878]: I1202 18:51:43.996020 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfb42\" (UniqueName: \"kubernetes.io/projected/90723afd-9aa0-4e3a-81dd-91aa0b34bc19-kube-api-access-gfb42\") pod \"community-operators-6kvd2\" (UID: \"90723afd-9aa0-4e3a-81dd-91aa0b34bc19\") " pod="openshift-marketplace/community-operators-6kvd2" Dec 02 18:51:44 crc kubenswrapper[4878]: I1202 18:51:44.102582 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6kvd2" Dec 02 18:51:44 crc kubenswrapper[4878]: W1202 18:51:44.678898 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90723afd_9aa0_4e3a_81dd_91aa0b34bc19.slice/crio-3c2c0f7c88cc28c9519296eea6003a933254d63c74cbcd330135f8fe3c1cff3b WatchSource:0}: Error finding container 3c2c0f7c88cc28c9519296eea6003a933254d63c74cbcd330135f8fe3c1cff3b: Status 404 returned error can't find the container with id 3c2c0f7c88cc28c9519296eea6003a933254d63c74cbcd330135f8fe3c1cff3b Dec 02 18:51:44 crc kubenswrapper[4878]: I1202 18:51:44.682920 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6kvd2"] Dec 02 18:51:45 crc kubenswrapper[4878]: I1202 18:51:45.114296 4878 generic.go:334] "Generic (PLEG): container finished" podID="90723afd-9aa0-4e3a-81dd-91aa0b34bc19" containerID="e72f845c8e339b15a9cbdf670fef8d9fcf0f65f09012f4b392afb214025775d2" exitCode=0 Dec 02 18:51:45 crc kubenswrapper[4878]: I1202 18:51:45.114390 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6kvd2" event={"ID":"90723afd-9aa0-4e3a-81dd-91aa0b34bc19","Type":"ContainerDied","Data":"e72f845c8e339b15a9cbdf670fef8d9fcf0f65f09012f4b392afb214025775d2"} Dec 02 18:51:45 crc kubenswrapper[4878]: I1202 18:51:45.114654 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6kvd2" event={"ID":"90723afd-9aa0-4e3a-81dd-91aa0b34bc19","Type":"ContainerStarted","Data":"3c2c0f7c88cc28c9519296eea6003a933254d63c74cbcd330135f8fe3c1cff3b"} Dec 02 18:51:46 crc kubenswrapper[4878]: I1202 18:51:46.130634 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6kvd2" 
event={"ID":"90723afd-9aa0-4e3a-81dd-91aa0b34bc19","Type":"ContainerStarted","Data":"2fc6adf1625dec7a5d8fbbf1ccc957d58e7f228396d125b038f2a09fa0f4ca66"} Dec 02 18:51:48 crc kubenswrapper[4878]: I1202 18:51:48.164963 4878 generic.go:334] "Generic (PLEG): container finished" podID="90723afd-9aa0-4e3a-81dd-91aa0b34bc19" containerID="2fc6adf1625dec7a5d8fbbf1ccc957d58e7f228396d125b038f2a09fa0f4ca66" exitCode=0 Dec 02 18:51:48 crc kubenswrapper[4878]: I1202 18:51:48.165043 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6kvd2" event={"ID":"90723afd-9aa0-4e3a-81dd-91aa0b34bc19","Type":"ContainerDied","Data":"2fc6adf1625dec7a5d8fbbf1ccc957d58e7f228396d125b038f2a09fa0f4ca66"} Dec 02 18:51:49 crc kubenswrapper[4878]: I1202 18:51:49.181463 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6kvd2" event={"ID":"90723afd-9aa0-4e3a-81dd-91aa0b34bc19","Type":"ContainerStarted","Data":"46b6ed6000bf4a682a0900e1d2edec9388802ccf024ac0f8edb0c2a891645b2c"} Dec 02 18:51:49 crc kubenswrapper[4878]: I1202 18:51:49.215191 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6kvd2" podStartSLOduration=2.737588953 podStartE2EDuration="6.215162405s" podCreationTimestamp="2025-12-02 18:51:43 +0000 UTC" firstStartedPulling="2025-12-02 18:51:45.115785154 +0000 UTC m=+2214.805404035" lastFinishedPulling="2025-12-02 18:51:48.593358606 +0000 UTC m=+2218.282977487" observedRunningTime="2025-12-02 18:51:49.205601149 +0000 UTC m=+2218.895220020" watchObservedRunningTime="2025-12-02 18:51:49.215162405 +0000 UTC m=+2218.904781326" Dec 02 18:51:54 crc kubenswrapper[4878]: I1202 18:51:54.103390 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6kvd2" Dec 02 18:51:54 crc kubenswrapper[4878]: I1202 18:51:54.104069 4878 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-6kvd2" Dec 02 18:51:54 crc kubenswrapper[4878]: I1202 18:51:54.204094 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6kvd2" Dec 02 18:51:54 crc kubenswrapper[4878]: I1202 18:51:54.312442 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6kvd2" Dec 02 18:51:54 crc kubenswrapper[4878]: I1202 18:51:54.443414 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6kvd2"] Dec 02 18:51:56 crc kubenswrapper[4878]: I1202 18:51:56.282709 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6kvd2" podUID="90723afd-9aa0-4e3a-81dd-91aa0b34bc19" containerName="registry-server" containerID="cri-o://46b6ed6000bf4a682a0900e1d2edec9388802ccf024ac0f8edb0c2a891645b2c" gracePeriod=2 Dec 02 18:51:56 crc kubenswrapper[4878]: I1202 18:51:56.995616 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6kvd2" Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.167057 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90723afd-9aa0-4e3a-81dd-91aa0b34bc19-catalog-content\") pod \"90723afd-9aa0-4e3a-81dd-91aa0b34bc19\" (UID: \"90723afd-9aa0-4e3a-81dd-91aa0b34bc19\") " Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.167175 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90723afd-9aa0-4e3a-81dd-91aa0b34bc19-utilities\") pod \"90723afd-9aa0-4e3a-81dd-91aa0b34bc19\" (UID: \"90723afd-9aa0-4e3a-81dd-91aa0b34bc19\") " Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.167213 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfb42\" (UniqueName: \"kubernetes.io/projected/90723afd-9aa0-4e3a-81dd-91aa0b34bc19-kube-api-access-gfb42\") pod \"90723afd-9aa0-4e3a-81dd-91aa0b34bc19\" (UID: \"90723afd-9aa0-4e3a-81dd-91aa0b34bc19\") " Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.169041 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90723afd-9aa0-4e3a-81dd-91aa0b34bc19-utilities" (OuterVolumeSpecName: "utilities") pod "90723afd-9aa0-4e3a-81dd-91aa0b34bc19" (UID: "90723afd-9aa0-4e3a-81dd-91aa0b34bc19"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.179054 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90723afd-9aa0-4e3a-81dd-91aa0b34bc19-kube-api-access-gfb42" (OuterVolumeSpecName: "kube-api-access-gfb42") pod "90723afd-9aa0-4e3a-81dd-91aa0b34bc19" (UID: "90723afd-9aa0-4e3a-81dd-91aa0b34bc19"). InnerVolumeSpecName "kube-api-access-gfb42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.222690 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90723afd-9aa0-4e3a-81dd-91aa0b34bc19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90723afd-9aa0-4e3a-81dd-91aa0b34bc19" (UID: "90723afd-9aa0-4e3a-81dd-91aa0b34bc19"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.269952 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90723afd-9aa0-4e3a-81dd-91aa0b34bc19-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.269995 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90723afd-9aa0-4e3a-81dd-91aa0b34bc19-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.270008 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfb42\" (UniqueName: \"kubernetes.io/projected/90723afd-9aa0-4e3a-81dd-91aa0b34bc19-kube-api-access-gfb42\") on node \"crc\" DevicePath \"\"" Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.301726 4878 generic.go:334] "Generic (PLEG): container finished" podID="90723afd-9aa0-4e3a-81dd-91aa0b34bc19" containerID="46b6ed6000bf4a682a0900e1d2edec9388802ccf024ac0f8edb0c2a891645b2c" exitCode=0 Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.301772 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6kvd2" event={"ID":"90723afd-9aa0-4e3a-81dd-91aa0b34bc19","Type":"ContainerDied","Data":"46b6ed6000bf4a682a0900e1d2edec9388802ccf024ac0f8edb0c2a891645b2c"} Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.301799 4878 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-6kvd2" event={"ID":"90723afd-9aa0-4e3a-81dd-91aa0b34bc19","Type":"ContainerDied","Data":"3c2c0f7c88cc28c9519296eea6003a933254d63c74cbcd330135f8fe3c1cff3b"} Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.301821 4878 scope.go:117] "RemoveContainer" containerID="46b6ed6000bf4a682a0900e1d2edec9388802ccf024ac0f8edb0c2a891645b2c" Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.301972 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6kvd2" Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.348458 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6kvd2"] Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.357200 4878 scope.go:117] "RemoveContainer" containerID="2fc6adf1625dec7a5d8fbbf1ccc957d58e7f228396d125b038f2a09fa0f4ca66" Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.358556 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6kvd2"] Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.393879 4878 scope.go:117] "RemoveContainer" containerID="e72f845c8e339b15a9cbdf670fef8d9fcf0f65f09012f4b392afb214025775d2" Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.464980 4878 scope.go:117] "RemoveContainer" containerID="46b6ed6000bf4a682a0900e1d2edec9388802ccf024ac0f8edb0c2a891645b2c" Dec 02 18:51:57 crc kubenswrapper[4878]: E1202 18:51:57.465743 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46b6ed6000bf4a682a0900e1d2edec9388802ccf024ac0f8edb0c2a891645b2c\": container with ID starting with 46b6ed6000bf4a682a0900e1d2edec9388802ccf024ac0f8edb0c2a891645b2c not found: ID does not exist" containerID="46b6ed6000bf4a682a0900e1d2edec9388802ccf024ac0f8edb0c2a891645b2c" Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 
18:51:57.465790 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46b6ed6000bf4a682a0900e1d2edec9388802ccf024ac0f8edb0c2a891645b2c"} err="failed to get container status \"46b6ed6000bf4a682a0900e1d2edec9388802ccf024ac0f8edb0c2a891645b2c\": rpc error: code = NotFound desc = could not find container \"46b6ed6000bf4a682a0900e1d2edec9388802ccf024ac0f8edb0c2a891645b2c\": container with ID starting with 46b6ed6000bf4a682a0900e1d2edec9388802ccf024ac0f8edb0c2a891645b2c not found: ID does not exist" Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.465817 4878 scope.go:117] "RemoveContainer" containerID="2fc6adf1625dec7a5d8fbbf1ccc957d58e7f228396d125b038f2a09fa0f4ca66" Dec 02 18:51:57 crc kubenswrapper[4878]: E1202 18:51:57.466220 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fc6adf1625dec7a5d8fbbf1ccc957d58e7f228396d125b038f2a09fa0f4ca66\": container with ID starting with 2fc6adf1625dec7a5d8fbbf1ccc957d58e7f228396d125b038f2a09fa0f4ca66 not found: ID does not exist" containerID="2fc6adf1625dec7a5d8fbbf1ccc957d58e7f228396d125b038f2a09fa0f4ca66" Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.466269 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fc6adf1625dec7a5d8fbbf1ccc957d58e7f228396d125b038f2a09fa0f4ca66"} err="failed to get container status \"2fc6adf1625dec7a5d8fbbf1ccc957d58e7f228396d125b038f2a09fa0f4ca66\": rpc error: code = NotFound desc = could not find container \"2fc6adf1625dec7a5d8fbbf1ccc957d58e7f228396d125b038f2a09fa0f4ca66\": container with ID starting with 2fc6adf1625dec7a5d8fbbf1ccc957d58e7f228396d125b038f2a09fa0f4ca66 not found: ID does not exist" Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.466288 4878 scope.go:117] "RemoveContainer" containerID="e72f845c8e339b15a9cbdf670fef8d9fcf0f65f09012f4b392afb214025775d2" Dec 02 18:51:57 crc 
kubenswrapper[4878]: E1202 18:51:57.466620 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e72f845c8e339b15a9cbdf670fef8d9fcf0f65f09012f4b392afb214025775d2\": container with ID starting with e72f845c8e339b15a9cbdf670fef8d9fcf0f65f09012f4b392afb214025775d2 not found: ID does not exist" containerID="e72f845c8e339b15a9cbdf670fef8d9fcf0f65f09012f4b392afb214025775d2" Dec 02 18:51:57 crc kubenswrapper[4878]: I1202 18:51:57.466692 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e72f845c8e339b15a9cbdf670fef8d9fcf0f65f09012f4b392afb214025775d2"} err="failed to get container status \"e72f845c8e339b15a9cbdf670fef8d9fcf0f65f09012f4b392afb214025775d2\": rpc error: code = NotFound desc = could not find container \"e72f845c8e339b15a9cbdf670fef8d9fcf0f65f09012f4b392afb214025775d2\": container with ID starting with e72f845c8e339b15a9cbdf670fef8d9fcf0f65f09012f4b392afb214025775d2 not found: ID does not exist" Dec 02 18:51:58 crc kubenswrapper[4878]: I1202 18:51:58.957351 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90723afd-9aa0-4e3a-81dd-91aa0b34bc19" path="/var/lib/kubelet/pods/90723afd-9aa0-4e3a-81dd-91aa0b34bc19/volumes" Dec 02 18:52:03 crc kubenswrapper[4878]: I1202 18:52:03.870301 4878 scope.go:117] "RemoveContainer" containerID="f07e4edfb83891c4e9b72db710b90e724310d12f096e9b918491ad8dbbcc45a2" Dec 02 18:52:06 crc kubenswrapper[4878]: I1202 18:52:06.425917 4878 generic.go:334] "Generic (PLEG): container finished" podID="8714a5c4-b7c2-4e8a-a112-2530648da63b" containerID="054a01b505e854d4a0fac67bef3e808e51f50632ec2fd1329b42c3bab7556096" exitCode=0 Dec 02 18:52:06 crc kubenswrapper[4878]: I1202 18:52:06.425982 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94" 
event={"ID":"8714a5c4-b7c2-4e8a-a112-2530648da63b","Type":"ContainerDied","Data":"054a01b505e854d4a0fac67bef3e808e51f50632ec2fd1329b42c3bab7556096"} Dec 02 18:52:07 crc kubenswrapper[4878]: I1202 18:52:07.975506 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.090794 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8714a5c4-b7c2-4e8a-a112-2530648da63b-ssh-key\") pod \"8714a5c4-b7c2-4e8a-a112-2530648da63b\" (UID: \"8714a5c4-b7c2-4e8a-a112-2530648da63b\") " Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.090871 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8714a5c4-b7c2-4e8a-a112-2530648da63b-inventory\") pod \"8714a5c4-b7c2-4e8a-a112-2530648da63b\" (UID: \"8714a5c4-b7c2-4e8a-a112-2530648da63b\") " Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.091224 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdwcl\" (UniqueName: \"kubernetes.io/projected/8714a5c4-b7c2-4e8a-a112-2530648da63b-kube-api-access-tdwcl\") pod \"8714a5c4-b7c2-4e8a-a112-2530648da63b\" (UID: \"8714a5c4-b7c2-4e8a-a112-2530648da63b\") " Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.101322 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8714a5c4-b7c2-4e8a-a112-2530648da63b-kube-api-access-tdwcl" (OuterVolumeSpecName: "kube-api-access-tdwcl") pod "8714a5c4-b7c2-4e8a-a112-2530648da63b" (UID: "8714a5c4-b7c2-4e8a-a112-2530648da63b"). InnerVolumeSpecName "kube-api-access-tdwcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.124923 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8714a5c4-b7c2-4e8a-a112-2530648da63b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8714a5c4-b7c2-4e8a-a112-2530648da63b" (UID: "8714a5c4-b7c2-4e8a-a112-2530648da63b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.156168 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8714a5c4-b7c2-4e8a-a112-2530648da63b-inventory" (OuterVolumeSpecName: "inventory") pod "8714a5c4-b7c2-4e8a-a112-2530648da63b" (UID: "8714a5c4-b7c2-4e8a-a112-2530648da63b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.197797 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdwcl\" (UniqueName: \"kubernetes.io/projected/8714a5c4-b7c2-4e8a-a112-2530648da63b-kube-api-access-tdwcl\") on node \"crc\" DevicePath \"\"" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.198835 4878 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8714a5c4-b7c2-4e8a-a112-2530648da63b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.198931 4878 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8714a5c4-b7c2-4e8a-a112-2530648da63b-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.461184 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94" 
event={"ID":"8714a5c4-b7c2-4e8a-a112-2530648da63b","Type":"ContainerDied","Data":"c53f01060752bb4c4756fff475f36c91702200ff6b935846c820ca3f3fdb650f"} Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.461266 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c53f01060752bb4c4756fff475f36c91702200ff6b935846c820ca3f3fdb650f" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.461355 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-t7g94" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.603814 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-t4tn6"] Dec 02 18:52:08 crc kubenswrapper[4878]: E1202 18:52:08.604514 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90723afd-9aa0-4e3a-81dd-91aa0b34bc19" containerName="extract-utilities" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.604537 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="90723afd-9aa0-4e3a-81dd-91aa0b34bc19" containerName="extract-utilities" Dec 02 18:52:08 crc kubenswrapper[4878]: E1202 18:52:08.604559 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90723afd-9aa0-4e3a-81dd-91aa0b34bc19" containerName="registry-server" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.604568 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="90723afd-9aa0-4e3a-81dd-91aa0b34bc19" containerName="registry-server" Dec 02 18:52:08 crc kubenswrapper[4878]: E1202 18:52:08.604584 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90723afd-9aa0-4e3a-81dd-91aa0b34bc19" containerName="extract-content" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.604596 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="90723afd-9aa0-4e3a-81dd-91aa0b34bc19" containerName="extract-content" Dec 02 18:52:08 crc kubenswrapper[4878]: E1202 
18:52:08.604621 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8714a5c4-b7c2-4e8a-a112-2530648da63b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.604632 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8714a5c4-b7c2-4e8a-a112-2530648da63b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.604952 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="8714a5c4-b7c2-4e8a-a112-2530648da63b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.604977 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="90723afd-9aa0-4e3a-81dd-91aa0b34bc19" containerName="registry-server" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.606056 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-t4tn6" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.609955 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szvc8" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.610098 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.610532 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.610614 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.612108 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-t4tn6\" (UID: \"8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b\") " pod="openstack/ssh-known-hosts-edpm-deployment-t4tn6" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.612214 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-t4tn6\" (UID: \"8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b\") " pod="openstack/ssh-known-hosts-edpm-deployment-t4tn6" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.612855 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8n46\" (UniqueName: \"kubernetes.io/projected/8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b-kube-api-access-w8n46\") pod \"ssh-known-hosts-edpm-deployment-t4tn6\" (UID: \"8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b\") " pod="openstack/ssh-known-hosts-edpm-deployment-t4tn6" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.623175 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-t4tn6"] Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.716359 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8n46\" (UniqueName: \"kubernetes.io/projected/8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b-kube-api-access-w8n46\") pod \"ssh-known-hosts-edpm-deployment-t4tn6\" (UID: \"8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b\") " pod="openstack/ssh-known-hosts-edpm-deployment-t4tn6" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.717212 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b-ssh-key-openstack-edpm-ipam\") pod 
\"ssh-known-hosts-edpm-deployment-t4tn6\" (UID: \"8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b\") " pod="openstack/ssh-known-hosts-edpm-deployment-t4tn6" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.717316 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-t4tn6\" (UID: \"8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b\") " pod="openstack/ssh-known-hosts-edpm-deployment-t4tn6" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.723278 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-t4tn6\" (UID: \"8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b\") " pod="openstack/ssh-known-hosts-edpm-deployment-t4tn6" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.723775 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-t4tn6\" (UID: \"8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b\") " pod="openstack/ssh-known-hosts-edpm-deployment-t4tn6" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.736991 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8n46\" (UniqueName: \"kubernetes.io/projected/8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b-kube-api-access-w8n46\") pod \"ssh-known-hosts-edpm-deployment-t4tn6\" (UID: \"8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b\") " pod="openstack/ssh-known-hosts-edpm-deployment-t4tn6" Dec 02 18:52:08 crc kubenswrapper[4878]: I1202 18:52:08.948931 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-t4tn6" Dec 02 18:52:09 crc kubenswrapper[4878]: I1202 18:52:09.603809 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-t4tn6"] Dec 02 18:52:09 crc kubenswrapper[4878]: W1202 18:52:09.612102 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b071f6b_2ee2_41ea_ab08_219cf6ecfc0b.slice/crio-c1de1d9bd119f708bb7b52e6256b86f607dab178e2b68c625d43a02e50510ad1 WatchSource:0}: Error finding container c1de1d9bd119f708bb7b52e6256b86f607dab178e2b68c625d43a02e50510ad1: Status 404 returned error can't find the container with id c1de1d9bd119f708bb7b52e6256b86f607dab178e2b68c625d43a02e50510ad1 Dec 02 18:52:10 crc kubenswrapper[4878]: I1202 18:52:10.484553 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-t4tn6" event={"ID":"8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b","Type":"ContainerStarted","Data":"61f152d11f50adc133506be653674158ee01ee3a0bbd37aa63ed8d258182e85c"} Dec 02 18:52:10 crc kubenswrapper[4878]: I1202 18:52:10.485158 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-t4tn6" event={"ID":"8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b","Type":"ContainerStarted","Data":"c1de1d9bd119f708bb7b52e6256b86f607dab178e2b68c625d43a02e50510ad1"} Dec 02 18:52:10 crc kubenswrapper[4878]: I1202 18:52:10.507681 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-t4tn6" podStartSLOduration=2.07711341 podStartE2EDuration="2.507640701s" podCreationTimestamp="2025-12-02 18:52:08 +0000 UTC" firstStartedPulling="2025-12-02 18:52:09.6159779 +0000 UTC m=+2239.305596791" lastFinishedPulling="2025-12-02 18:52:10.046505191 +0000 UTC m=+2239.736124082" observedRunningTime="2025-12-02 18:52:10.504088191 +0000 UTC m=+2240.193707072" 
watchObservedRunningTime="2025-12-02 18:52:10.507640701 +0000 UTC m=+2240.197259582" Dec 02 18:52:18 crc kubenswrapper[4878]: I1202 18:52:18.593506 4878 generic.go:334] "Generic (PLEG): container finished" podID="8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b" containerID="61f152d11f50adc133506be653674158ee01ee3a0bbd37aa63ed8d258182e85c" exitCode=0 Dec 02 18:52:18 crc kubenswrapper[4878]: I1202 18:52:18.593653 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-t4tn6" event={"ID":"8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b","Type":"ContainerDied","Data":"61f152d11f50adc133506be653674158ee01ee3a0bbd37aa63ed8d258182e85c"} Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.207473 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-t4tn6" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.279928 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8n46\" (UniqueName: \"kubernetes.io/projected/8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b-kube-api-access-w8n46\") pod \"8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b\" (UID: \"8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b\") " Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.280008 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b-inventory-0\") pod \"8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b\" (UID: \"8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b\") " Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.280127 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b-ssh-key-openstack-edpm-ipam\") pod \"8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b\" (UID: \"8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b\") " Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 
18:52:20.285439 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b-kube-api-access-w8n46" (OuterVolumeSpecName: "kube-api-access-w8n46") pod "8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b" (UID: "8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b"). InnerVolumeSpecName "kube-api-access-w8n46". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.313404 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b" (UID: "8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.317057 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b" (UID: "8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.383514 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8n46\" (UniqueName: \"kubernetes.io/projected/8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b-kube-api-access-w8n46\") on node \"crc\" DevicePath \"\"" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.383755 4878 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.383989 4878 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.629663 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-t4tn6" event={"ID":"8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b","Type":"ContainerDied","Data":"c1de1d9bd119f708bb7b52e6256b86f607dab178e2b68c625d43a02e50510ad1"} Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.629725 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1de1d9bd119f708bb7b52e6256b86f607dab178e2b68c625d43a02e50510ad1" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.629762 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-t4tn6" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.721384 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2"] Dec 02 18:52:20 crc kubenswrapper[4878]: E1202 18:52:20.721966 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b" containerName="ssh-known-hosts-edpm-deployment" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.721984 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b" containerName="ssh-known-hosts-edpm-deployment" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.722260 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b" containerName="ssh-known-hosts-edpm-deployment" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.723141 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.726993 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.727086 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.727196 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.727486 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szvc8" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.737859 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2"] Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.796569 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e32e5051-b0ff-4fee-9268-266c4cc38c68-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zstf2\" (UID: \"e32e5051-b0ff-4fee-9268-266c4cc38c68\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.796623 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e32e5051-b0ff-4fee-9268-266c4cc38c68-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zstf2\" (UID: \"e32e5051-b0ff-4fee-9268-266c4cc38c68\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.796731 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrdxf\" (UniqueName: \"kubernetes.io/projected/e32e5051-b0ff-4fee-9268-266c4cc38c68-kube-api-access-mrdxf\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zstf2\" (UID: \"e32e5051-b0ff-4fee-9268-266c4cc38c68\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.900918 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e32e5051-b0ff-4fee-9268-266c4cc38c68-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zstf2\" (UID: \"e32e5051-b0ff-4fee-9268-266c4cc38c68\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.901020 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e32e5051-b0ff-4fee-9268-266c4cc38c68-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zstf2\" (UID: \"e32e5051-b0ff-4fee-9268-266c4cc38c68\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.901082 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrdxf\" (UniqueName: \"kubernetes.io/projected/e32e5051-b0ff-4fee-9268-266c4cc38c68-kube-api-access-mrdxf\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zstf2\" (UID: \"e32e5051-b0ff-4fee-9268-266c4cc38c68\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.907306 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e32e5051-b0ff-4fee-9268-266c4cc38c68-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zstf2\" (UID: \"e32e5051-b0ff-4fee-9268-266c4cc38c68\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.907809 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e32e5051-b0ff-4fee-9268-266c4cc38c68-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zstf2\" (UID: \"e32e5051-b0ff-4fee-9268-266c4cc38c68\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2" Dec 02 18:52:20 crc kubenswrapper[4878]: I1202 18:52:20.926140 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrdxf\" (UniqueName: \"kubernetes.io/projected/e32e5051-b0ff-4fee-9268-266c4cc38c68-kube-api-access-mrdxf\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zstf2\" (UID: \"e32e5051-b0ff-4fee-9268-266c4cc38c68\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2" Dec 02 18:52:21 crc kubenswrapper[4878]: I1202 18:52:21.061151 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2" Dec 02 18:52:21 crc kubenswrapper[4878]: I1202 18:52:21.666884 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2"] Dec 02 18:52:22 crc kubenswrapper[4878]: I1202 18:52:22.653147 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2" event={"ID":"e32e5051-b0ff-4fee-9268-266c4cc38c68","Type":"ContainerStarted","Data":"30138e51771fcf560e54e05d83d4002a151c98be62b0ec684b83e402fd1b9540"} Dec 02 18:52:22 crc kubenswrapper[4878]: I1202 18:52:22.653770 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2" event={"ID":"e32e5051-b0ff-4fee-9268-266c4cc38c68","Type":"ContainerStarted","Data":"beb85081b08de4eb598c18d114994e7ed2aae8cde1086aade8a322dd7a26b863"} Dec 02 18:52:22 crc kubenswrapper[4878]: I1202 18:52:22.678288 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2" podStartSLOduration=1.9875041260000001 podStartE2EDuration="2.678269611s" podCreationTimestamp="2025-12-02 18:52:20 +0000 UTC" firstStartedPulling="2025-12-02 18:52:21.670668788 +0000 UTC m=+2251.360287669" lastFinishedPulling="2025-12-02 18:52:22.361434273 +0000 UTC m=+2252.051053154" observedRunningTime="2025-12-02 18:52:22.674494905 +0000 UTC m=+2252.364113796" watchObservedRunningTime="2025-12-02 18:52:22.678269611 +0000 UTC m=+2252.367888502" Dec 02 18:52:31 crc kubenswrapper[4878]: I1202 18:52:31.791331 4878 generic.go:334] "Generic (PLEG): container finished" podID="e32e5051-b0ff-4fee-9268-266c4cc38c68" containerID="30138e51771fcf560e54e05d83d4002a151c98be62b0ec684b83e402fd1b9540" exitCode=0 Dec 02 18:52:31 crc kubenswrapper[4878]: I1202 18:52:31.791828 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2" event={"ID":"e32e5051-b0ff-4fee-9268-266c4cc38c68","Type":"ContainerDied","Data":"30138e51771fcf560e54e05d83d4002a151c98be62b0ec684b83e402fd1b9540"} Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.425623 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2" Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.594782 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrdxf\" (UniqueName: \"kubernetes.io/projected/e32e5051-b0ff-4fee-9268-266c4cc38c68-kube-api-access-mrdxf\") pod \"e32e5051-b0ff-4fee-9268-266c4cc38c68\" (UID: \"e32e5051-b0ff-4fee-9268-266c4cc38c68\") " Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.594977 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e32e5051-b0ff-4fee-9268-266c4cc38c68-ssh-key\") pod \"e32e5051-b0ff-4fee-9268-266c4cc38c68\" (UID: \"e32e5051-b0ff-4fee-9268-266c4cc38c68\") " Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.595007 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e32e5051-b0ff-4fee-9268-266c4cc38c68-inventory\") pod \"e32e5051-b0ff-4fee-9268-266c4cc38c68\" (UID: \"e32e5051-b0ff-4fee-9268-266c4cc38c68\") " Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.604780 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e32e5051-b0ff-4fee-9268-266c4cc38c68-kube-api-access-mrdxf" (OuterVolumeSpecName: "kube-api-access-mrdxf") pod "e32e5051-b0ff-4fee-9268-266c4cc38c68" (UID: "e32e5051-b0ff-4fee-9268-266c4cc38c68"). InnerVolumeSpecName "kube-api-access-mrdxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.655532 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e32e5051-b0ff-4fee-9268-266c4cc38c68-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e32e5051-b0ff-4fee-9268-266c4cc38c68" (UID: "e32e5051-b0ff-4fee-9268-266c4cc38c68"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.663800 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e32e5051-b0ff-4fee-9268-266c4cc38c68-inventory" (OuterVolumeSpecName: "inventory") pod "e32e5051-b0ff-4fee-9268-266c4cc38c68" (UID: "e32e5051-b0ff-4fee-9268-266c4cc38c68"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.702046 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrdxf\" (UniqueName: \"kubernetes.io/projected/e32e5051-b0ff-4fee-9268-266c4cc38c68-kube-api-access-mrdxf\") on node \"crc\" DevicePath \"\"" Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.702113 4878 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e32e5051-b0ff-4fee-9268-266c4cc38c68-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.702139 4878 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e32e5051-b0ff-4fee-9268-266c4cc38c68-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.825766 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2" 
event={"ID":"e32e5051-b0ff-4fee-9268-266c4cc38c68","Type":"ContainerDied","Data":"beb85081b08de4eb598c18d114994e7ed2aae8cde1086aade8a322dd7a26b863"} Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.825810 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zstf2" Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.825822 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="beb85081b08de4eb598c18d114994e7ed2aae8cde1086aade8a322dd7a26b863" Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.940905 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n"] Dec 02 18:52:33 crc kubenswrapper[4878]: E1202 18:52:33.941437 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e32e5051-b0ff-4fee-9268-266c4cc38c68" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.941452 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="e32e5051-b0ff-4fee-9268-266c4cc38c68" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.941660 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="e32e5051-b0ff-4fee-9268-266c4cc38c68" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.942481 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n" Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.946002 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szvc8" Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.946397 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.946748 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.947164 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 18:52:33 crc kubenswrapper[4878]: I1202 18:52:33.974542 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n"] Dec 02 18:52:34 crc kubenswrapper[4878]: I1202 18:52:34.009967 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1a288af-a20b-4e48-a331-561e16e01989-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n\" (UID: \"f1a288af-a20b-4e48-a331-561e16e01989\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n" Dec 02 18:52:34 crc kubenswrapper[4878]: I1202 18:52:34.010168 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86j26\" (UniqueName: \"kubernetes.io/projected/f1a288af-a20b-4e48-a331-561e16e01989-kube-api-access-86j26\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n\" (UID: \"f1a288af-a20b-4e48-a331-561e16e01989\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n" Dec 02 18:52:34 crc kubenswrapper[4878]: I1202 18:52:34.010201 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1a288af-a20b-4e48-a331-561e16e01989-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n\" (UID: \"f1a288af-a20b-4e48-a331-561e16e01989\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n" Dec 02 18:52:34 crc kubenswrapper[4878]: I1202 18:52:34.111868 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86j26\" (UniqueName: \"kubernetes.io/projected/f1a288af-a20b-4e48-a331-561e16e01989-kube-api-access-86j26\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n\" (UID: \"f1a288af-a20b-4e48-a331-561e16e01989\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n" Dec 02 18:52:34 crc kubenswrapper[4878]: I1202 18:52:34.111920 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1a288af-a20b-4e48-a331-561e16e01989-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n\" (UID: \"f1a288af-a20b-4e48-a331-561e16e01989\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n" Dec 02 18:52:34 crc kubenswrapper[4878]: I1202 18:52:34.112052 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1a288af-a20b-4e48-a331-561e16e01989-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n\" (UID: \"f1a288af-a20b-4e48-a331-561e16e01989\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n" Dec 02 18:52:34 crc kubenswrapper[4878]: I1202 18:52:34.116546 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1a288af-a20b-4e48-a331-561e16e01989-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n\" (UID: \"f1a288af-a20b-4e48-a331-561e16e01989\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n" Dec 02 18:52:34 crc kubenswrapper[4878]: I1202 18:52:34.117475 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1a288af-a20b-4e48-a331-561e16e01989-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n\" (UID: \"f1a288af-a20b-4e48-a331-561e16e01989\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n" Dec 02 18:52:34 crc kubenswrapper[4878]: I1202 18:52:34.144510 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86j26\" (UniqueName: \"kubernetes.io/projected/f1a288af-a20b-4e48-a331-561e16e01989-kube-api-access-86j26\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n\" (UID: \"f1a288af-a20b-4e48-a331-561e16e01989\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n" Dec 02 18:52:34 crc kubenswrapper[4878]: I1202 18:52:34.290403 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n" Dec 02 18:52:34 crc kubenswrapper[4878]: I1202 18:52:34.914063 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n"] Dec 02 18:52:35 crc kubenswrapper[4878]: I1202 18:52:35.854941 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n" event={"ID":"f1a288af-a20b-4e48-a331-561e16e01989","Type":"ContainerStarted","Data":"035b2c6d41abbe189691a8d1a79da5d2a2b5bde21fc07774eab879645fbf95cf"} Dec 02 18:52:35 crc kubenswrapper[4878]: I1202 18:52:35.855611 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n" event={"ID":"f1a288af-a20b-4e48-a331-561e16e01989","Type":"ContainerStarted","Data":"b91413d7ca38781e03b993a1d346c700827beae7fe82e5d05bb8a771ce787b54"} Dec 02 18:52:35 crc kubenswrapper[4878]: I1202 18:52:35.895553 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n" podStartSLOduration=2.38517489 podStartE2EDuration="2.895526385s" podCreationTimestamp="2025-12-02 18:52:33 +0000 UTC" firstStartedPulling="2025-12-02 18:52:34.915269459 +0000 UTC m=+2264.604888340" lastFinishedPulling="2025-12-02 18:52:35.425620904 +0000 UTC m=+2265.115239835" observedRunningTime="2025-12-02 18:52:35.875831455 +0000 UTC m=+2265.565450386" watchObservedRunningTime="2025-12-02 18:52:35.895526385 +0000 UTC m=+2265.585145286" Dec 02 18:52:39 crc kubenswrapper[4878]: I1202 18:52:39.069864 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-fchr4"] Dec 02 18:52:39 crc kubenswrapper[4878]: I1202 18:52:39.086364 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-fchr4"] Dec 02 18:52:40 crc kubenswrapper[4878]: I1202 18:52:40.958758 4878 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8de0b145-008a-4a41-aa97-cb01f30d946f" path="/var/lib/kubelet/pods/8de0b145-008a-4a41-aa97-cb01f30d946f/volumes" Dec 02 18:52:46 crc kubenswrapper[4878]: I1202 18:52:46.000476 4878 generic.go:334] "Generic (PLEG): container finished" podID="f1a288af-a20b-4e48-a331-561e16e01989" containerID="035b2c6d41abbe189691a8d1a79da5d2a2b5bde21fc07774eab879645fbf95cf" exitCode=0 Dec 02 18:52:46 crc kubenswrapper[4878]: I1202 18:52:46.000528 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n" event={"ID":"f1a288af-a20b-4e48-a331-561e16e01989","Type":"ContainerDied","Data":"035b2c6d41abbe189691a8d1a79da5d2a2b5bde21fc07774eab879645fbf95cf"} Dec 02 18:52:47 crc kubenswrapper[4878]: I1202 18:52:47.510580 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n" Dec 02 18:52:47 crc kubenswrapper[4878]: I1202 18:52:47.697553 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1a288af-a20b-4e48-a331-561e16e01989-inventory\") pod \"f1a288af-a20b-4e48-a331-561e16e01989\" (UID: \"f1a288af-a20b-4e48-a331-561e16e01989\") " Dec 02 18:52:47 crc kubenswrapper[4878]: I1202 18:52:47.697975 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1a288af-a20b-4e48-a331-561e16e01989-ssh-key\") pod \"f1a288af-a20b-4e48-a331-561e16e01989\" (UID: \"f1a288af-a20b-4e48-a331-561e16e01989\") " Dec 02 18:52:47 crc kubenswrapper[4878]: I1202 18:52:47.698157 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86j26\" (UniqueName: \"kubernetes.io/projected/f1a288af-a20b-4e48-a331-561e16e01989-kube-api-access-86j26\") pod \"f1a288af-a20b-4e48-a331-561e16e01989\" (UID: 
\"f1a288af-a20b-4e48-a331-561e16e01989\") " Dec 02 18:52:47 crc kubenswrapper[4878]: I1202 18:52:47.710086 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a288af-a20b-4e48-a331-561e16e01989-kube-api-access-86j26" (OuterVolumeSpecName: "kube-api-access-86j26") pod "f1a288af-a20b-4e48-a331-561e16e01989" (UID: "f1a288af-a20b-4e48-a331-561e16e01989"). InnerVolumeSpecName "kube-api-access-86j26". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:52:47 crc kubenswrapper[4878]: I1202 18:52:47.755899 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a288af-a20b-4e48-a331-561e16e01989-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f1a288af-a20b-4e48-a331-561e16e01989" (UID: "f1a288af-a20b-4e48-a331-561e16e01989"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:52:47 crc kubenswrapper[4878]: I1202 18:52:47.757417 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a288af-a20b-4e48-a331-561e16e01989-inventory" (OuterVolumeSpecName: "inventory") pod "f1a288af-a20b-4e48-a331-561e16e01989" (UID: "f1a288af-a20b-4e48-a331-561e16e01989"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:52:47 crc kubenswrapper[4878]: I1202 18:52:47.801182 4878 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1a288af-a20b-4e48-a331-561e16e01989-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 18:52:47 crc kubenswrapper[4878]: I1202 18:52:47.801221 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86j26\" (UniqueName: \"kubernetes.io/projected/f1a288af-a20b-4e48-a331-561e16e01989-kube-api-access-86j26\") on node \"crc\" DevicePath \"\"" Dec 02 18:52:47 crc kubenswrapper[4878]: I1202 18:52:47.801252 4878 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1a288af-a20b-4e48-a331-561e16e01989-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.022308 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n" event={"ID":"f1a288af-a20b-4e48-a331-561e16e01989","Type":"ContainerDied","Data":"b91413d7ca38781e03b993a1d346c700827beae7fe82e5d05bb8a771ce787b54"} Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.022749 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b91413d7ca38781e03b993a1d346c700827beae7fe82e5d05bb8a771ce787b54" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.022393 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.163496 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj"] Dec 02 18:52:48 crc kubenswrapper[4878]: E1202 18:52:48.170668 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a288af-a20b-4e48-a331-561e16e01989" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.170710 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a288af-a20b-4e48-a331-561e16e01989" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.172552 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a288af-a20b-4e48-a331-561e16e01989" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.194958 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.200618 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.201532 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.201753 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.201847 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.201962 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.202004 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.202043 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.202121 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.202500 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szvc8" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.229882 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj"] Dec 02 18:52:48 crc 
kubenswrapper[4878]: I1202 18:52:48.313980 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.314055 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.314396 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.314634 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.314754 4878 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.314793 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.314914 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.314960 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.314999 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.315072 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.315201 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmjv6\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-kube-api-access-nmjv6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.315292 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.315355 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.315405 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.315556 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.316190 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 
18:52:48.418882 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.418941 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.418964 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.418992 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.419014 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.419032 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.419053 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.419122 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmjv6\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-kube-api-access-nmjv6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.419142 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.420090 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.420127 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.420153 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.420290 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.420325 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.420345 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.420402 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.426056 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-bootstrap-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.427741 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.428114 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.428703 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.429651 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.430139 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.430707 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.430730 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.431464 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc 
kubenswrapper[4878]: I1202 18:52:48.433658 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.435339 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.435427 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.435868 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.442800 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.448129 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.454698 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmjv6\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-kube-api-access-nmjv6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:48 crc kubenswrapper[4878]: I1202 18:52:48.533821 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:52:49 crc kubenswrapper[4878]: I1202 18:52:49.186401 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj"] Dec 02 18:52:50 crc kubenswrapper[4878]: I1202 18:52:50.055185 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" event={"ID":"7ca1da86-eacb-4ac4-a155-62de0292cbdf","Type":"ContainerStarted","Data":"4b9fe9e18ee15645279a5e46d48b8c4339676befa9a77718fffb9b4ac7f0e68e"} Dec 02 18:52:50 crc kubenswrapper[4878]: I1202 18:52:50.056308 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" event={"ID":"7ca1da86-eacb-4ac4-a155-62de0292cbdf","Type":"ContainerStarted","Data":"a44f4805f435bc3d2c01bc45ef9d237737bc514b6407c42170102348d8e226d1"} Dec 02 18:52:50 crc kubenswrapper[4878]: I1202 18:52:50.102298 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" podStartSLOduration=1.614190195 podStartE2EDuration="2.10227359s" podCreationTimestamp="2025-12-02 18:52:48 +0000 UTC" firstStartedPulling="2025-12-02 18:52:49.199062652 +0000 UTC m=+2278.888681533" lastFinishedPulling="2025-12-02 18:52:49.687146047 +0000 UTC m=+2279.376764928" observedRunningTime="2025-12-02 18:52:50.086187602 +0000 UTC m=+2279.775806513" watchObservedRunningTime="2025-12-02 18:52:50.10227359 +0000 UTC m=+2279.791892481" Dec 02 18:53:04 crc kubenswrapper[4878]: I1202 18:53:04.010482 4878 scope.go:117] "RemoveContainer" containerID="1cdca8f6f5b8e9f527c3630f2eac732cef57ddfc8d0d2a9d3b0f7d2d9cdfc72d" Dec 02 18:53:23 crc kubenswrapper[4878]: I1202 18:53:23.052984 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-8jpmg"] Dec 02 18:53:23 crc kubenswrapper[4878]: 
I1202 18:53:23.063175 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-8jpmg"] Dec 02 18:53:24 crc kubenswrapper[4878]: I1202 18:53:24.955535 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e658259f-368d-4571-a236-2c9bd3c3d9c6" path="/var/lib/kubelet/pods/e658259f-368d-4571-a236-2c9bd3c3d9c6/volumes" Dec 02 18:53:37 crc kubenswrapper[4878]: I1202 18:53:37.718277 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" event={"ID":"7ca1da86-eacb-4ac4-a155-62de0292cbdf","Type":"ContainerDied","Data":"4b9fe9e18ee15645279a5e46d48b8c4339676befa9a77718fffb9b4ac7f0e68e"} Dec 02 18:53:37 crc kubenswrapper[4878]: I1202 18:53:37.718221 4878 generic.go:334] "Generic (PLEG): container finished" podID="7ca1da86-eacb-4ac4-a155-62de0292cbdf" containerID="4b9fe9e18ee15645279a5e46d48b8c4339676befa9a77718fffb9b4ac7f0e68e" exitCode=0 Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.207139 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.379027 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.379173 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-telemetry-power-monitoring-combined-ca-bundle\") pod \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.379272 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.379323 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmjv6\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-kube-api-access-nmjv6\") pod \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.379364 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-neutron-metadata-combined-ca-bundle\") pod \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.379405 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-nova-combined-ca-bundle\") pod \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.379440 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-bootstrap-combined-ca-bundle\") pod \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.379472 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.379518 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-repo-setup-combined-ca-bundle\") pod \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.379565 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-ovn-default-certs-0\") pod \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.379624 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-libvirt-combined-ca-bundle\") pod \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.379662 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-telemetry-combined-ca-bundle\") pod \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.379719 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-ssh-key\") pod \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.379824 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-ovn-combined-ca-bundle\") pod \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.379875 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"7ca1da86-eacb-4ac4-a155-62de0292cbdf\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.379912 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-inventory\") pod \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\" (UID: \"7ca1da86-eacb-4ac4-a155-62de0292cbdf\") " Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.385932 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7ca1da86-eacb-4ac4-a155-62de0292cbdf" (UID: "7ca1da86-eacb-4ac4-a155-62de0292cbdf"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.386547 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7ca1da86-eacb-4ac4-a155-62de0292cbdf" (UID: "7ca1da86-eacb-4ac4-a155-62de0292cbdf"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.387598 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "7ca1da86-eacb-4ac4-a155-62de0292cbdf" (UID: "7ca1da86-eacb-4ac4-a155-62de0292cbdf"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.388654 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "7ca1da86-eacb-4ac4-a155-62de0292cbdf" (UID: "7ca1da86-eacb-4ac4-a155-62de0292cbdf"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.389155 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-kube-api-access-nmjv6" (OuterVolumeSpecName: "kube-api-access-nmjv6") pod "7ca1da86-eacb-4ac4-a155-62de0292cbdf" (UID: "7ca1da86-eacb-4ac4-a155-62de0292cbdf"). InnerVolumeSpecName "kube-api-access-nmjv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.391449 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7ca1da86-eacb-4ac4-a155-62de0292cbdf" (UID: "7ca1da86-eacb-4ac4-a155-62de0292cbdf"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.393031 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "7ca1da86-eacb-4ac4-a155-62de0292cbdf" (UID: "7ca1da86-eacb-4ac4-a155-62de0292cbdf"). 
InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.393306 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7ca1da86-eacb-4ac4-a155-62de0292cbdf" (UID: "7ca1da86-eacb-4ac4-a155-62de0292cbdf"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.400484 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "7ca1da86-eacb-4ac4-a155-62de0292cbdf" (UID: "7ca1da86-eacb-4ac4-a155-62de0292cbdf"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.400508 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7ca1da86-eacb-4ac4-a155-62de0292cbdf" (UID: "7ca1da86-eacb-4ac4-a155-62de0292cbdf"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.400548 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "7ca1da86-eacb-4ac4-a155-62de0292cbdf" (UID: "7ca1da86-eacb-4ac4-a155-62de0292cbdf"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.400679 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "7ca1da86-eacb-4ac4-a155-62de0292cbdf" (UID: "7ca1da86-eacb-4ac4-a155-62de0292cbdf"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.434049 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7ca1da86-eacb-4ac4-a155-62de0292cbdf" (UID: "7ca1da86-eacb-4ac4-a155-62de0292cbdf"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.439461 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "7ca1da86-eacb-4ac4-a155-62de0292cbdf" (UID: "7ca1da86-eacb-4ac4-a155-62de0292cbdf"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.458722 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7ca1da86-eacb-4ac4-a155-62de0292cbdf" (UID: "7ca1da86-eacb-4ac4-a155-62de0292cbdf"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.470553 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-inventory" (OuterVolumeSpecName: "inventory") pod "7ca1da86-eacb-4ac4-a155-62de0292cbdf" (UID: "7ca1da86-eacb-4ac4-a155-62de0292cbdf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.483207 4878 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.483279 4878 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.483291 4878 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.483303 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmjv6\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-kube-api-access-nmjv6\") on node \"crc\" DevicePath \"\"" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.483312 4878 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.483322 4878 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.483331 4878 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.483340 4878 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.483349 4878 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.483358 4878 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.483366 4878 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.483374 
4878 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.483382 4878 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.483390 4878 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.483400 4878 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7ca1da86-eacb-4ac4-a155-62de0292cbdf-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.483408 4878 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ca1da86-eacb-4ac4-a155-62de0292cbdf-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.747518 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" event={"ID":"7ca1da86-eacb-4ac4-a155-62de0292cbdf","Type":"ContainerDied","Data":"a44f4805f435bc3d2c01bc45ef9d237737bc514b6407c42170102348d8e226d1"} Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.747585 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a44f4805f435bc3d2c01bc45ef9d237737bc514b6407c42170102348d8e226d1" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.747590 4878 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.893250 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm"] Dec 02 18:53:39 crc kubenswrapper[4878]: E1202 18:53:39.893690 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca1da86-eacb-4ac4-a155-62de0292cbdf" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.893707 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca1da86-eacb-4ac4-a155-62de0292cbdf" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.893948 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ca1da86-eacb-4ac4-a155-62de0292cbdf" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.895032 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.897160 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.897261 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.897816 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.898097 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szvc8" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.902075 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.919099 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm"] Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.995324 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sm6fm\" (UID: \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.995381 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sm6fm\" (UID: \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.995408 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k55c9\" (UniqueName: \"kubernetes.io/projected/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-kube-api-access-k55c9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sm6fm\" (UID: \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.995488 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sm6fm\" (UID: \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" Dec 02 18:53:39 crc kubenswrapper[4878]: I1202 18:53:39.995556 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sm6fm\" (UID: \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" Dec 02 18:53:40 crc kubenswrapper[4878]: I1202 18:53:40.098586 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sm6fm\" (UID: \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" Dec 02 18:53:40 crc kubenswrapper[4878]: I1202 18:53:40.098700 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sm6fm\" (UID: \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" Dec 02 18:53:40 crc kubenswrapper[4878]: I1202 18:53:40.098751 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k55c9\" (UniqueName: \"kubernetes.io/projected/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-kube-api-access-k55c9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sm6fm\" (UID: \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" Dec 02 18:53:40 crc kubenswrapper[4878]: I1202 18:53:40.098876 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sm6fm\" (UID: \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" Dec 02 18:53:40 crc kubenswrapper[4878]: I1202 18:53:40.099063 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sm6fm\" (UID: \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" Dec 02 18:53:40 crc kubenswrapper[4878]: I1202 18:53:40.099764 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sm6fm\" (UID: \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" Dec 02 18:53:40 crc 
kubenswrapper[4878]: I1202 18:53:40.103202 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sm6fm\" (UID: \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" Dec 02 18:53:40 crc kubenswrapper[4878]: I1202 18:53:40.107695 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sm6fm\" (UID: \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" Dec 02 18:53:40 crc kubenswrapper[4878]: I1202 18:53:40.120632 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k55c9\" (UniqueName: \"kubernetes.io/projected/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-kube-api-access-k55c9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sm6fm\" (UID: \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" Dec 02 18:53:40 crc kubenswrapper[4878]: I1202 18:53:40.120831 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sm6fm\" (UID: \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" Dec 02 18:53:40 crc kubenswrapper[4878]: I1202 18:53:40.217300 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" Dec 02 18:53:41 crc kubenswrapper[4878]: I1202 18:53:40.614419 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm"] Dec 02 18:53:41 crc kubenswrapper[4878]: I1202 18:53:40.617514 4878 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 18:53:41 crc kubenswrapper[4878]: I1202 18:53:40.762603 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" event={"ID":"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22","Type":"ContainerStarted","Data":"9c1ad6d072bc40fcfe773e0b51a325e7816b28b4628424a533b86c0c95413b69"} Dec 02 18:53:41 crc kubenswrapper[4878]: I1202 18:53:41.781083 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" event={"ID":"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22","Type":"ContainerStarted","Data":"00da0322c78f594eaa647be3883e2aed626ca09b1c7814d78701b6b8b5558fc2"} Dec 02 18:53:41 crc kubenswrapper[4878]: I1202 18:53:41.807757 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" podStartSLOduration=2.351601704 podStartE2EDuration="2.807741998s" podCreationTimestamp="2025-12-02 18:53:39 +0000 UTC" firstStartedPulling="2025-12-02 18:53:40.616888747 +0000 UTC m=+2330.306507638" lastFinishedPulling="2025-12-02 18:53:41.073029041 +0000 UTC m=+2330.762647932" observedRunningTime="2025-12-02 18:53:41.805436197 +0000 UTC m=+2331.495055078" watchObservedRunningTime="2025-12-02 18:53:41.807741998 +0000 UTC m=+2331.497360879" Dec 02 18:53:53 crc kubenswrapper[4878]: I1202 18:53:53.742951 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:53:53 crc kubenswrapper[4878]: I1202 18:53:53.743492 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:54:04 crc kubenswrapper[4878]: I1202 18:54:04.091708 4878 scope.go:117] "RemoveContainer" containerID="94a15ab3becf2b9eded108dfd0e7e355cceb6b5b42271f8118e7a77886aaa08e" Dec 02 18:54:23 crc kubenswrapper[4878]: I1202 18:54:23.742841 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:54:23 crc kubenswrapper[4878]: I1202 18:54:23.743550 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 18:54:49 crc kubenswrapper[4878]: I1202 18:54:49.644786 4878 generic.go:334] "Generic (PLEG): container finished" podID="49eb68a8-72ac-4fb7-ab11-7e89f85e7f22" containerID="00da0322c78f594eaa647be3883e2aed626ca09b1c7814d78701b6b8b5558fc2" exitCode=0 Dec 02 18:54:49 crc kubenswrapper[4878]: I1202 18:54:49.644864 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" 
event={"ID":"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22","Type":"ContainerDied","Data":"00da0322c78f594eaa647be3883e2aed626ca09b1c7814d78701b6b8b5558fc2"} Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.254354 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.340707 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-ssh-key\") pod \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\" (UID: \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\") " Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.340772 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-inventory\") pod \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\" (UID: \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\") " Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.340877 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-ovn-combined-ca-bundle\") pod \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\" (UID: \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\") " Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.340909 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-ovncontroller-config-0\") pod \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\" (UID: \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\") " Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.341012 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k55c9\" (UniqueName: 
\"kubernetes.io/projected/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-kube-api-access-k55c9\") pod \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\" (UID: \"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22\") " Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.351992 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "49eb68a8-72ac-4fb7-ab11-7e89f85e7f22" (UID: "49eb68a8-72ac-4fb7-ab11-7e89f85e7f22"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.355720 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-kube-api-access-k55c9" (OuterVolumeSpecName: "kube-api-access-k55c9") pod "49eb68a8-72ac-4fb7-ab11-7e89f85e7f22" (UID: "49eb68a8-72ac-4fb7-ab11-7e89f85e7f22"). InnerVolumeSpecName "kube-api-access-k55c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.381159 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "49eb68a8-72ac-4fb7-ab11-7e89f85e7f22" (UID: "49eb68a8-72ac-4fb7-ab11-7e89f85e7f22"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.383427 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "49eb68a8-72ac-4fb7-ab11-7e89f85e7f22" (UID: "49eb68a8-72ac-4fb7-ab11-7e89f85e7f22"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.389663 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-inventory" (OuterVolumeSpecName: "inventory") pod "49eb68a8-72ac-4fb7-ab11-7e89f85e7f22" (UID: "49eb68a8-72ac-4fb7-ab11-7e89f85e7f22"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.445735 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k55c9\" (UniqueName: \"kubernetes.io/projected/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-kube-api-access-k55c9\") on node \"crc\" DevicePath \"\"" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.446083 4878 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.446287 4878 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.446461 4878 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.446608 4878 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/49eb68a8-72ac-4fb7-ab11-7e89f85e7f22-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.670789 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" event={"ID":"49eb68a8-72ac-4fb7-ab11-7e89f85e7f22","Type":"ContainerDied","Data":"9c1ad6d072bc40fcfe773e0b51a325e7816b28b4628424a533b86c0c95413b69"} Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.670865 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c1ad6d072bc40fcfe773e0b51a325e7816b28b4628424a533b86c0c95413b69" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.670863 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sm6fm" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.783798 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v"] Dec 02 18:54:51 crc kubenswrapper[4878]: E1202 18:54:51.784422 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49eb68a8-72ac-4fb7-ab11-7e89f85e7f22" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.784440 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="49eb68a8-72ac-4fb7-ab11-7e89f85e7f22" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.784742 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="49eb68a8-72ac-4fb7-ab11-7e89f85e7f22" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.790312 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.794920 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.795192 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.795732 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.795929 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.796015 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.796161 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szvc8" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.802547 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v"] Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.857940 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.858251 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.858365 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmct5\" (UniqueName: \"kubernetes.io/projected/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-kube-api-access-qmct5\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.858480 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.858577 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.858625 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.960948 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.961693 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.962115 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmct5\" (UniqueName: \"kubernetes.io/projected/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-kube-api-access-qmct5\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.962153 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.962196 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.962258 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.965911 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.967359 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.967629 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.969703 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.972646 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:54:51 crc kubenswrapper[4878]: I1202 18:54:51.980370 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmct5\" (UniqueName: \"kubernetes.io/projected/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-kube-api-access-qmct5\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:54:52 crc 
kubenswrapper[4878]: I1202 18:54:52.111404 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:54:52 crc kubenswrapper[4878]: I1202 18:54:52.867359 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v"] Dec 02 18:54:52 crc kubenswrapper[4878]: W1202 18:54:52.871127 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14ba7638_79c8_4806_9bc5_8b8c7ab029c3.slice/crio-24c1ed13a8b02634764ccfe9c41de9b6c51d6640dfe2acfb652043d9841483fb WatchSource:0}: Error finding container 24c1ed13a8b02634764ccfe9c41de9b6c51d6640dfe2acfb652043d9841483fb: Status 404 returned error can't find the container with id 24c1ed13a8b02634764ccfe9c41de9b6c51d6640dfe2acfb652043d9841483fb Dec 02 18:54:53 crc kubenswrapper[4878]: I1202 18:54:53.697595 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" event={"ID":"14ba7638-79c8-4806-9bc5-8b8c7ab029c3","Type":"ContainerStarted","Data":"24c1ed13a8b02634764ccfe9c41de9b6c51d6640dfe2acfb652043d9841483fb"} Dec 02 18:54:53 crc kubenswrapper[4878]: I1202 18:54:53.742503 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 18:54:53 crc kubenswrapper[4878]: I1202 18:54:53.742565 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Dec 02 18:54:53 crc kubenswrapper[4878]: I1202 18:54:53.742606 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 18:54:53 crc kubenswrapper[4878]: I1202 18:54:53.743463 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a"} pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 18:54:53 crc kubenswrapper[4878]: I1202 18:54:53.743517 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" gracePeriod=600 Dec 02 18:54:53 crc kubenswrapper[4878]: E1202 18:54:53.898286 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:54:54 crc kubenswrapper[4878]: I1202 18:54:54.712635 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" event={"ID":"14ba7638-79c8-4806-9bc5-8b8c7ab029c3","Type":"ContainerStarted","Data":"537cc97146e24cf337c4da0df83a5b0eec2735fe51aacafedf7795c502034664"} Dec 02 18:54:54 crc kubenswrapper[4878]: I1202 18:54:54.716477 4878 generic.go:334] "Generic 
(PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" exitCode=0 Dec 02 18:54:54 crc kubenswrapper[4878]: I1202 18:54:54.716597 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a"} Dec 02 18:54:54 crc kubenswrapper[4878]: I1202 18:54:54.716721 4878 scope.go:117] "RemoveContainer" containerID="5975f864ead5b21b2dbb178e245c083aa3ed8f628605531725f83f26b592f531" Dec 02 18:54:54 crc kubenswrapper[4878]: I1202 18:54:54.718859 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:54:54 crc kubenswrapper[4878]: E1202 18:54:54.719534 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:54:54 crc kubenswrapper[4878]: I1202 18:54:54.745789 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" podStartSLOduration=3.248033173 podStartE2EDuration="3.745766796s" podCreationTimestamp="2025-12-02 18:54:51 +0000 UTC" firstStartedPulling="2025-12-02 18:54:52.872956872 +0000 UTC m=+2402.562575753" lastFinishedPulling="2025-12-02 18:54:53.370690445 +0000 UTC m=+2403.060309376" observedRunningTime="2025-12-02 18:54:54.744656831 +0000 UTC m=+2404.434275722" watchObservedRunningTime="2025-12-02 18:54:54.745766796 +0000 UTC 
m=+2404.435385677" Dec 02 18:55:06 crc kubenswrapper[4878]: I1202 18:55:06.939280 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:55:06 crc kubenswrapper[4878]: E1202 18:55:06.940594 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:55:17 crc kubenswrapper[4878]: I1202 18:55:17.938212 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:55:17 crc kubenswrapper[4878]: E1202 18:55:17.939333 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:55:28 crc kubenswrapper[4878]: I1202 18:55:28.939632 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:55:28 crc kubenswrapper[4878]: E1202 18:55:28.940598 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" 
podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:55:41 crc kubenswrapper[4878]: I1202 18:55:41.938511 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:55:41 crc kubenswrapper[4878]: E1202 18:55:41.939621 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:55:45 crc kubenswrapper[4878]: I1202 18:55:45.476898 4878 generic.go:334] "Generic (PLEG): container finished" podID="14ba7638-79c8-4806-9bc5-8b8c7ab029c3" containerID="537cc97146e24cf337c4da0df83a5b0eec2735fe51aacafedf7795c502034664" exitCode=0 Dec 02 18:55:45 crc kubenswrapper[4878]: I1202 18:55:45.476966 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" event={"ID":"14ba7638-79c8-4806-9bc5-8b8c7ab029c3","Type":"ContainerDied","Data":"537cc97146e24cf337c4da0df83a5b0eec2735fe51aacafedf7795c502034664"} Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.016095 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.088824 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-inventory\") pod \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.088862 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmct5\" (UniqueName: \"kubernetes.io/projected/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-kube-api-access-qmct5\") pod \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.088893 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-ssh-key\") pod \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.088954 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-neutron-metadata-combined-ca-bundle\") pod \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.089047 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-neutron-ovn-metadata-agent-neutron-config-0\") pod \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 
18:55:47.089079 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-nova-metadata-neutron-config-0\") pod \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\" (UID: \"14ba7638-79c8-4806-9bc5-8b8c7ab029c3\") " Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.095169 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "14ba7638-79c8-4806-9bc5-8b8c7ab029c3" (UID: "14ba7638-79c8-4806-9bc5-8b8c7ab029c3"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.095396 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-kube-api-access-qmct5" (OuterVolumeSpecName: "kube-api-access-qmct5") pod "14ba7638-79c8-4806-9bc5-8b8c7ab029c3" (UID: "14ba7638-79c8-4806-9bc5-8b8c7ab029c3"). InnerVolumeSpecName "kube-api-access-qmct5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.128783 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-inventory" (OuterVolumeSpecName: "inventory") pod "14ba7638-79c8-4806-9bc5-8b8c7ab029c3" (UID: "14ba7638-79c8-4806-9bc5-8b8c7ab029c3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.130553 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "14ba7638-79c8-4806-9bc5-8b8c7ab029c3" (UID: "14ba7638-79c8-4806-9bc5-8b8c7ab029c3"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.130891 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "14ba7638-79c8-4806-9bc5-8b8c7ab029c3" (UID: "14ba7638-79c8-4806-9bc5-8b8c7ab029c3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.157917 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "14ba7638-79c8-4806-9bc5-8b8c7ab029c3" (UID: "14ba7638-79c8-4806-9bc5-8b8c7ab029c3"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.191372 4878 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.191410 4878 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.191421 4878 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.191432 4878 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.191441 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmct5\" (UniqueName: \"kubernetes.io/projected/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-kube-api-access-qmct5\") on node \"crc\" DevicePath \"\"" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.191450 4878 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14ba7638-79c8-4806-9bc5-8b8c7ab029c3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.508648 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" 
event={"ID":"14ba7638-79c8-4806-9bc5-8b8c7ab029c3","Type":"ContainerDied","Data":"24c1ed13a8b02634764ccfe9c41de9b6c51d6640dfe2acfb652043d9841483fb"} Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.508691 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24c1ed13a8b02634764ccfe9c41de9b6c51d6640dfe2acfb652043d9841483fb" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.508731 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.626708 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5"] Dec 02 18:55:47 crc kubenswrapper[4878]: E1202 18:55:47.627452 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ba7638-79c8-4806-9bc5-8b8c7ab029c3" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.627470 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ba7638-79c8-4806-9bc5-8b8c7ab029c3" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.627684 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ba7638-79c8-4806-9bc5-8b8c7ab029c3" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.628813 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.631967 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.632042 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.631970 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.632081 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.634162 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szvc8" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.649217 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5"] Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.803112 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5\" (UID: \"0b199858-7108-4b94-b3f9-692a11430c94\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.803167 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97hz7\" (UniqueName: \"kubernetes.io/projected/0b199858-7108-4b94-b3f9-692a11430c94-kube-api-access-97hz7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5\" (UID: 
\"0b199858-7108-4b94-b3f9-692a11430c94\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.803207 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5\" (UID: \"0b199858-7108-4b94-b3f9-692a11430c94\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.803261 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5\" (UID: \"0b199858-7108-4b94-b3f9-692a11430c94\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.803457 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5\" (UID: \"0b199858-7108-4b94-b3f9-692a11430c94\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.906055 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5\" (UID: \"0b199858-7108-4b94-b3f9-692a11430c94\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.906143 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5\" (UID: \"0b199858-7108-4b94-b3f9-692a11430c94\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.906434 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5\" (UID: \"0b199858-7108-4b94-b3f9-692a11430c94\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.906466 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97hz7\" (UniqueName: \"kubernetes.io/projected/0b199858-7108-4b94-b3f9-692a11430c94-kube-api-access-97hz7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5\" (UID: \"0b199858-7108-4b94-b3f9-692a11430c94\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.906506 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5\" (UID: \"0b199858-7108-4b94-b3f9-692a11430c94\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.911034 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5\" (UID: \"0b199858-7108-4b94-b3f9-692a11430c94\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" 
Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.911100 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5\" (UID: \"0b199858-7108-4b94-b3f9-692a11430c94\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.911810 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5\" (UID: \"0b199858-7108-4b94-b3f9-692a11430c94\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.924887 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97hz7\" (UniqueName: \"kubernetes.io/projected/0b199858-7108-4b94-b3f9-692a11430c94-kube-api-access-97hz7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5\" (UID: \"0b199858-7108-4b94-b3f9-692a11430c94\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.927704 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5\" (UID: \"0b199858-7108-4b94-b3f9-692a11430c94\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" Dec 02 18:55:47 crc kubenswrapper[4878]: I1202 18:55:47.945531 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" Dec 02 18:55:48 crc kubenswrapper[4878]: W1202 18:55:48.674614 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b199858_7108_4b94_b3f9_692a11430c94.slice/crio-fb3f7ed320b4fd3ffbdfdff6e63ddaf41be917312869e3ae8bf1ab4d23a8e079 WatchSource:0}: Error finding container fb3f7ed320b4fd3ffbdfdff6e63ddaf41be917312869e3ae8bf1ab4d23a8e079: Status 404 returned error can't find the container with id fb3f7ed320b4fd3ffbdfdff6e63ddaf41be917312869e3ae8bf1ab4d23a8e079 Dec 02 18:55:48 crc kubenswrapper[4878]: I1202 18:55:48.676645 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5"] Dec 02 18:55:49 crc kubenswrapper[4878]: I1202 18:55:49.544639 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" event={"ID":"0b199858-7108-4b94-b3f9-692a11430c94","Type":"ContainerStarted","Data":"932cced1af30af79035861a46d0910da1970d06a53f549120120dcb15e12843a"} Dec 02 18:55:49 crc kubenswrapper[4878]: I1202 18:55:49.545128 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" event={"ID":"0b199858-7108-4b94-b3f9-692a11430c94","Type":"ContainerStarted","Data":"fb3f7ed320b4fd3ffbdfdff6e63ddaf41be917312869e3ae8bf1ab4d23a8e079"} Dec 02 18:55:49 crc kubenswrapper[4878]: I1202 18:55:49.566045 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" podStartSLOduration=2.148779104 podStartE2EDuration="2.566024964s" podCreationTimestamp="2025-12-02 18:55:47 +0000 UTC" firstStartedPulling="2025-12-02 18:55:48.677821291 +0000 UTC m=+2458.367440212" lastFinishedPulling="2025-12-02 18:55:49.095067191 +0000 UTC m=+2458.784686072" 
observedRunningTime="2025-12-02 18:55:49.558374317 +0000 UTC m=+2459.247993198" watchObservedRunningTime="2025-12-02 18:55:49.566024964 +0000 UTC m=+2459.255643845" Dec 02 18:55:52 crc kubenswrapper[4878]: I1202 18:55:52.938146 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:55:52 crc kubenswrapper[4878]: E1202 18:55:52.938966 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:56:07 crc kubenswrapper[4878]: I1202 18:56:07.937618 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:56:07 crc kubenswrapper[4878]: E1202 18:56:07.938536 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:56:21 crc kubenswrapper[4878]: I1202 18:56:21.939758 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:56:21 crc kubenswrapper[4878]: E1202 18:56:21.940718 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:56:35 crc kubenswrapper[4878]: I1202 18:56:35.938889 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:56:35 crc kubenswrapper[4878]: E1202 18:56:35.939893 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:56:47 crc kubenswrapper[4878]: I1202 18:56:47.938860 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:56:47 crc kubenswrapper[4878]: E1202 18:56:47.940991 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:56:59 crc kubenswrapper[4878]: I1202 18:56:59.939349 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:56:59 crc kubenswrapper[4878]: E1202 18:56:59.940260 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:57:12 crc kubenswrapper[4878]: I1202 18:57:12.938884 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:57:12 crc kubenswrapper[4878]: E1202 18:57:12.941773 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:57:23 crc kubenswrapper[4878]: I1202 18:57:23.939579 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:57:23 crc kubenswrapper[4878]: E1202 18:57:23.940662 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:57:35 crc kubenswrapper[4878]: I1202 18:57:35.938212 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:57:35 crc kubenswrapper[4878]: E1202 18:57:35.938927 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:57:47 crc kubenswrapper[4878]: I1202 18:57:47.939699 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:57:47 crc kubenswrapper[4878]: E1202 18:57:47.940780 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:57:59 crc kubenswrapper[4878]: I1202 18:57:59.938392 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:57:59 crc kubenswrapper[4878]: E1202 18:57:59.939699 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:58:13 crc kubenswrapper[4878]: I1202 18:58:13.938591 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:58:13 crc kubenswrapper[4878]: E1202 18:58:13.939672 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:58:24 crc kubenswrapper[4878]: I1202 18:58:24.940184 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:58:24 crc kubenswrapper[4878]: E1202 18:58:24.940970 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:58:39 crc kubenswrapper[4878]: I1202 18:58:39.939086 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:58:39 crc kubenswrapper[4878]: E1202 18:58:39.940032 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:58:53 crc kubenswrapper[4878]: I1202 18:58:53.940824 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:58:53 crc kubenswrapper[4878]: E1202 18:58:53.942252 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:59:08 crc kubenswrapper[4878]: I1202 18:59:08.941443 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:59:08 crc kubenswrapper[4878]: E1202 18:59:08.942228 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:59:20 crc kubenswrapper[4878]: I1202 18:59:20.946104 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:59:20 crc kubenswrapper[4878]: E1202 18:59:20.946960 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:59:33 crc kubenswrapper[4878]: I1202 18:59:33.937779 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:59:33 crc kubenswrapper[4878]: E1202 18:59:33.938518 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:59:46 crc kubenswrapper[4878]: I1202 18:59:46.939312 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:59:46 crc kubenswrapper[4878]: E1202 18:59:46.940765 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 18:59:57 crc kubenswrapper[4878]: I1202 18:59:57.938293 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 18:59:59 crc kubenswrapper[4878]: I1202 18:59:59.160978 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"ad7be7ffe092b0b37b5f8e2f8bf030483233a5e9f6f947235c408c30f8fb13db"} Dec 02 19:00:00 crc kubenswrapper[4878]: I1202 19:00:00.168642 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd"] Dec 02 19:00:00 crc kubenswrapper[4878]: I1202 19:00:00.188743 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd"] Dec 02 19:00:00 crc kubenswrapper[4878]: I1202 19:00:00.188856 4878 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd" Dec 02 19:00:00 crc kubenswrapper[4878]: I1202 19:00:00.203295 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 19:00:00 crc kubenswrapper[4878]: I1202 19:00:00.206718 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 19:00:00 crc kubenswrapper[4878]: I1202 19:00:00.317983 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97d1d37d-a8b0-470c-b89a-4873b5253f38-config-volume\") pod \"collect-profiles-29411700-swkbd\" (UID: \"97d1d37d-a8b0-470c-b89a-4873b5253f38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd" Dec 02 19:00:00 crc kubenswrapper[4878]: I1202 19:00:00.318029 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7r6d\" (UniqueName: \"kubernetes.io/projected/97d1d37d-a8b0-470c-b89a-4873b5253f38-kube-api-access-j7r6d\") pod \"collect-profiles-29411700-swkbd\" (UID: \"97d1d37d-a8b0-470c-b89a-4873b5253f38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd" Dec 02 19:00:00 crc kubenswrapper[4878]: I1202 19:00:00.318287 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97d1d37d-a8b0-470c-b89a-4873b5253f38-secret-volume\") pod \"collect-profiles-29411700-swkbd\" (UID: \"97d1d37d-a8b0-470c-b89a-4873b5253f38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd" Dec 02 19:00:00 crc kubenswrapper[4878]: I1202 19:00:00.420814 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97d1d37d-a8b0-470c-b89a-4873b5253f38-secret-volume\") pod \"collect-profiles-29411700-swkbd\" (UID: \"97d1d37d-a8b0-470c-b89a-4873b5253f38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd" Dec 02 19:00:00 crc kubenswrapper[4878]: I1202 19:00:00.420932 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97d1d37d-a8b0-470c-b89a-4873b5253f38-config-volume\") pod \"collect-profiles-29411700-swkbd\" (UID: \"97d1d37d-a8b0-470c-b89a-4873b5253f38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd" Dec 02 19:00:00 crc kubenswrapper[4878]: I1202 19:00:00.420970 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7r6d\" (UniqueName: \"kubernetes.io/projected/97d1d37d-a8b0-470c-b89a-4873b5253f38-kube-api-access-j7r6d\") pod \"collect-profiles-29411700-swkbd\" (UID: \"97d1d37d-a8b0-470c-b89a-4873b5253f38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd" Dec 02 19:00:00 crc kubenswrapper[4878]: I1202 19:00:00.421766 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97d1d37d-a8b0-470c-b89a-4873b5253f38-config-volume\") pod \"collect-profiles-29411700-swkbd\" (UID: \"97d1d37d-a8b0-470c-b89a-4873b5253f38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd" Dec 02 19:00:00 crc kubenswrapper[4878]: I1202 19:00:00.429526 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97d1d37d-a8b0-470c-b89a-4873b5253f38-secret-volume\") pod \"collect-profiles-29411700-swkbd\" (UID: \"97d1d37d-a8b0-470c-b89a-4873b5253f38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd" Dec 02 19:00:00 crc kubenswrapper[4878]: 
I1202 19:00:00.443381 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7r6d\" (UniqueName: \"kubernetes.io/projected/97d1d37d-a8b0-470c-b89a-4873b5253f38-kube-api-access-j7r6d\") pod \"collect-profiles-29411700-swkbd\" (UID: \"97d1d37d-a8b0-470c-b89a-4873b5253f38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd" Dec 02 19:00:00 crc kubenswrapper[4878]: I1202 19:00:00.534916 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd" Dec 02 19:00:01 crc kubenswrapper[4878]: I1202 19:00:01.018556 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd"] Dec 02 19:00:01 crc kubenswrapper[4878]: W1202 19:00:01.019449 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97d1d37d_a8b0_470c_b89a_4873b5253f38.slice/crio-761756bbc94cfac3953e87ff50a40425c9c425a945646f538edc62737fd17051 WatchSource:0}: Error finding container 761756bbc94cfac3953e87ff50a40425c9c425a945646f538edc62737fd17051: Status 404 returned error can't find the container with id 761756bbc94cfac3953e87ff50a40425c9c425a945646f538edc62737fd17051 Dec 02 19:00:01 crc kubenswrapper[4878]: I1202 19:00:01.193008 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd" event={"ID":"97d1d37d-a8b0-470c-b89a-4873b5253f38","Type":"ContainerStarted","Data":"761756bbc94cfac3953e87ff50a40425c9c425a945646f538edc62737fd17051"} Dec 02 19:00:01 crc kubenswrapper[4878]: E1202 19:00:01.950307 4878 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97d1d37d_a8b0_470c_b89a_4873b5253f38.slice/crio-d5b5533a2c783737e8b175228613fd7c6b4bfca6360f7ee5b9e1dbed487762eb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97d1d37d_a8b0_470c_b89a_4873b5253f38.slice/crio-conmon-d5b5533a2c783737e8b175228613fd7c6b4bfca6360f7ee5b9e1dbed487762eb.scope\": RecentStats: unable to find data in memory cache]" Dec 02 19:00:02 crc kubenswrapper[4878]: I1202 19:00:02.205058 4878 generic.go:334] "Generic (PLEG): container finished" podID="97d1d37d-a8b0-470c-b89a-4873b5253f38" containerID="d5b5533a2c783737e8b175228613fd7c6b4bfca6360f7ee5b9e1dbed487762eb" exitCode=0 Dec 02 19:00:02 crc kubenswrapper[4878]: I1202 19:00:02.205162 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd" event={"ID":"97d1d37d-a8b0-470c-b89a-4873b5253f38","Type":"ContainerDied","Data":"d5b5533a2c783737e8b175228613fd7c6b4bfca6360f7ee5b9e1dbed487762eb"} Dec 02 19:00:03 crc kubenswrapper[4878]: I1202 19:00:03.686282 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd" Dec 02 19:00:03 crc kubenswrapper[4878]: I1202 19:00:03.718549 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97d1d37d-a8b0-470c-b89a-4873b5253f38-secret-volume\") pod \"97d1d37d-a8b0-470c-b89a-4873b5253f38\" (UID: \"97d1d37d-a8b0-470c-b89a-4873b5253f38\") " Dec 02 19:00:03 crc kubenswrapper[4878]: I1202 19:00:03.718953 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97d1d37d-a8b0-470c-b89a-4873b5253f38-config-volume\") pod \"97d1d37d-a8b0-470c-b89a-4873b5253f38\" (UID: \"97d1d37d-a8b0-470c-b89a-4873b5253f38\") " Dec 02 19:00:03 crc kubenswrapper[4878]: I1202 19:00:03.719055 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7r6d\" (UniqueName: \"kubernetes.io/projected/97d1d37d-a8b0-470c-b89a-4873b5253f38-kube-api-access-j7r6d\") pod \"97d1d37d-a8b0-470c-b89a-4873b5253f38\" (UID: \"97d1d37d-a8b0-470c-b89a-4873b5253f38\") " Dec 02 19:00:03 crc kubenswrapper[4878]: I1202 19:00:03.719593 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97d1d37d-a8b0-470c-b89a-4873b5253f38-config-volume" (OuterVolumeSpecName: "config-volume") pod "97d1d37d-a8b0-470c-b89a-4873b5253f38" (UID: "97d1d37d-a8b0-470c-b89a-4873b5253f38"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:00:03 crc kubenswrapper[4878]: I1202 19:00:03.720324 4878 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97d1d37d-a8b0-470c-b89a-4873b5253f38-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 19:00:03 crc kubenswrapper[4878]: I1202 19:00:03.726141 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97d1d37d-a8b0-470c-b89a-4873b5253f38-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "97d1d37d-a8b0-470c-b89a-4873b5253f38" (UID: "97d1d37d-a8b0-470c-b89a-4873b5253f38"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:00:03 crc kubenswrapper[4878]: I1202 19:00:03.729382 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97d1d37d-a8b0-470c-b89a-4873b5253f38-kube-api-access-j7r6d" (OuterVolumeSpecName: "kube-api-access-j7r6d") pod "97d1d37d-a8b0-470c-b89a-4873b5253f38" (UID: "97d1d37d-a8b0-470c-b89a-4873b5253f38"). InnerVolumeSpecName "kube-api-access-j7r6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:00:03 crc kubenswrapper[4878]: I1202 19:00:03.822181 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7r6d\" (UniqueName: \"kubernetes.io/projected/97d1d37d-a8b0-470c-b89a-4873b5253f38-kube-api-access-j7r6d\") on node \"crc\" DevicePath \"\"" Dec 02 19:00:03 crc kubenswrapper[4878]: I1202 19:00:03.822214 4878 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97d1d37d-a8b0-470c-b89a-4873b5253f38-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 19:00:04 crc kubenswrapper[4878]: I1202 19:00:04.235993 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd" event={"ID":"97d1d37d-a8b0-470c-b89a-4873b5253f38","Type":"ContainerDied","Data":"761756bbc94cfac3953e87ff50a40425c9c425a945646f538edc62737fd17051"} Dec 02 19:00:04 crc kubenswrapper[4878]: I1202 19:00:04.236400 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="761756bbc94cfac3953e87ff50a40425c9c425a945646f538edc62737fd17051" Dec 02 19:00:04 crc kubenswrapper[4878]: I1202 19:00:04.236060 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd" Dec 02 19:00:04 crc kubenswrapper[4878]: I1202 19:00:04.787996 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf"] Dec 02 19:00:04 crc kubenswrapper[4878]: I1202 19:00:04.798062 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411655-62mmf"] Dec 02 19:00:04 crc kubenswrapper[4878]: I1202 19:00:04.956276 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="298c0078-6d02-4963-a1fb-0f6713e6d369" path="/var/lib/kubelet/pods/298c0078-6d02-4963-a1fb-0f6713e6d369/volumes" Dec 02 19:00:05 crc kubenswrapper[4878]: I1202 19:00:05.250950 4878 generic.go:334] "Generic (PLEG): container finished" podID="0b199858-7108-4b94-b3f9-692a11430c94" containerID="932cced1af30af79035861a46d0910da1970d06a53f549120120dcb15e12843a" exitCode=0 Dec 02 19:00:05 crc kubenswrapper[4878]: I1202 19:00:05.250994 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" event={"ID":"0b199858-7108-4b94-b3f9-692a11430c94","Type":"ContainerDied","Data":"932cced1af30af79035861a46d0910da1970d06a53f549120120dcb15e12843a"} Dec 02 19:00:06 crc kubenswrapper[4878]: I1202 19:00:06.809153 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" Dec 02 19:00:06 crc kubenswrapper[4878]: I1202 19:00:06.992477 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-ssh-key\") pod \"0b199858-7108-4b94-b3f9-692a11430c94\" (UID: \"0b199858-7108-4b94-b3f9-692a11430c94\") " Dec 02 19:00:06 crc kubenswrapper[4878]: I1202 19:00:06.992699 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-libvirt-secret-0\") pod \"0b199858-7108-4b94-b3f9-692a11430c94\" (UID: \"0b199858-7108-4b94-b3f9-692a11430c94\") " Dec 02 19:00:06 crc kubenswrapper[4878]: I1202 19:00:06.992854 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97hz7\" (UniqueName: \"kubernetes.io/projected/0b199858-7108-4b94-b3f9-692a11430c94-kube-api-access-97hz7\") pod \"0b199858-7108-4b94-b3f9-692a11430c94\" (UID: \"0b199858-7108-4b94-b3f9-692a11430c94\") " Dec 02 19:00:06 crc kubenswrapper[4878]: I1202 19:00:06.992912 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-inventory\") pod \"0b199858-7108-4b94-b3f9-692a11430c94\" (UID: \"0b199858-7108-4b94-b3f9-692a11430c94\") " Dec 02 19:00:06 crc kubenswrapper[4878]: I1202 19:00:06.992953 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-libvirt-combined-ca-bundle\") pod \"0b199858-7108-4b94-b3f9-692a11430c94\" (UID: \"0b199858-7108-4b94-b3f9-692a11430c94\") " Dec 02 19:00:06 crc kubenswrapper[4878]: I1202 19:00:06.997901 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0b199858-7108-4b94-b3f9-692a11430c94" (UID: "0b199858-7108-4b94-b3f9-692a11430c94"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.008949 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b199858-7108-4b94-b3f9-692a11430c94-kube-api-access-97hz7" (OuterVolumeSpecName: "kube-api-access-97hz7") pod "0b199858-7108-4b94-b3f9-692a11430c94" (UID: "0b199858-7108-4b94-b3f9-692a11430c94"). InnerVolumeSpecName "kube-api-access-97hz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.035317 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0b199858-7108-4b94-b3f9-692a11430c94" (UID: "0b199858-7108-4b94-b3f9-692a11430c94"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.068538 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "0b199858-7108-4b94-b3f9-692a11430c94" (UID: "0b199858-7108-4b94-b3f9-692a11430c94"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.072071 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-inventory" (OuterVolumeSpecName: "inventory") pod "0b199858-7108-4b94-b3f9-692a11430c94" (UID: "0b199858-7108-4b94-b3f9-692a11430c94"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.096555 4878 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.096591 4878 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.096606 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97hz7\" (UniqueName: \"kubernetes.io/projected/0b199858-7108-4b94-b3f9-692a11430c94-kube-api-access-97hz7\") on node \"crc\" DevicePath \"\"" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.096619 4878 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.096631 4878 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b199858-7108-4b94-b3f9-692a11430c94-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.280137 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" event={"ID":"0b199858-7108-4b94-b3f9-692a11430c94","Type":"ContainerDied","Data":"fb3f7ed320b4fd3ffbdfdff6e63ddaf41be917312869e3ae8bf1ab4d23a8e079"} Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.280521 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb3f7ed320b4fd3ffbdfdff6e63ddaf41be917312869e3ae8bf1ab4d23a8e079" Dec 02 19:00:07 
crc kubenswrapper[4878]: I1202 19:00:07.280210 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.384574 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8"] Dec 02 19:00:07 crc kubenswrapper[4878]: E1202 19:00:07.385033 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b199858-7108-4b94-b3f9-692a11430c94" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.385052 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b199858-7108-4b94-b3f9-692a11430c94" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 02 19:00:07 crc kubenswrapper[4878]: E1202 19:00:07.385099 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97d1d37d-a8b0-470c-b89a-4873b5253f38" containerName="collect-profiles" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.385106 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="97d1d37d-a8b0-470c-b89a-4873b5253f38" containerName="collect-profiles" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.385329 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="97d1d37d-a8b0-470c-b89a-4873b5253f38" containerName="collect-profiles" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.385346 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b199858-7108-4b94-b3f9-692a11430c94" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.386097 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.388325 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.388338 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.388484 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.388713 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.388814 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.389183 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.389739 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szvc8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.428838 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8"] Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.505618 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 
19:00:07.505663 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.506809 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qc44\" (UniqueName: \"kubernetes.io/projected/53bb65b6-2ee5-42dd-8a1d-df8a04008975-kube-api-access-4qc44\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.506924 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.507028 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.507340 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.507415 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.507528 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.507600 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.609123 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: 
\"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.609390 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.609565 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qc44\" (UniqueName: \"kubernetes.io/projected/53bb65b6-2ee5-42dd-8a1d-df8a04008975-kube-api-access-4qc44\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.609672 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.610436 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.610534 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.610607 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.610706 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.610807 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.611661 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.614420 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.614578 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.615173 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.616150 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.618379 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.622321 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.625157 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.630588 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qc44\" (UniqueName: \"kubernetes.io/projected/53bb65b6-2ee5-42dd-8a1d-df8a04008975-kube-api-access-4qc44\") pod \"nova-edpm-deployment-openstack-edpm-ipam-csjn8\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:07 crc kubenswrapper[4878]: I1202 19:00:07.703213 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:00:08 crc kubenswrapper[4878]: I1202 19:00:08.343366 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8"] Dec 02 19:00:08 crc kubenswrapper[4878]: W1202 19:00:08.345580 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53bb65b6_2ee5_42dd_8a1d_df8a04008975.slice/crio-03b1a35b0e8569637a4c9b04887bd1014c01c0ea11c624cf8c70846406d296bd WatchSource:0}: Error finding container 03b1a35b0e8569637a4c9b04887bd1014c01c0ea11c624cf8c70846406d296bd: Status 404 returned error can't find the container with id 03b1a35b0e8569637a4c9b04887bd1014c01c0ea11c624cf8c70846406d296bd Dec 02 19:00:08 crc kubenswrapper[4878]: I1202 19:00:08.350384 4878 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 19:00:09 crc kubenswrapper[4878]: I1202 19:00:09.321256 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" event={"ID":"53bb65b6-2ee5-42dd-8a1d-df8a04008975","Type":"ContainerStarted","Data":"e6795cda75df82ae50fd5cae02f6a3390170d7c3fb7ac87d5694ad2605b36383"} Dec 02 19:00:09 crc kubenswrapper[4878]: I1202 19:00:09.324179 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" event={"ID":"53bb65b6-2ee5-42dd-8a1d-df8a04008975","Type":"ContainerStarted","Data":"03b1a35b0e8569637a4c9b04887bd1014c01c0ea11c624cf8c70846406d296bd"} Dec 02 19:00:09 crc kubenswrapper[4878]: I1202 19:00:09.356188 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" podStartSLOduration=1.989979635 podStartE2EDuration="2.356169384s" podCreationTimestamp="2025-12-02 19:00:07 +0000 UTC" firstStartedPulling="2025-12-02 
19:00:08.349897093 +0000 UTC m=+2718.039516014" lastFinishedPulling="2025-12-02 19:00:08.716086882 +0000 UTC m=+2718.405705763" observedRunningTime="2025-12-02 19:00:09.345925207 +0000 UTC m=+2719.035544138" watchObservedRunningTime="2025-12-02 19:00:09.356169384 +0000 UTC m=+2719.045788265" Dec 02 19:00:23 crc kubenswrapper[4878]: I1202 19:00:23.358843 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nbh2p"] Dec 02 19:00:23 crc kubenswrapper[4878]: I1202 19:00:23.362829 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nbh2p" Dec 02 19:00:23 crc kubenswrapper[4878]: I1202 19:00:23.374768 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nbh2p"] Dec 02 19:00:23 crc kubenswrapper[4878]: I1202 19:00:23.536821 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42ee7757-4fc0-4b52-8367-6993b3405b74-utilities\") pod \"certified-operators-nbh2p\" (UID: \"42ee7757-4fc0-4b52-8367-6993b3405b74\") " pod="openshift-marketplace/certified-operators-nbh2p" Dec 02 19:00:23 crc kubenswrapper[4878]: I1202 19:00:23.536923 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42ee7757-4fc0-4b52-8367-6993b3405b74-catalog-content\") pod \"certified-operators-nbh2p\" (UID: \"42ee7757-4fc0-4b52-8367-6993b3405b74\") " pod="openshift-marketplace/certified-operators-nbh2p" Dec 02 19:00:23 crc kubenswrapper[4878]: I1202 19:00:23.536989 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvfhz\" (UniqueName: \"kubernetes.io/projected/42ee7757-4fc0-4b52-8367-6993b3405b74-kube-api-access-bvfhz\") pod \"certified-operators-nbh2p\" (UID: 
\"42ee7757-4fc0-4b52-8367-6993b3405b74\") " pod="openshift-marketplace/certified-operators-nbh2p" Dec 02 19:00:23 crc kubenswrapper[4878]: I1202 19:00:23.638714 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42ee7757-4fc0-4b52-8367-6993b3405b74-catalog-content\") pod \"certified-operators-nbh2p\" (UID: \"42ee7757-4fc0-4b52-8367-6993b3405b74\") " pod="openshift-marketplace/certified-operators-nbh2p" Dec 02 19:00:23 crc kubenswrapper[4878]: I1202 19:00:23.638798 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvfhz\" (UniqueName: \"kubernetes.io/projected/42ee7757-4fc0-4b52-8367-6993b3405b74-kube-api-access-bvfhz\") pod \"certified-operators-nbh2p\" (UID: \"42ee7757-4fc0-4b52-8367-6993b3405b74\") " pod="openshift-marketplace/certified-operators-nbh2p" Dec 02 19:00:23 crc kubenswrapper[4878]: I1202 19:00:23.638949 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42ee7757-4fc0-4b52-8367-6993b3405b74-utilities\") pod \"certified-operators-nbh2p\" (UID: \"42ee7757-4fc0-4b52-8367-6993b3405b74\") " pod="openshift-marketplace/certified-operators-nbh2p" Dec 02 19:00:23 crc kubenswrapper[4878]: I1202 19:00:23.639442 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42ee7757-4fc0-4b52-8367-6993b3405b74-catalog-content\") pod \"certified-operators-nbh2p\" (UID: \"42ee7757-4fc0-4b52-8367-6993b3405b74\") " pod="openshift-marketplace/certified-operators-nbh2p" Dec 02 19:00:23 crc kubenswrapper[4878]: I1202 19:00:23.639504 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42ee7757-4fc0-4b52-8367-6993b3405b74-utilities\") pod \"certified-operators-nbh2p\" (UID: \"42ee7757-4fc0-4b52-8367-6993b3405b74\") 
" pod="openshift-marketplace/certified-operators-nbh2p" Dec 02 19:00:23 crc kubenswrapper[4878]: I1202 19:00:23.662397 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvfhz\" (UniqueName: \"kubernetes.io/projected/42ee7757-4fc0-4b52-8367-6993b3405b74-kube-api-access-bvfhz\") pod \"certified-operators-nbh2p\" (UID: \"42ee7757-4fc0-4b52-8367-6993b3405b74\") " pod="openshift-marketplace/certified-operators-nbh2p" Dec 02 19:00:23 crc kubenswrapper[4878]: I1202 19:00:23.701608 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nbh2p" Dec 02 19:00:24 crc kubenswrapper[4878]: I1202 19:00:24.303863 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nbh2p"] Dec 02 19:00:24 crc kubenswrapper[4878]: I1202 19:00:24.497451 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nbh2p" event={"ID":"42ee7757-4fc0-4b52-8367-6993b3405b74","Type":"ContainerStarted","Data":"2f613ab6da99ad2c6faf109c43a4cb7a53dcfd501eedf7dc2c7b9714df03c159"} Dec 02 19:00:25 crc kubenswrapper[4878]: I1202 19:00:25.533097 4878 generic.go:334] "Generic (PLEG): container finished" podID="42ee7757-4fc0-4b52-8367-6993b3405b74" containerID="4ab509e5787a1d210c0e518acf586c0ea3ccc759301ec2d917bd489fb162d849" exitCode=0 Dec 02 19:00:25 crc kubenswrapper[4878]: I1202 19:00:25.533162 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nbh2p" event={"ID":"42ee7757-4fc0-4b52-8367-6993b3405b74","Type":"ContainerDied","Data":"4ab509e5787a1d210c0e518acf586c0ea3ccc759301ec2d917bd489fb162d849"} Dec 02 19:00:27 crc kubenswrapper[4878]: I1202 19:00:27.554588 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nbh2p" 
event={"ID":"42ee7757-4fc0-4b52-8367-6993b3405b74","Type":"ContainerStarted","Data":"6eef82cc3e37bdb7d3c19937c3f81e09104c57dea7ff35641c7e448543da080b"} Dec 02 19:00:28 crc kubenswrapper[4878]: I1202 19:00:28.567549 4878 generic.go:334] "Generic (PLEG): container finished" podID="42ee7757-4fc0-4b52-8367-6993b3405b74" containerID="6eef82cc3e37bdb7d3c19937c3f81e09104c57dea7ff35641c7e448543da080b" exitCode=0 Dec 02 19:00:28 crc kubenswrapper[4878]: I1202 19:00:28.567592 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nbh2p" event={"ID":"42ee7757-4fc0-4b52-8367-6993b3405b74","Type":"ContainerDied","Data":"6eef82cc3e37bdb7d3c19937c3f81e09104c57dea7ff35641c7e448543da080b"} Dec 02 19:00:29 crc kubenswrapper[4878]: I1202 19:00:29.582443 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nbh2p" event={"ID":"42ee7757-4fc0-4b52-8367-6993b3405b74","Type":"ContainerStarted","Data":"efdafe7ac233f02ca92f8799104e382af8fd76c9b23e8a33f2c30b42d401c757"} Dec 02 19:00:29 crc kubenswrapper[4878]: I1202 19:00:29.603524 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nbh2p" podStartSLOduration=3.203284442 podStartE2EDuration="6.603506717s" podCreationTimestamp="2025-12-02 19:00:23 +0000 UTC" firstStartedPulling="2025-12-02 19:00:25.535967494 +0000 UTC m=+2735.225586385" lastFinishedPulling="2025-12-02 19:00:28.936189779 +0000 UTC m=+2738.625808660" observedRunningTime="2025-12-02 19:00:29.601412492 +0000 UTC m=+2739.291031373" watchObservedRunningTime="2025-12-02 19:00:29.603506717 +0000 UTC m=+2739.293125598" Dec 02 19:00:33 crc kubenswrapper[4878]: I1202 19:00:33.702483 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nbh2p" Dec 02 19:00:33 crc kubenswrapper[4878]: I1202 19:00:33.703108 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-nbh2p" Dec 02 19:00:33 crc kubenswrapper[4878]: I1202 19:00:33.758752 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nbh2p" Dec 02 19:00:34 crc kubenswrapper[4878]: I1202 19:00:34.696671 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nbh2p" Dec 02 19:00:34 crc kubenswrapper[4878]: I1202 19:00:34.751023 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nbh2p"] Dec 02 19:00:36 crc kubenswrapper[4878]: I1202 19:00:36.666847 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nbh2p" podUID="42ee7757-4fc0-4b52-8367-6993b3405b74" containerName="registry-server" containerID="cri-o://efdafe7ac233f02ca92f8799104e382af8fd76c9b23e8a33f2c30b42d401c757" gracePeriod=2 Dec 02 19:00:37 crc kubenswrapper[4878]: I1202 19:00:37.679785 4878 generic.go:334] "Generic (PLEG): container finished" podID="42ee7757-4fc0-4b52-8367-6993b3405b74" containerID="efdafe7ac233f02ca92f8799104e382af8fd76c9b23e8a33f2c30b42d401c757" exitCode=0 Dec 02 19:00:37 crc kubenswrapper[4878]: I1202 19:00:37.679856 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nbh2p" event={"ID":"42ee7757-4fc0-4b52-8367-6993b3405b74","Type":"ContainerDied","Data":"efdafe7ac233f02ca92f8799104e382af8fd76c9b23e8a33f2c30b42d401c757"} Dec 02 19:00:38 crc kubenswrapper[4878]: I1202 19:00:38.347561 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nbh2p" Dec 02 19:00:38 crc kubenswrapper[4878]: I1202 19:00:38.383812 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42ee7757-4fc0-4b52-8367-6993b3405b74-utilities\") pod \"42ee7757-4fc0-4b52-8367-6993b3405b74\" (UID: \"42ee7757-4fc0-4b52-8367-6993b3405b74\") " Dec 02 19:00:38 crc kubenswrapper[4878]: I1202 19:00:38.383857 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42ee7757-4fc0-4b52-8367-6993b3405b74-catalog-content\") pod \"42ee7757-4fc0-4b52-8367-6993b3405b74\" (UID: \"42ee7757-4fc0-4b52-8367-6993b3405b74\") " Dec 02 19:00:38 crc kubenswrapper[4878]: I1202 19:00:38.383944 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvfhz\" (UniqueName: \"kubernetes.io/projected/42ee7757-4fc0-4b52-8367-6993b3405b74-kube-api-access-bvfhz\") pod \"42ee7757-4fc0-4b52-8367-6993b3405b74\" (UID: \"42ee7757-4fc0-4b52-8367-6993b3405b74\") " Dec 02 19:00:38 crc kubenswrapper[4878]: I1202 19:00:38.385688 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42ee7757-4fc0-4b52-8367-6993b3405b74-utilities" (OuterVolumeSpecName: "utilities") pod "42ee7757-4fc0-4b52-8367-6993b3405b74" (UID: "42ee7757-4fc0-4b52-8367-6993b3405b74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:00:38 crc kubenswrapper[4878]: I1202 19:00:38.390036 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42ee7757-4fc0-4b52-8367-6993b3405b74-kube-api-access-bvfhz" (OuterVolumeSpecName: "kube-api-access-bvfhz") pod "42ee7757-4fc0-4b52-8367-6993b3405b74" (UID: "42ee7757-4fc0-4b52-8367-6993b3405b74"). InnerVolumeSpecName "kube-api-access-bvfhz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:00:38 crc kubenswrapper[4878]: I1202 19:00:38.444738 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42ee7757-4fc0-4b52-8367-6993b3405b74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42ee7757-4fc0-4b52-8367-6993b3405b74" (UID: "42ee7757-4fc0-4b52-8367-6993b3405b74"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:00:38 crc kubenswrapper[4878]: I1202 19:00:38.487437 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42ee7757-4fc0-4b52-8367-6993b3405b74-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:00:38 crc kubenswrapper[4878]: I1202 19:00:38.487488 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42ee7757-4fc0-4b52-8367-6993b3405b74-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:00:38 crc kubenswrapper[4878]: I1202 19:00:38.487503 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvfhz\" (UniqueName: \"kubernetes.io/projected/42ee7757-4fc0-4b52-8367-6993b3405b74-kube-api-access-bvfhz\") on node \"crc\" DevicePath \"\"" Dec 02 19:00:38 crc kubenswrapper[4878]: I1202 19:00:38.695820 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nbh2p" event={"ID":"42ee7757-4fc0-4b52-8367-6993b3405b74","Type":"ContainerDied","Data":"2f613ab6da99ad2c6faf109c43a4cb7a53dcfd501eedf7dc2c7b9714df03c159"} Dec 02 19:00:38 crc kubenswrapper[4878]: I1202 19:00:38.695878 4878 scope.go:117] "RemoveContainer" containerID="efdafe7ac233f02ca92f8799104e382af8fd76c9b23e8a33f2c30b42d401c757" Dec 02 19:00:38 crc kubenswrapper[4878]: I1202 19:00:38.695893 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nbh2p" Dec 02 19:00:38 crc kubenswrapper[4878]: I1202 19:00:38.731556 4878 scope.go:117] "RemoveContainer" containerID="6eef82cc3e37bdb7d3c19937c3f81e09104c57dea7ff35641c7e448543da080b" Dec 02 19:00:38 crc kubenswrapper[4878]: I1202 19:00:38.741533 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nbh2p"] Dec 02 19:00:38 crc kubenswrapper[4878]: I1202 19:00:38.756855 4878 scope.go:117] "RemoveContainer" containerID="4ab509e5787a1d210c0e518acf586c0ea3ccc759301ec2d917bd489fb162d849" Dec 02 19:00:38 crc kubenswrapper[4878]: I1202 19:00:38.757551 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nbh2p"] Dec 02 19:00:38 crc kubenswrapper[4878]: I1202 19:00:38.951455 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42ee7757-4fc0-4b52-8367-6993b3405b74" path="/var/lib/kubelet/pods/42ee7757-4fc0-4b52-8367-6993b3405b74/volumes" Dec 02 19:00:40 crc kubenswrapper[4878]: I1202 19:00:40.610554 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8hqzj"] Dec 02 19:00:40 crc kubenswrapper[4878]: E1202 19:00:40.611995 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ee7757-4fc0-4b52-8367-6993b3405b74" containerName="registry-server" Dec 02 19:00:40 crc kubenswrapper[4878]: I1202 19:00:40.612029 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ee7757-4fc0-4b52-8367-6993b3405b74" containerName="registry-server" Dec 02 19:00:40 crc kubenswrapper[4878]: E1202 19:00:40.612060 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ee7757-4fc0-4b52-8367-6993b3405b74" containerName="extract-content" Dec 02 19:00:40 crc kubenswrapper[4878]: I1202 19:00:40.612077 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ee7757-4fc0-4b52-8367-6993b3405b74" containerName="extract-content" Dec 
02 19:00:40 crc kubenswrapper[4878]: E1202 19:00:40.612117 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ee7757-4fc0-4b52-8367-6993b3405b74" containerName="extract-utilities" Dec 02 19:00:40 crc kubenswrapper[4878]: I1202 19:00:40.612135 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ee7757-4fc0-4b52-8367-6993b3405b74" containerName="extract-utilities" Dec 02 19:00:40 crc kubenswrapper[4878]: I1202 19:00:40.612859 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="42ee7757-4fc0-4b52-8367-6993b3405b74" containerName="registry-server" Dec 02 19:00:40 crc kubenswrapper[4878]: I1202 19:00:40.616821 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8hqzj" Dec 02 19:00:40 crc kubenswrapper[4878]: I1202 19:00:40.661324 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hqzj"] Dec 02 19:00:40 crc kubenswrapper[4878]: I1202 19:00:40.752173 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec7e9556-e873-4fee-a7cd-1bc9551573e4-utilities\") pod \"redhat-marketplace-8hqzj\" (UID: \"ec7e9556-e873-4fee-a7cd-1bc9551573e4\") " pod="openshift-marketplace/redhat-marketplace-8hqzj" Dec 02 19:00:40 crc kubenswrapper[4878]: I1202 19:00:40.752552 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blrp7\" (UniqueName: \"kubernetes.io/projected/ec7e9556-e873-4fee-a7cd-1bc9551573e4-kube-api-access-blrp7\") pod \"redhat-marketplace-8hqzj\" (UID: \"ec7e9556-e873-4fee-a7cd-1bc9551573e4\") " pod="openshift-marketplace/redhat-marketplace-8hqzj" Dec 02 19:00:40 crc kubenswrapper[4878]: I1202 19:00:40.752738 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ec7e9556-e873-4fee-a7cd-1bc9551573e4-catalog-content\") pod \"redhat-marketplace-8hqzj\" (UID: \"ec7e9556-e873-4fee-a7cd-1bc9551573e4\") " pod="openshift-marketplace/redhat-marketplace-8hqzj" Dec 02 19:00:40 crc kubenswrapper[4878]: I1202 19:00:40.854503 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blrp7\" (UniqueName: \"kubernetes.io/projected/ec7e9556-e873-4fee-a7cd-1bc9551573e4-kube-api-access-blrp7\") pod \"redhat-marketplace-8hqzj\" (UID: \"ec7e9556-e873-4fee-a7cd-1bc9551573e4\") " pod="openshift-marketplace/redhat-marketplace-8hqzj" Dec 02 19:00:40 crc kubenswrapper[4878]: I1202 19:00:40.854604 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec7e9556-e873-4fee-a7cd-1bc9551573e4-catalog-content\") pod \"redhat-marketplace-8hqzj\" (UID: \"ec7e9556-e873-4fee-a7cd-1bc9551573e4\") " pod="openshift-marketplace/redhat-marketplace-8hqzj" Dec 02 19:00:40 crc kubenswrapper[4878]: I1202 19:00:40.854700 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec7e9556-e873-4fee-a7cd-1bc9551573e4-utilities\") pod \"redhat-marketplace-8hqzj\" (UID: \"ec7e9556-e873-4fee-a7cd-1bc9551573e4\") " pod="openshift-marketplace/redhat-marketplace-8hqzj" Dec 02 19:00:40 crc kubenswrapper[4878]: I1202 19:00:40.855165 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec7e9556-e873-4fee-a7cd-1bc9551573e4-catalog-content\") pod \"redhat-marketplace-8hqzj\" (UID: \"ec7e9556-e873-4fee-a7cd-1bc9551573e4\") " pod="openshift-marketplace/redhat-marketplace-8hqzj" Dec 02 19:00:40 crc kubenswrapper[4878]: I1202 19:00:40.855217 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ec7e9556-e873-4fee-a7cd-1bc9551573e4-utilities\") pod \"redhat-marketplace-8hqzj\" (UID: \"ec7e9556-e873-4fee-a7cd-1bc9551573e4\") " pod="openshift-marketplace/redhat-marketplace-8hqzj" Dec 02 19:00:40 crc kubenswrapper[4878]: I1202 19:00:40.878129 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blrp7\" (UniqueName: \"kubernetes.io/projected/ec7e9556-e873-4fee-a7cd-1bc9551573e4-kube-api-access-blrp7\") pod \"redhat-marketplace-8hqzj\" (UID: \"ec7e9556-e873-4fee-a7cd-1bc9551573e4\") " pod="openshift-marketplace/redhat-marketplace-8hqzj" Dec 02 19:00:40 crc kubenswrapper[4878]: I1202 19:00:40.942545 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8hqzj" Dec 02 19:00:41 crc kubenswrapper[4878]: W1202 19:00:41.474542 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec7e9556_e873_4fee_a7cd_1bc9551573e4.slice/crio-dc65797fa8dd2781c83cc8cac22d87a1c91fbe51d8104645f3f3b7870a1496b4 WatchSource:0}: Error finding container dc65797fa8dd2781c83cc8cac22d87a1c91fbe51d8104645f3f3b7870a1496b4: Status 404 returned error can't find the container with id dc65797fa8dd2781c83cc8cac22d87a1c91fbe51d8104645f3f3b7870a1496b4 Dec 02 19:00:41 crc kubenswrapper[4878]: I1202 19:00:41.480821 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hqzj"] Dec 02 19:00:41 crc kubenswrapper[4878]: I1202 19:00:41.726365 4878 generic.go:334] "Generic (PLEG): container finished" podID="ec7e9556-e873-4fee-a7cd-1bc9551573e4" containerID="4c634bf41edababe4942659bf9e54cfcb1bde443c912798efdaa11e7a1a4032e" exitCode=0 Dec 02 19:00:41 crc kubenswrapper[4878]: I1202 19:00:41.727258 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hqzj" 
event={"ID":"ec7e9556-e873-4fee-a7cd-1bc9551573e4","Type":"ContainerDied","Data":"4c634bf41edababe4942659bf9e54cfcb1bde443c912798efdaa11e7a1a4032e"} Dec 02 19:00:41 crc kubenswrapper[4878]: I1202 19:00:41.727336 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hqzj" event={"ID":"ec7e9556-e873-4fee-a7cd-1bc9551573e4","Type":"ContainerStarted","Data":"dc65797fa8dd2781c83cc8cac22d87a1c91fbe51d8104645f3f3b7870a1496b4"} Dec 02 19:00:42 crc kubenswrapper[4878]: I1202 19:00:42.743107 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hqzj" event={"ID":"ec7e9556-e873-4fee-a7cd-1bc9551573e4","Type":"ContainerStarted","Data":"8353311c59e3704592213cfbf3e4d43bf83bfd746f726301560d88356ea1522c"} Dec 02 19:00:43 crc kubenswrapper[4878]: I1202 19:00:43.758334 4878 generic.go:334] "Generic (PLEG): container finished" podID="ec7e9556-e873-4fee-a7cd-1bc9551573e4" containerID="8353311c59e3704592213cfbf3e4d43bf83bfd746f726301560d88356ea1522c" exitCode=0 Dec 02 19:00:43 crc kubenswrapper[4878]: I1202 19:00:43.758399 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hqzj" event={"ID":"ec7e9556-e873-4fee-a7cd-1bc9551573e4","Type":"ContainerDied","Data":"8353311c59e3704592213cfbf3e4d43bf83bfd746f726301560d88356ea1522c"} Dec 02 19:00:44 crc kubenswrapper[4878]: I1202 19:00:44.781680 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hqzj" event={"ID":"ec7e9556-e873-4fee-a7cd-1bc9551573e4","Type":"ContainerStarted","Data":"4ee778d1b89f2627037662376dac7d71e40f8c4c8b2030358e96bc825420a9c3"} Dec 02 19:00:44 crc kubenswrapper[4878]: I1202 19:00:44.808447 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8hqzj" podStartSLOduration=2.253737896 podStartE2EDuration="4.808427829s" podCreationTimestamp="2025-12-02 19:00:40 +0000 
UTC" firstStartedPulling="2025-12-02 19:00:41.728992054 +0000 UTC m=+2751.418610925" lastFinishedPulling="2025-12-02 19:00:44.283681947 +0000 UTC m=+2753.973300858" observedRunningTime="2025-12-02 19:00:44.808411449 +0000 UTC m=+2754.498030330" watchObservedRunningTime="2025-12-02 19:00:44.808427829 +0000 UTC m=+2754.498046710" Dec 02 19:00:50 crc kubenswrapper[4878]: I1202 19:00:50.961312 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8hqzj" Dec 02 19:00:50 crc kubenswrapper[4878]: I1202 19:00:50.963012 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8hqzj" Dec 02 19:00:51 crc kubenswrapper[4878]: I1202 19:00:51.002369 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8hqzj" Dec 02 19:00:51 crc kubenswrapper[4878]: I1202 19:00:51.922970 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8hqzj" Dec 02 19:00:52 crc kubenswrapper[4878]: I1202 19:00:52.029306 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hqzj"] Dec 02 19:00:53 crc kubenswrapper[4878]: I1202 19:00:53.876595 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8hqzj" podUID="ec7e9556-e873-4fee-a7cd-1bc9551573e4" containerName="registry-server" containerID="cri-o://4ee778d1b89f2627037662376dac7d71e40f8c4c8b2030358e96bc825420a9c3" gracePeriod=2 Dec 02 19:00:54 crc kubenswrapper[4878]: I1202 19:00:54.395742 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8hqzj" Dec 02 19:00:54 crc kubenswrapper[4878]: I1202 19:00:54.507341 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blrp7\" (UniqueName: \"kubernetes.io/projected/ec7e9556-e873-4fee-a7cd-1bc9551573e4-kube-api-access-blrp7\") pod \"ec7e9556-e873-4fee-a7cd-1bc9551573e4\" (UID: \"ec7e9556-e873-4fee-a7cd-1bc9551573e4\") " Dec 02 19:00:54 crc kubenswrapper[4878]: I1202 19:00:54.507480 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec7e9556-e873-4fee-a7cd-1bc9551573e4-catalog-content\") pod \"ec7e9556-e873-4fee-a7cd-1bc9551573e4\" (UID: \"ec7e9556-e873-4fee-a7cd-1bc9551573e4\") " Dec 02 19:00:54 crc kubenswrapper[4878]: I1202 19:00:54.507775 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec7e9556-e873-4fee-a7cd-1bc9551573e4-utilities\") pod \"ec7e9556-e873-4fee-a7cd-1bc9551573e4\" (UID: \"ec7e9556-e873-4fee-a7cd-1bc9551573e4\") " Dec 02 19:00:54 crc kubenswrapper[4878]: I1202 19:00:54.508377 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec7e9556-e873-4fee-a7cd-1bc9551573e4-utilities" (OuterVolumeSpecName: "utilities") pod "ec7e9556-e873-4fee-a7cd-1bc9551573e4" (UID: "ec7e9556-e873-4fee-a7cd-1bc9551573e4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:00:54 crc kubenswrapper[4878]: I1202 19:00:54.508537 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec7e9556-e873-4fee-a7cd-1bc9551573e4-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:00:54 crc kubenswrapper[4878]: I1202 19:00:54.514341 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec7e9556-e873-4fee-a7cd-1bc9551573e4-kube-api-access-blrp7" (OuterVolumeSpecName: "kube-api-access-blrp7") pod "ec7e9556-e873-4fee-a7cd-1bc9551573e4" (UID: "ec7e9556-e873-4fee-a7cd-1bc9551573e4"). InnerVolumeSpecName "kube-api-access-blrp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:00:54 crc kubenswrapper[4878]: I1202 19:00:54.526888 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec7e9556-e873-4fee-a7cd-1bc9551573e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec7e9556-e873-4fee-a7cd-1bc9551573e4" (UID: "ec7e9556-e873-4fee-a7cd-1bc9551573e4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:00:54 crc kubenswrapper[4878]: I1202 19:00:54.609806 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blrp7\" (UniqueName: \"kubernetes.io/projected/ec7e9556-e873-4fee-a7cd-1bc9551573e4-kube-api-access-blrp7\") on node \"crc\" DevicePath \"\"" Dec 02 19:00:54 crc kubenswrapper[4878]: I1202 19:00:54.609843 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec7e9556-e873-4fee-a7cd-1bc9551573e4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:00:54 crc kubenswrapper[4878]: I1202 19:00:54.888342 4878 generic.go:334] "Generic (PLEG): container finished" podID="ec7e9556-e873-4fee-a7cd-1bc9551573e4" containerID="4ee778d1b89f2627037662376dac7d71e40f8c4c8b2030358e96bc825420a9c3" exitCode=0 Dec 02 19:00:54 crc kubenswrapper[4878]: I1202 19:00:54.888378 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8hqzj" Dec 02 19:00:54 crc kubenswrapper[4878]: I1202 19:00:54.888397 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hqzj" event={"ID":"ec7e9556-e873-4fee-a7cd-1bc9551573e4","Type":"ContainerDied","Data":"4ee778d1b89f2627037662376dac7d71e40f8c4c8b2030358e96bc825420a9c3"} Dec 02 19:00:54 crc kubenswrapper[4878]: I1202 19:00:54.888432 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hqzj" event={"ID":"ec7e9556-e873-4fee-a7cd-1bc9551573e4","Type":"ContainerDied","Data":"dc65797fa8dd2781c83cc8cac22d87a1c91fbe51d8104645f3f3b7870a1496b4"} Dec 02 19:00:54 crc kubenswrapper[4878]: I1202 19:00:54.888452 4878 scope.go:117] "RemoveContainer" containerID="4ee778d1b89f2627037662376dac7d71e40f8c4c8b2030358e96bc825420a9c3" Dec 02 19:00:54 crc kubenswrapper[4878]: I1202 19:00:54.924192 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-8hqzj"] Dec 02 19:00:54 crc kubenswrapper[4878]: I1202 19:00:54.930305 4878 scope.go:117] "RemoveContainer" containerID="8353311c59e3704592213cfbf3e4d43bf83bfd746f726301560d88356ea1522c" Dec 02 19:00:54 crc kubenswrapper[4878]: I1202 19:00:54.954801 4878 scope.go:117] "RemoveContainer" containerID="4c634bf41edababe4942659bf9e54cfcb1bde443c912798efdaa11e7a1a4032e" Dec 02 19:00:54 crc kubenswrapper[4878]: I1202 19:00:54.963643 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hqzj"] Dec 02 19:00:55 crc kubenswrapper[4878]: I1202 19:00:55.005134 4878 scope.go:117] "RemoveContainer" containerID="4ee778d1b89f2627037662376dac7d71e40f8c4c8b2030358e96bc825420a9c3" Dec 02 19:00:55 crc kubenswrapper[4878]: E1202 19:00:55.005939 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ee778d1b89f2627037662376dac7d71e40f8c4c8b2030358e96bc825420a9c3\": container with ID starting with 4ee778d1b89f2627037662376dac7d71e40f8c4c8b2030358e96bc825420a9c3 not found: ID does not exist" containerID="4ee778d1b89f2627037662376dac7d71e40f8c4c8b2030358e96bc825420a9c3" Dec 02 19:00:55 crc kubenswrapper[4878]: I1202 19:00:55.005977 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ee778d1b89f2627037662376dac7d71e40f8c4c8b2030358e96bc825420a9c3"} err="failed to get container status \"4ee778d1b89f2627037662376dac7d71e40f8c4c8b2030358e96bc825420a9c3\": rpc error: code = NotFound desc = could not find container \"4ee778d1b89f2627037662376dac7d71e40f8c4c8b2030358e96bc825420a9c3\": container with ID starting with 4ee778d1b89f2627037662376dac7d71e40f8c4c8b2030358e96bc825420a9c3 not found: ID does not exist" Dec 02 19:00:55 crc kubenswrapper[4878]: I1202 19:00:55.006008 4878 scope.go:117] "RemoveContainer" 
containerID="8353311c59e3704592213cfbf3e4d43bf83bfd746f726301560d88356ea1522c" Dec 02 19:00:55 crc kubenswrapper[4878]: E1202 19:00:55.006287 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8353311c59e3704592213cfbf3e4d43bf83bfd746f726301560d88356ea1522c\": container with ID starting with 8353311c59e3704592213cfbf3e4d43bf83bfd746f726301560d88356ea1522c not found: ID does not exist" containerID="8353311c59e3704592213cfbf3e4d43bf83bfd746f726301560d88356ea1522c" Dec 02 19:00:55 crc kubenswrapper[4878]: I1202 19:00:55.006320 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8353311c59e3704592213cfbf3e4d43bf83bfd746f726301560d88356ea1522c"} err="failed to get container status \"8353311c59e3704592213cfbf3e4d43bf83bfd746f726301560d88356ea1522c\": rpc error: code = NotFound desc = could not find container \"8353311c59e3704592213cfbf3e4d43bf83bfd746f726301560d88356ea1522c\": container with ID starting with 8353311c59e3704592213cfbf3e4d43bf83bfd746f726301560d88356ea1522c not found: ID does not exist" Dec 02 19:00:55 crc kubenswrapper[4878]: I1202 19:00:55.006345 4878 scope.go:117] "RemoveContainer" containerID="4c634bf41edababe4942659bf9e54cfcb1bde443c912798efdaa11e7a1a4032e" Dec 02 19:00:55 crc kubenswrapper[4878]: E1202 19:00:55.006638 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c634bf41edababe4942659bf9e54cfcb1bde443c912798efdaa11e7a1a4032e\": container with ID starting with 4c634bf41edababe4942659bf9e54cfcb1bde443c912798efdaa11e7a1a4032e not found: ID does not exist" containerID="4c634bf41edababe4942659bf9e54cfcb1bde443c912798efdaa11e7a1a4032e" Dec 02 19:00:55 crc kubenswrapper[4878]: I1202 19:00:55.006681 4878 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4c634bf41edababe4942659bf9e54cfcb1bde443c912798efdaa11e7a1a4032e"} err="failed to get container status \"4c634bf41edababe4942659bf9e54cfcb1bde443c912798efdaa11e7a1a4032e\": rpc error: code = NotFound desc = could not find container \"4c634bf41edababe4942659bf9e54cfcb1bde443c912798efdaa11e7a1a4032e\": container with ID starting with 4c634bf41edababe4942659bf9e54cfcb1bde443c912798efdaa11e7a1a4032e not found: ID does not exist" Dec 02 19:00:56 crc kubenswrapper[4878]: I1202 19:00:56.953717 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec7e9556-e873-4fee-a7cd-1bc9551573e4" path="/var/lib/kubelet/pods/ec7e9556-e873-4fee-a7cd-1bc9551573e4/volumes" Dec 02 19:01:00 crc kubenswrapper[4878]: I1202 19:01:00.161860 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29411701-sf57f"] Dec 02 19:01:00 crc kubenswrapper[4878]: E1202 19:01:00.162690 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec7e9556-e873-4fee-a7cd-1bc9551573e4" containerName="extract-content" Dec 02 19:01:00 crc kubenswrapper[4878]: I1202 19:01:00.162702 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec7e9556-e873-4fee-a7cd-1bc9551573e4" containerName="extract-content" Dec 02 19:01:00 crc kubenswrapper[4878]: E1202 19:01:00.162718 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec7e9556-e873-4fee-a7cd-1bc9551573e4" containerName="extract-utilities" Dec 02 19:01:00 crc kubenswrapper[4878]: I1202 19:01:00.162725 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec7e9556-e873-4fee-a7cd-1bc9551573e4" containerName="extract-utilities" Dec 02 19:01:00 crc kubenswrapper[4878]: E1202 19:01:00.162779 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec7e9556-e873-4fee-a7cd-1bc9551573e4" containerName="registry-server" Dec 02 19:01:00 crc kubenswrapper[4878]: I1202 19:01:00.162785 4878 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ec7e9556-e873-4fee-a7cd-1bc9551573e4" containerName="registry-server" Dec 02 19:01:00 crc kubenswrapper[4878]: I1202 19:01:00.162987 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec7e9556-e873-4fee-a7cd-1bc9551573e4" containerName="registry-server" Dec 02 19:01:00 crc kubenswrapper[4878]: I1202 19:01:00.163790 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29411701-sf57f" Dec 02 19:01:00 crc kubenswrapper[4878]: I1202 19:01:00.348794 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29411701-sf57f"] Dec 02 19:01:00 crc kubenswrapper[4878]: I1202 19:01:00.356725 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08153e97-46ad-4405-b0ea-7f4606a82c6f-config-data\") pod \"keystone-cron-29411701-sf57f\" (UID: \"08153e97-46ad-4405-b0ea-7f4606a82c6f\") " pod="openstack/keystone-cron-29411701-sf57f" Dec 02 19:01:00 crc kubenswrapper[4878]: I1202 19:01:00.356969 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08153e97-46ad-4405-b0ea-7f4606a82c6f-combined-ca-bundle\") pod \"keystone-cron-29411701-sf57f\" (UID: \"08153e97-46ad-4405-b0ea-7f4606a82c6f\") " pod="openstack/keystone-cron-29411701-sf57f" Dec 02 19:01:00 crc kubenswrapper[4878]: I1202 19:01:00.357026 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08153e97-46ad-4405-b0ea-7f4606a82c6f-fernet-keys\") pod \"keystone-cron-29411701-sf57f\" (UID: \"08153e97-46ad-4405-b0ea-7f4606a82c6f\") " pod="openstack/keystone-cron-29411701-sf57f" Dec 02 19:01:00 crc kubenswrapper[4878]: I1202 19:01:00.357103 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-psms6\" (UniqueName: \"kubernetes.io/projected/08153e97-46ad-4405-b0ea-7f4606a82c6f-kube-api-access-psms6\") pod \"keystone-cron-29411701-sf57f\" (UID: \"08153e97-46ad-4405-b0ea-7f4606a82c6f\") " pod="openstack/keystone-cron-29411701-sf57f" Dec 02 19:01:00 crc kubenswrapper[4878]: I1202 19:01:00.459579 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08153e97-46ad-4405-b0ea-7f4606a82c6f-config-data\") pod \"keystone-cron-29411701-sf57f\" (UID: \"08153e97-46ad-4405-b0ea-7f4606a82c6f\") " pod="openstack/keystone-cron-29411701-sf57f" Dec 02 19:01:00 crc kubenswrapper[4878]: I1202 19:01:00.459923 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08153e97-46ad-4405-b0ea-7f4606a82c6f-combined-ca-bundle\") pod \"keystone-cron-29411701-sf57f\" (UID: \"08153e97-46ad-4405-b0ea-7f4606a82c6f\") " pod="openstack/keystone-cron-29411701-sf57f" Dec 02 19:01:00 crc kubenswrapper[4878]: I1202 19:01:00.459992 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08153e97-46ad-4405-b0ea-7f4606a82c6f-fernet-keys\") pod \"keystone-cron-29411701-sf57f\" (UID: \"08153e97-46ad-4405-b0ea-7f4606a82c6f\") " pod="openstack/keystone-cron-29411701-sf57f" Dec 02 19:01:00 crc kubenswrapper[4878]: I1202 19:01:00.460106 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psms6\" (UniqueName: \"kubernetes.io/projected/08153e97-46ad-4405-b0ea-7f4606a82c6f-kube-api-access-psms6\") pod \"keystone-cron-29411701-sf57f\" (UID: \"08153e97-46ad-4405-b0ea-7f4606a82c6f\") " pod="openstack/keystone-cron-29411701-sf57f" Dec 02 19:01:00 crc kubenswrapper[4878]: I1202 19:01:00.468002 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/08153e97-46ad-4405-b0ea-7f4606a82c6f-combined-ca-bundle\") pod \"keystone-cron-29411701-sf57f\" (UID: \"08153e97-46ad-4405-b0ea-7f4606a82c6f\") " pod="openstack/keystone-cron-29411701-sf57f" Dec 02 19:01:00 crc kubenswrapper[4878]: I1202 19:01:00.468083 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08153e97-46ad-4405-b0ea-7f4606a82c6f-config-data\") pod \"keystone-cron-29411701-sf57f\" (UID: \"08153e97-46ad-4405-b0ea-7f4606a82c6f\") " pod="openstack/keystone-cron-29411701-sf57f" Dec 02 19:01:00 crc kubenswrapper[4878]: I1202 19:01:00.468812 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08153e97-46ad-4405-b0ea-7f4606a82c6f-fernet-keys\") pod \"keystone-cron-29411701-sf57f\" (UID: \"08153e97-46ad-4405-b0ea-7f4606a82c6f\") " pod="openstack/keystone-cron-29411701-sf57f" Dec 02 19:01:00 crc kubenswrapper[4878]: I1202 19:01:00.478694 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psms6\" (UniqueName: \"kubernetes.io/projected/08153e97-46ad-4405-b0ea-7f4606a82c6f-kube-api-access-psms6\") pod \"keystone-cron-29411701-sf57f\" (UID: \"08153e97-46ad-4405-b0ea-7f4606a82c6f\") " pod="openstack/keystone-cron-29411701-sf57f" Dec 02 19:01:00 crc kubenswrapper[4878]: I1202 19:01:00.482353 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29411701-sf57f" Dec 02 19:01:01 crc kubenswrapper[4878]: I1202 19:01:01.018086 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29411701-sf57f"] Dec 02 19:01:01 crc kubenswrapper[4878]: I1202 19:01:01.994049 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411701-sf57f" event={"ID":"08153e97-46ad-4405-b0ea-7f4606a82c6f","Type":"ContainerStarted","Data":"43b6714638347b069234a95fe22c60f01eae4b8b9587f6ac0e8992bb482c1e6b"} Dec 02 19:01:01 crc kubenswrapper[4878]: I1202 19:01:01.994613 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411701-sf57f" event={"ID":"08153e97-46ad-4405-b0ea-7f4606a82c6f","Type":"ContainerStarted","Data":"caac45a3998cb1b4c444a64512a578a0e0cfb5f6816ec22281c51b175276f528"} Dec 02 19:01:02 crc kubenswrapper[4878]: I1202 19:01:02.010114 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29411701-sf57f" podStartSLOduration=2.010089203 podStartE2EDuration="2.010089203s" podCreationTimestamp="2025-12-02 19:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 19:01:02.00968452 +0000 UTC m=+2771.699303411" watchObservedRunningTime="2025-12-02 19:01:02.010089203 +0000 UTC m=+2771.699708114" Dec 02 19:01:04 crc kubenswrapper[4878]: I1202 19:01:04.338728 4878 scope.go:117] "RemoveContainer" containerID="3225532b534ec187daab9b2536a58912dece1a41bb317aa0b7d22fc707ce2393" Dec 02 19:01:05 crc kubenswrapper[4878]: I1202 19:01:05.036046 4878 generic.go:334] "Generic (PLEG): container finished" podID="08153e97-46ad-4405-b0ea-7f4606a82c6f" containerID="43b6714638347b069234a95fe22c60f01eae4b8b9587f6ac0e8992bb482c1e6b" exitCode=0 Dec 02 19:01:05 crc kubenswrapper[4878]: I1202 19:01:05.036152 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-cron-29411701-sf57f" event={"ID":"08153e97-46ad-4405-b0ea-7f4606a82c6f","Type":"ContainerDied","Data":"43b6714638347b069234a95fe22c60f01eae4b8b9587f6ac0e8992bb482c1e6b"} Dec 02 19:01:06 crc kubenswrapper[4878]: I1202 19:01:06.505764 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29411701-sf57f" Dec 02 19:01:06 crc kubenswrapper[4878]: I1202 19:01:06.628728 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08153e97-46ad-4405-b0ea-7f4606a82c6f-fernet-keys\") pod \"08153e97-46ad-4405-b0ea-7f4606a82c6f\" (UID: \"08153e97-46ad-4405-b0ea-7f4606a82c6f\") " Dec 02 19:01:06 crc kubenswrapper[4878]: I1202 19:01:06.629125 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08153e97-46ad-4405-b0ea-7f4606a82c6f-combined-ca-bundle\") pod \"08153e97-46ad-4405-b0ea-7f4606a82c6f\" (UID: \"08153e97-46ad-4405-b0ea-7f4606a82c6f\") " Dec 02 19:01:06 crc kubenswrapper[4878]: I1202 19:01:06.629158 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psms6\" (UniqueName: \"kubernetes.io/projected/08153e97-46ad-4405-b0ea-7f4606a82c6f-kube-api-access-psms6\") pod \"08153e97-46ad-4405-b0ea-7f4606a82c6f\" (UID: \"08153e97-46ad-4405-b0ea-7f4606a82c6f\") " Dec 02 19:01:06 crc kubenswrapper[4878]: I1202 19:01:06.629220 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08153e97-46ad-4405-b0ea-7f4606a82c6f-config-data\") pod \"08153e97-46ad-4405-b0ea-7f4606a82c6f\" (UID: \"08153e97-46ad-4405-b0ea-7f4606a82c6f\") " Dec 02 19:01:06 crc kubenswrapper[4878]: I1202 19:01:06.635294 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/08153e97-46ad-4405-b0ea-7f4606a82c6f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "08153e97-46ad-4405-b0ea-7f4606a82c6f" (UID: "08153e97-46ad-4405-b0ea-7f4606a82c6f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:01:06 crc kubenswrapper[4878]: I1202 19:01:06.636049 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08153e97-46ad-4405-b0ea-7f4606a82c6f-kube-api-access-psms6" (OuterVolumeSpecName: "kube-api-access-psms6") pod "08153e97-46ad-4405-b0ea-7f4606a82c6f" (UID: "08153e97-46ad-4405-b0ea-7f4606a82c6f"). InnerVolumeSpecName "kube-api-access-psms6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:01:06 crc kubenswrapper[4878]: I1202 19:01:06.682513 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08153e97-46ad-4405-b0ea-7f4606a82c6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08153e97-46ad-4405-b0ea-7f4606a82c6f" (UID: "08153e97-46ad-4405-b0ea-7f4606a82c6f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:01:06 crc kubenswrapper[4878]: I1202 19:01:06.725432 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08153e97-46ad-4405-b0ea-7f4606a82c6f-config-data" (OuterVolumeSpecName: "config-data") pod "08153e97-46ad-4405-b0ea-7f4606a82c6f" (UID: "08153e97-46ad-4405-b0ea-7f4606a82c6f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:01:06 crc kubenswrapper[4878]: I1202 19:01:06.732649 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08153e97-46ad-4405-b0ea-7f4606a82c6f-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 19:01:06 crc kubenswrapper[4878]: I1202 19:01:06.732687 4878 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08153e97-46ad-4405-b0ea-7f4606a82c6f-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 19:01:06 crc kubenswrapper[4878]: I1202 19:01:06.732701 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08153e97-46ad-4405-b0ea-7f4606a82c6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 19:01:06 crc kubenswrapper[4878]: I1202 19:01:06.732716 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psms6\" (UniqueName: \"kubernetes.io/projected/08153e97-46ad-4405-b0ea-7f4606a82c6f-kube-api-access-psms6\") on node \"crc\" DevicePath \"\"" Dec 02 19:01:07 crc kubenswrapper[4878]: I1202 19:01:07.060224 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411701-sf57f" event={"ID":"08153e97-46ad-4405-b0ea-7f4606a82c6f","Type":"ContainerDied","Data":"caac45a3998cb1b4c444a64512a578a0e0cfb5f6816ec22281c51b175276f528"} Dec 02 19:01:07 crc kubenswrapper[4878]: I1202 19:01:07.060310 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caac45a3998cb1b4c444a64512a578a0e0cfb5f6816ec22281c51b175276f528" Dec 02 19:01:07 crc kubenswrapper[4878]: I1202 19:01:07.060368 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29411701-sf57f" Dec 02 19:02:23 crc kubenswrapper[4878]: I1202 19:02:23.753527 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:02:23 crc kubenswrapper[4878]: I1202 19:02:23.754225 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:02:53 crc kubenswrapper[4878]: I1202 19:02:53.742702 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:02:53 crc kubenswrapper[4878]: I1202 19:02:53.744444 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:02:57 crc kubenswrapper[4878]: I1202 19:02:57.646474 4878 generic.go:334] "Generic (PLEG): container finished" podID="53bb65b6-2ee5-42dd-8a1d-df8a04008975" containerID="e6795cda75df82ae50fd5cae02f6a3390170d7c3fb7ac87d5694ad2605b36383" exitCode=0 Dec 02 19:02:57 crc kubenswrapper[4878]: I1202 19:02:57.646624 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" event={"ID":"53bb65b6-2ee5-42dd-8a1d-df8a04008975","Type":"ContainerDied","Data":"e6795cda75df82ae50fd5cae02f6a3390170d7c3fb7ac87d5694ad2605b36383"} Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.328139 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.426926 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-inventory\") pod \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.427009 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-ssh-key\") pod \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.427037 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-combined-ca-bundle\") pod \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.427074 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-cell1-compute-config-1\") pod \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.427139 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-4qc44\" (UniqueName: \"kubernetes.io/projected/53bb65b6-2ee5-42dd-8a1d-df8a04008975-kube-api-access-4qc44\") pod \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.427167 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-migration-ssh-key-0\") pod \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.427197 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-cell1-compute-config-0\") pod \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.427228 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-migration-ssh-key-1\") pod \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.427311 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-extra-config-0\") pod \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\" (UID: \"53bb65b6-2ee5-42dd-8a1d-df8a04008975\") " Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.446556 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53bb65b6-2ee5-42dd-8a1d-df8a04008975-kube-api-access-4qc44" (OuterVolumeSpecName: "kube-api-access-4qc44") pod 
"53bb65b6-2ee5-42dd-8a1d-df8a04008975" (UID: "53bb65b6-2ee5-42dd-8a1d-df8a04008975"). InnerVolumeSpecName "kube-api-access-4qc44". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.451148 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "53bb65b6-2ee5-42dd-8a1d-df8a04008975" (UID: "53bb65b6-2ee5-42dd-8a1d-df8a04008975"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.460033 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-inventory" (OuterVolumeSpecName: "inventory") pod "53bb65b6-2ee5-42dd-8a1d-df8a04008975" (UID: "53bb65b6-2ee5-42dd-8a1d-df8a04008975"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.471420 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "53bb65b6-2ee5-42dd-8a1d-df8a04008975" (UID: "53bb65b6-2ee5-42dd-8a1d-df8a04008975"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.482690 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "53bb65b6-2ee5-42dd-8a1d-df8a04008975" (UID: "53bb65b6-2ee5-42dd-8a1d-df8a04008975"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.494292 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "53bb65b6-2ee5-42dd-8a1d-df8a04008975" (UID: "53bb65b6-2ee5-42dd-8a1d-df8a04008975"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.498678 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "53bb65b6-2ee5-42dd-8a1d-df8a04008975" (UID: "53bb65b6-2ee5-42dd-8a1d-df8a04008975"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.507581 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "53bb65b6-2ee5-42dd-8a1d-df8a04008975" (UID: "53bb65b6-2ee5-42dd-8a1d-df8a04008975"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.512644 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "53bb65b6-2ee5-42dd-8a1d-df8a04008975" (UID: "53bb65b6-2ee5-42dd-8a1d-df8a04008975"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.529479 4878 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.529508 4878 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.529517 4878 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.529530 4878 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.529538 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qc44\" (UniqueName: \"kubernetes.io/projected/53bb65b6-2ee5-42dd-8a1d-df8a04008975-kube-api-access-4qc44\") on node \"crc\" DevicePath \"\"" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.529546 4878 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.529555 4878 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 
19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.529565 4878 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.529574 4878 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/53bb65b6-2ee5-42dd-8a1d-df8a04008975-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.670691 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" event={"ID":"53bb65b6-2ee5-42dd-8a1d-df8a04008975","Type":"ContainerDied","Data":"03b1a35b0e8569637a4c9b04887bd1014c01c0ea11c624cf8c70846406d296bd"} Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.670749 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03b1a35b0e8569637a4c9b04887bd1014c01c0ea11c624cf8c70846406d296bd" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.670829 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-csjn8" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.780505 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq"] Dec 02 19:02:59 crc kubenswrapper[4878]: E1202 19:02:59.780965 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53bb65b6-2ee5-42dd-8a1d-df8a04008975" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.780982 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="53bb65b6-2ee5-42dd-8a1d-df8a04008975" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 02 19:02:59 crc kubenswrapper[4878]: E1202 19:02:59.781002 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08153e97-46ad-4405-b0ea-7f4606a82c6f" containerName="keystone-cron" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.781009 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="08153e97-46ad-4405-b0ea-7f4606a82c6f" containerName="keystone-cron" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.781294 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="53bb65b6-2ee5-42dd-8a1d-df8a04008975" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.781319 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="08153e97-46ad-4405-b0ea-7f4606a82c6f" containerName="keystone-cron" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.782123 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.784804 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.784893 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.784963 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.784820 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.786123 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szvc8" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.806449 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq"] Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.835350 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.835439 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: 
\"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.835476 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.835507 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.835636 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.835806 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxhg2\" (UniqueName: \"kubernetes.io/projected/1d2ce001-5523-44c3-b911-47c3f44ffb77-kube-api-access-mxhg2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.835845 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.937450 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxhg2\" (UniqueName: \"kubernetes.io/projected/1d2ce001-5523-44c3-b911-47c3f44ffb77-kube-api-access-mxhg2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.937547 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.937616 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.937678 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.937717 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.937760 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.937880 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.943434 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.943655 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.945543 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.947785 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.947785 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc 
kubenswrapper[4878]: I1202 19:02:59.948552 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:02:59 crc kubenswrapper[4878]: I1202 19:02:59.955940 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxhg2\" (UniqueName: \"kubernetes.io/projected/1d2ce001-5523-44c3-b911-47c3f44ffb77-kube-api-access-mxhg2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:03:00 crc kubenswrapper[4878]: I1202 19:03:00.102981 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:03:00 crc kubenswrapper[4878]: I1202 19:03:00.508561 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq"] Dec 02 19:03:00 crc kubenswrapper[4878]: I1202 19:03:00.682427 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" event={"ID":"1d2ce001-5523-44c3-b911-47c3f44ffb77","Type":"ContainerStarted","Data":"8741830a0d3b2ab313f425a7d2a1926709d4d1731f81ceeb8c04f080a62044ca"} Dec 02 19:03:01 crc kubenswrapper[4878]: I1202 19:03:01.693792 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" event={"ID":"1d2ce001-5523-44c3-b911-47c3f44ffb77","Type":"ContainerStarted","Data":"7ee8435f2d9bbe0b57b672eadf7716745b9638854d49f4efb7060b6a76ff0ade"} Dec 02 19:03:01 crc kubenswrapper[4878]: I1202 19:03:01.719869 4878 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" podStartSLOduration=2.243789494 podStartE2EDuration="2.719848594s" podCreationTimestamp="2025-12-02 19:02:59 +0000 UTC" firstStartedPulling="2025-12-02 19:03:00.509970522 +0000 UTC m=+2890.199589403" lastFinishedPulling="2025-12-02 19:03:00.986029612 +0000 UTC m=+2890.675648503" observedRunningTime="2025-12-02 19:03:01.716529961 +0000 UTC m=+2891.406148892" watchObservedRunningTime="2025-12-02 19:03:01.719848594 +0000 UTC m=+2891.409467475" Dec 02 19:03:23 crc kubenswrapper[4878]: I1202 19:03:23.742472 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:03:23 crc kubenswrapper[4878]: I1202 19:03:23.743146 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:03:23 crc kubenswrapper[4878]: I1202 19:03:23.743206 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 19:03:23 crc kubenswrapper[4878]: I1202 19:03:23.744371 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad7be7ffe092b0b37b5f8e2f8bf030483233a5e9f6f947235c408c30f8fb13db"} pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 19:03:23 crc 
kubenswrapper[4878]: I1202 19:03:23.744458 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://ad7be7ffe092b0b37b5f8e2f8bf030483233a5e9f6f947235c408c30f8fb13db" gracePeriod=600 Dec 02 19:03:24 crc kubenswrapper[4878]: I1202 19:03:24.006465 4878 generic.go:334] "Generic (PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" containerID="ad7be7ffe092b0b37b5f8e2f8bf030483233a5e9f6f947235c408c30f8fb13db" exitCode=0 Dec 02 19:03:24 crc kubenswrapper[4878]: I1202 19:03:24.006529 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"ad7be7ffe092b0b37b5f8e2f8bf030483233a5e9f6f947235c408c30f8fb13db"} Dec 02 19:03:24 crc kubenswrapper[4878]: I1202 19:03:24.006942 4878 scope.go:117] "RemoveContainer" containerID="2a357ef5a09bb2d64285577b4680b909a9cfe74387de7881e38dd56b5b6de62a" Dec 02 19:03:25 crc kubenswrapper[4878]: I1202 19:03:25.025054 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9"} Dec 02 19:05:26 crc kubenswrapper[4878]: I1202 19:05:26.661168 4878 generic.go:334] "Generic (PLEG): container finished" podID="1d2ce001-5523-44c3-b911-47c3f44ffb77" containerID="7ee8435f2d9bbe0b57b672eadf7716745b9638854d49f4efb7060b6a76ff0ade" exitCode=0 Dec 02 19:05:26 crc kubenswrapper[4878]: I1202 19:05:26.661261 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" 
event={"ID":"1d2ce001-5523-44c3-b911-47c3f44ffb77","Type":"ContainerDied","Data":"7ee8435f2d9bbe0b57b672eadf7716745b9638854d49f4efb7060b6a76ff0ade"} Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.284348 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.410065 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ceilometer-compute-config-data-1\") pod \"1d2ce001-5523-44c3-b911-47c3f44ffb77\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.410196 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ceilometer-compute-config-data-2\") pod \"1d2ce001-5523-44c3-b911-47c3f44ffb77\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.410258 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ssh-key\") pod \"1d2ce001-5523-44c3-b911-47c3f44ffb77\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.410298 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-telemetry-combined-ca-bundle\") pod \"1d2ce001-5523-44c3-b911-47c3f44ffb77\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.410331 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ceilometer-compute-config-data-0\") pod \"1d2ce001-5523-44c3-b911-47c3f44ffb77\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.410450 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-inventory\") pod \"1d2ce001-5523-44c3-b911-47c3f44ffb77\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.410497 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxhg2\" (UniqueName: \"kubernetes.io/projected/1d2ce001-5523-44c3-b911-47c3f44ffb77-kube-api-access-mxhg2\") pod \"1d2ce001-5523-44c3-b911-47c3f44ffb77\" (UID: \"1d2ce001-5523-44c3-b911-47c3f44ffb77\") " Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.417341 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "1d2ce001-5523-44c3-b911-47c3f44ffb77" (UID: "1d2ce001-5523-44c3-b911-47c3f44ffb77"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.418446 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d2ce001-5523-44c3-b911-47c3f44ffb77-kube-api-access-mxhg2" (OuterVolumeSpecName: "kube-api-access-mxhg2") pod "1d2ce001-5523-44c3-b911-47c3f44ffb77" (UID: "1d2ce001-5523-44c3-b911-47c3f44ffb77"). InnerVolumeSpecName "kube-api-access-mxhg2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.444548 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-inventory" (OuterVolumeSpecName: "inventory") pod "1d2ce001-5523-44c3-b911-47c3f44ffb77" (UID: "1d2ce001-5523-44c3-b911-47c3f44ffb77"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.445249 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1d2ce001-5523-44c3-b911-47c3f44ffb77" (UID: "1d2ce001-5523-44c3-b911-47c3f44ffb77"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.447744 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "1d2ce001-5523-44c3-b911-47c3f44ffb77" (UID: "1d2ce001-5523-44c3-b911-47c3f44ffb77"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.454151 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "1d2ce001-5523-44c3-b911-47c3f44ffb77" (UID: "1d2ce001-5523-44c3-b911-47c3f44ffb77"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.473992 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "1d2ce001-5523-44c3-b911-47c3f44ffb77" (UID: "1d2ce001-5523-44c3-b911-47c3f44ffb77"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.527733 4878 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.527770 4878 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.527784 4878 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.527796 4878 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.527810 4878 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 02 
19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.527823 4878 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d2ce001-5523-44c3-b911-47c3f44ffb77-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.527838 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxhg2\" (UniqueName: \"kubernetes.io/projected/1d2ce001-5523-44c3-b911-47c3f44ffb77-kube-api-access-mxhg2\") on node \"crc\" DevicePath \"\"" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.686959 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" event={"ID":"1d2ce001-5523-44c3-b911-47c3f44ffb77","Type":"ContainerDied","Data":"8741830a0d3b2ab313f425a7d2a1926709d4d1731f81ceeb8c04f080a62044ca"} Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.687229 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8741830a0d3b2ab313f425a7d2a1926709d4d1731f81ceeb8c04f080a62044ca" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.687130 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.837640 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt"] Dec 02 19:05:28 crc kubenswrapper[4878]: E1202 19:05:28.838163 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d2ce001-5523-44c3-b911-47c3f44ffb77" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.838183 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d2ce001-5523-44c3-b911-47c3f44ffb77" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.838512 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d2ce001-5523-44c3-b911-47c3f44ffb77" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.839430 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.842040 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.842358 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.842627 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.842820 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szvc8" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.843127 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.853432 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt"] Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.936400 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.936457 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ssh-key\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.936624 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.936650 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.936684 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n67m7\" (UniqueName: \"kubernetes.io/projected/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-kube-api-access-n67m7\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.937074 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ceilometer-ipmi-config-data-1\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:28 crc kubenswrapper[4878]: I1202 19:05:28.937443 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:29 crc kubenswrapper[4878]: I1202 19:05:29.038942 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:29 crc kubenswrapper[4878]: I1202 19:05:29.039021 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:29 crc kubenswrapper[4878]: I1202 19:05:29.039053 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: 
\"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:29 crc kubenswrapper[4878]: I1202 19:05:29.039126 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n67m7\" (UniqueName: \"kubernetes.io/projected/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-kube-api-access-n67m7\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:29 crc kubenswrapper[4878]: I1202 19:05:29.039962 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:29 crc kubenswrapper[4878]: I1202 19:05:29.040403 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:29 crc kubenswrapper[4878]: I1202 19:05:29.040577 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: 
\"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:29 crc kubenswrapper[4878]: I1202 19:05:29.043054 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:29 crc kubenswrapper[4878]: I1202 19:05:29.043995 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:29 crc kubenswrapper[4878]: I1202 19:05:29.045142 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:29 crc kubenswrapper[4878]: I1202 19:05:29.045274 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:29 crc kubenswrapper[4878]: I1202 
19:05:29.045576 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:29 crc kubenswrapper[4878]: I1202 19:05:29.045698 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:29 crc kubenswrapper[4878]: I1202 19:05:29.061138 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n67m7\" (UniqueName: \"kubernetes.io/projected/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-kube-api-access-n67m7\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:29 crc kubenswrapper[4878]: I1202 19:05:29.199530 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:05:29 crc kubenswrapper[4878]: W1202 19:05:29.801399 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58ff5ba7_481c_48d8_bf39_0eb6665f23d7.slice/crio-5d253ce7fff66b82ab4dd43aa723af0b6497e3e65329c0d6c12a8cf5723d9c96 WatchSource:0}: Error finding container 5d253ce7fff66b82ab4dd43aa723af0b6497e3e65329c0d6c12a8cf5723d9c96: Status 404 returned error can't find the container with id 5d253ce7fff66b82ab4dd43aa723af0b6497e3e65329c0d6c12a8cf5723d9c96 Dec 02 19:05:29 crc kubenswrapper[4878]: I1202 19:05:29.803778 4878 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 19:05:29 crc kubenswrapper[4878]: I1202 19:05:29.808065 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt"] Dec 02 19:05:30 crc kubenswrapper[4878]: I1202 19:05:30.722550 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" event={"ID":"58ff5ba7-481c-48d8-bf39-0eb6665f23d7","Type":"ContainerStarted","Data":"5d253ce7fff66b82ab4dd43aa723af0b6497e3e65329c0d6c12a8cf5723d9c96"} Dec 02 19:05:31 crc kubenswrapper[4878]: I1202 19:05:31.774868 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" event={"ID":"58ff5ba7-481c-48d8-bf39-0eb6665f23d7","Type":"ContainerStarted","Data":"cec337e557f5cb3a2a1b50b1766a1b2b1575d6122f0c1cf3476224744e25bdd7"} Dec 02 19:05:31 crc kubenswrapper[4878]: I1202 19:05:31.827521 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" podStartSLOduration=3.091276339 podStartE2EDuration="3.827498177s" 
podCreationTimestamp="2025-12-02 19:05:28 +0000 UTC" firstStartedPulling="2025-12-02 19:05:29.803554195 +0000 UTC m=+3039.493173076" lastFinishedPulling="2025-12-02 19:05:30.539775993 +0000 UTC m=+3040.229394914" observedRunningTime="2025-12-02 19:05:31.799029508 +0000 UTC m=+3041.488648389" watchObservedRunningTime="2025-12-02 19:05:31.827498177 +0000 UTC m=+3041.517117068" Dec 02 19:05:53 crc kubenswrapper[4878]: I1202 19:05:53.742340 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:05:53 crc kubenswrapper[4878]: I1202 19:05:53.742785 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:06:23 crc kubenswrapper[4878]: I1202 19:06:23.742165 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:06:23 crc kubenswrapper[4878]: I1202 19:06:23.743007 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:06:53 crc kubenswrapper[4878]: I1202 19:06:53.742469 4878 patch_prober.go:28] interesting 
pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:06:53 crc kubenswrapper[4878]: I1202 19:06:53.743405 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:06:53 crc kubenswrapper[4878]: I1202 19:06:53.743478 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 19:06:53 crc kubenswrapper[4878]: I1202 19:06:53.744774 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9"} pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 19:06:53 crc kubenswrapper[4878]: I1202 19:06:53.744858 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" gracePeriod=600 Dec 02 19:06:53 crc kubenswrapper[4878]: E1202 19:06:53.878635 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:06:53 crc kubenswrapper[4878]: I1202 19:06:53.912802 4878 generic.go:334] "Generic (PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" exitCode=0 Dec 02 19:06:53 crc kubenswrapper[4878]: I1202 19:06:53.912868 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9"} Dec 02 19:06:53 crc kubenswrapper[4878]: I1202 19:06:53.912915 4878 scope.go:117] "RemoveContainer" containerID="ad7be7ffe092b0b37b5f8e2f8bf030483233a5e9f6f947235c408c30f8fb13db" Dec 02 19:06:53 crc kubenswrapper[4878]: I1202 19:06:53.913705 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:06:53 crc kubenswrapper[4878]: E1202 19:06:53.914202 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:07:04 crc kubenswrapper[4878]: I1202 19:07:04.937863 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:07:04 crc kubenswrapper[4878]: E1202 19:07:04.938858 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:07:19 crc kubenswrapper[4878]: I1202 19:07:19.938977 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:07:19 crc kubenswrapper[4878]: E1202 19:07:19.940279 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:07:30 crc kubenswrapper[4878]: I1202 19:07:30.413983 4878 generic.go:334] "Generic (PLEG): container finished" podID="58ff5ba7-481c-48d8-bf39-0eb6665f23d7" containerID="cec337e557f5cb3a2a1b50b1766a1b2b1575d6122f0c1cf3476224744e25bdd7" exitCode=0 Dec 02 19:07:30 crc kubenswrapper[4878]: I1202 19:07:30.414087 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" event={"ID":"58ff5ba7-481c-48d8-bf39-0eb6665f23d7","Type":"ContainerDied","Data":"cec337e557f5cb3a2a1b50b1766a1b2b1575d6122f0c1cf3476224744e25bdd7"} Dec 02 19:07:31 crc kubenswrapper[4878]: I1202 19:07:31.914009 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.106851 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ssh-key\") pod \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.106985 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-inventory\") pod \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.107079 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ceilometer-ipmi-config-data-0\") pod \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.107157 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-telemetry-power-monitoring-combined-ca-bundle\") pod \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.107228 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ceilometer-ipmi-config-data-1\") pod \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " Dec 02 19:07:32 crc kubenswrapper[4878]: 
I1202 19:07:32.107337 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n67m7\" (UniqueName: \"kubernetes.io/projected/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-kube-api-access-n67m7\") pod \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.107375 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ceilometer-ipmi-config-data-2\") pod \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\" (UID: \"58ff5ba7-481c-48d8-bf39-0eb6665f23d7\") " Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.114007 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-kube-api-access-n67m7" (OuterVolumeSpecName: "kube-api-access-n67m7") pod "58ff5ba7-481c-48d8-bf39-0eb6665f23d7" (UID: "58ff5ba7-481c-48d8-bf39-0eb6665f23d7"). InnerVolumeSpecName "kube-api-access-n67m7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.132005 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "58ff5ba7-481c-48d8-bf39-0eb6665f23d7" (UID: "58ff5ba7-481c-48d8-bf39-0eb6665f23d7"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.157022 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "58ff5ba7-481c-48d8-bf39-0eb6665f23d7" (UID: "58ff5ba7-481c-48d8-bf39-0eb6665f23d7"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.165736 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "58ff5ba7-481c-48d8-bf39-0eb6665f23d7" (UID: "58ff5ba7-481c-48d8-bf39-0eb6665f23d7"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.166691 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-inventory" (OuterVolumeSpecName: "inventory") pod "58ff5ba7-481c-48d8-bf39-0eb6665f23d7" (UID: "58ff5ba7-481c-48d8-bf39-0eb6665f23d7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.178497 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "58ff5ba7-481c-48d8-bf39-0eb6665f23d7" (UID: "58ff5ba7-481c-48d8-bf39-0eb6665f23d7"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.180319 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "58ff5ba7-481c-48d8-bf39-0eb6665f23d7" (UID: "58ff5ba7-481c-48d8-bf39-0eb6665f23d7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.211359 4878 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.211418 4878 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.211449 4878 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.211476 4878 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.211498 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n67m7\" (UniqueName: \"kubernetes.io/projected/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-kube-api-access-n67m7\") on node \"crc\" DevicePath \"\"" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.211518 4878 
reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.211538 4878 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/58ff5ba7-481c-48d8-bf39-0eb6665f23d7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.437330 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" event={"ID":"58ff5ba7-481c-48d8-bf39-0eb6665f23d7","Type":"ContainerDied","Data":"5d253ce7fff66b82ab4dd43aa723af0b6497e3e65329c0d6c12a8cf5723d9c96"} Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.437390 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d253ce7fff66b82ab4dd43aa723af0b6497e3e65329c0d6c12a8cf5723d9c96" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.437390 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.548094 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9"] Dec 02 19:07:32 crc kubenswrapper[4878]: E1202 19:07:32.548963 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58ff5ba7-481c-48d8-bf39-0eb6665f23d7" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.549165 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ff5ba7-481c-48d8-bf39-0eb6665f23d7" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.549639 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="58ff5ba7-481c-48d8-bf39-0eb6665f23d7" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.551715 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.563768 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9"] Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.565778 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.565786 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.566150 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.566320 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.568089 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-szvc8" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.630478 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vqdv9\" (UID: \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.630920 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vqdv9\" (UID: \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\") " 
pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.630965 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vqdv9\" (UID: \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.631473 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vqdv9\" (UID: \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.631565 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc8rq\" (UniqueName: \"kubernetes.io/projected/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-kube-api-access-jc8rq\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vqdv9\" (UID: \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.732876 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vqdv9\" (UID: \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 
19:07:32.732922 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc8rq\" (UniqueName: \"kubernetes.io/projected/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-kube-api-access-jc8rq\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vqdv9\" (UID: \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.733025 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vqdv9\" (UID: \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.733092 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vqdv9\" (UID: \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.733125 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vqdv9\" (UID: \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.737966 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vqdv9\" (UID: 
\"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.738227 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vqdv9\" (UID: \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.738909 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vqdv9\" (UID: \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.739646 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vqdv9\" (UID: \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.749143 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc8rq\" (UniqueName: \"kubernetes.io/projected/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-kube-api-access-jc8rq\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vqdv9\" (UID: \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.901913 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" Dec 02 19:07:32 crc kubenswrapper[4878]: I1202 19:07:32.938660 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:07:32 crc kubenswrapper[4878]: E1202 19:07:32.938946 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:07:33 crc kubenswrapper[4878]: I1202 19:07:33.554527 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9"] Dec 02 19:07:34 crc kubenswrapper[4878]: I1202 19:07:34.462337 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" event={"ID":"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6","Type":"ContainerStarted","Data":"b9c5757dd1468c7e425ea07a6d6c77670d46b0899079abe996c3ee0519d19b41"} Dec 02 19:07:35 crc kubenswrapper[4878]: I1202 19:07:35.480112 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" event={"ID":"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6","Type":"ContainerStarted","Data":"9bba573a86cec26926503dc6bc4b6857c31a72ca2e3c4419cfc9371293b6912d"} Dec 02 19:07:35 crc kubenswrapper[4878]: I1202 19:07:35.524987 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" podStartSLOduration=2.968641082 podStartE2EDuration="3.524965267s" podCreationTimestamp="2025-12-02 19:07:32 +0000 UTC" firstStartedPulling="2025-12-02 19:07:33.550420687 +0000 
UTC m=+3163.240039608" lastFinishedPulling="2025-12-02 19:07:34.106744922 +0000 UTC m=+3163.796363793" observedRunningTime="2025-12-02 19:07:35.513472719 +0000 UTC m=+3165.203091640" watchObservedRunningTime="2025-12-02 19:07:35.524965267 +0000 UTC m=+3165.214584158" Dec 02 19:07:43 crc kubenswrapper[4878]: I1202 19:07:43.938789 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:07:43 crc kubenswrapper[4878]: E1202 19:07:43.939782 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:07:49 crc kubenswrapper[4878]: I1202 19:07:49.666458 4878 generic.go:334] "Generic (PLEG): container finished" podID="a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6" containerID="9bba573a86cec26926503dc6bc4b6857c31a72ca2e3c4419cfc9371293b6912d" exitCode=0 Dec 02 19:07:49 crc kubenswrapper[4878]: I1202 19:07:49.666591 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" event={"ID":"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6","Type":"ContainerDied","Data":"9bba573a86cec26926503dc6bc4b6857c31a72ca2e3c4419cfc9371293b6912d"} Dec 02 19:07:51 crc kubenswrapper[4878]: I1202 19:07:51.156818 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" Dec 02 19:07:51 crc kubenswrapper[4878]: I1202 19:07:51.174218 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-logging-compute-config-data-1\") pod \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\" (UID: \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\") " Dec 02 19:07:51 crc kubenswrapper[4878]: I1202 19:07:51.174386 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-inventory\") pod \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\" (UID: \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\") " Dec 02 19:07:51 crc kubenswrapper[4878]: I1202 19:07:51.174424 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc8rq\" (UniqueName: \"kubernetes.io/projected/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-kube-api-access-jc8rq\") pod \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\" (UID: \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\") " Dec 02 19:07:51 crc kubenswrapper[4878]: I1202 19:07:51.174518 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-logging-compute-config-data-0\") pod \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\" (UID: \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\") " Dec 02 19:07:51 crc kubenswrapper[4878]: I1202 19:07:51.174593 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-ssh-key\") pod \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\" (UID: \"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6\") " Dec 02 19:07:51 crc kubenswrapper[4878]: I1202 19:07:51.180912 4878 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-kube-api-access-jc8rq" (OuterVolumeSpecName: "kube-api-access-jc8rq") pod "a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6" (UID: "a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6"). InnerVolumeSpecName "kube-api-access-jc8rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:07:51 crc kubenswrapper[4878]: I1202 19:07:51.207162 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6" (UID: "a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6"). InnerVolumeSpecName "logging-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:07:51 crc kubenswrapper[4878]: I1202 19:07:51.215534 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6" (UID: "a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:07:51 crc kubenswrapper[4878]: I1202 19:07:51.216297 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6" (UID: "a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6"). InnerVolumeSpecName "logging-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:07:51 crc kubenswrapper[4878]: I1202 19:07:51.223873 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-inventory" (OuterVolumeSpecName: "inventory") pod "a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6" (UID: "a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:07:51 crc kubenswrapper[4878]: I1202 19:07:51.277099 4878 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 19:07:51 crc kubenswrapper[4878]: I1202 19:07:51.277137 4878 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 02 19:07:51 crc kubenswrapper[4878]: I1202 19:07:51.277151 4878 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 19:07:51 crc kubenswrapper[4878]: I1202 19:07:51.277167 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc8rq\" (UniqueName: \"kubernetes.io/projected/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-kube-api-access-jc8rq\") on node \"crc\" DevicePath \"\"" Dec 02 19:07:51 crc kubenswrapper[4878]: I1202 19:07:51.277176 4878 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 02 19:07:51 crc kubenswrapper[4878]: I1202 19:07:51.684579 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" event={"ID":"a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6","Type":"ContainerDied","Data":"b9c5757dd1468c7e425ea07a6d6c77670d46b0899079abe996c3ee0519d19b41"} Dec 02 19:07:51 crc kubenswrapper[4878]: I1202 19:07:51.684619 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9c5757dd1468c7e425ea07a6d6c77670d46b0899079abe996c3ee0519d19b41" Dec 02 19:07:51 crc kubenswrapper[4878]: I1202 19:07:51.684622 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vqdv9" Dec 02 19:07:55 crc kubenswrapper[4878]: I1202 19:07:55.939098 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:07:55 crc kubenswrapper[4878]: E1202 19:07:55.939872 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:08:08 crc kubenswrapper[4878]: I1202 19:08:08.946950 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:08:08 crc kubenswrapper[4878]: E1202 19:08:08.949451 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 
19:08:18 crc kubenswrapper[4878]: I1202 19:08:18.431792 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7wmkz"] Dec 02 19:08:18 crc kubenswrapper[4878]: E1202 19:08:18.435976 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 02 19:08:18 crc kubenswrapper[4878]: I1202 19:08:18.435996 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 02 19:08:18 crc kubenswrapper[4878]: I1202 19:08:18.436261 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 02 19:08:18 crc kubenswrapper[4878]: I1202 19:08:18.437924 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7wmkz" Dec 02 19:08:18 crc kubenswrapper[4878]: I1202 19:08:18.468104 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7wmkz"] Dec 02 19:08:18 crc kubenswrapper[4878]: I1202 19:08:18.479111 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa50b182-ab70-46d6-92d8-6329c532d5f1-catalog-content\") pod \"redhat-operators-7wmkz\" (UID: \"fa50b182-ab70-46d6-92d8-6329c532d5f1\") " pod="openshift-marketplace/redhat-operators-7wmkz" Dec 02 19:08:18 crc kubenswrapper[4878]: I1202 19:08:18.479391 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa50b182-ab70-46d6-92d8-6329c532d5f1-utilities\") pod \"redhat-operators-7wmkz\" (UID: \"fa50b182-ab70-46d6-92d8-6329c532d5f1\") " 
pod="openshift-marketplace/redhat-operators-7wmkz" Dec 02 19:08:18 crc kubenswrapper[4878]: I1202 19:08:18.479451 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bggzv\" (UniqueName: \"kubernetes.io/projected/fa50b182-ab70-46d6-92d8-6329c532d5f1-kube-api-access-bggzv\") pod \"redhat-operators-7wmkz\" (UID: \"fa50b182-ab70-46d6-92d8-6329c532d5f1\") " pod="openshift-marketplace/redhat-operators-7wmkz" Dec 02 19:08:18 crc kubenswrapper[4878]: I1202 19:08:18.581538 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa50b182-ab70-46d6-92d8-6329c532d5f1-utilities\") pod \"redhat-operators-7wmkz\" (UID: \"fa50b182-ab70-46d6-92d8-6329c532d5f1\") " pod="openshift-marketplace/redhat-operators-7wmkz" Dec 02 19:08:18 crc kubenswrapper[4878]: I1202 19:08:18.581600 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bggzv\" (UniqueName: \"kubernetes.io/projected/fa50b182-ab70-46d6-92d8-6329c532d5f1-kube-api-access-bggzv\") pod \"redhat-operators-7wmkz\" (UID: \"fa50b182-ab70-46d6-92d8-6329c532d5f1\") " pod="openshift-marketplace/redhat-operators-7wmkz" Dec 02 19:08:18 crc kubenswrapper[4878]: I1202 19:08:18.581781 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa50b182-ab70-46d6-92d8-6329c532d5f1-catalog-content\") pod \"redhat-operators-7wmkz\" (UID: \"fa50b182-ab70-46d6-92d8-6329c532d5f1\") " pod="openshift-marketplace/redhat-operators-7wmkz" Dec 02 19:08:18 crc kubenswrapper[4878]: I1202 19:08:18.582132 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa50b182-ab70-46d6-92d8-6329c532d5f1-utilities\") pod \"redhat-operators-7wmkz\" (UID: \"fa50b182-ab70-46d6-92d8-6329c532d5f1\") " 
pod="openshift-marketplace/redhat-operators-7wmkz" Dec 02 19:08:18 crc kubenswrapper[4878]: I1202 19:08:18.582301 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa50b182-ab70-46d6-92d8-6329c532d5f1-catalog-content\") pod \"redhat-operators-7wmkz\" (UID: \"fa50b182-ab70-46d6-92d8-6329c532d5f1\") " pod="openshift-marketplace/redhat-operators-7wmkz" Dec 02 19:08:18 crc kubenswrapper[4878]: I1202 19:08:18.606531 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bggzv\" (UniqueName: \"kubernetes.io/projected/fa50b182-ab70-46d6-92d8-6329c532d5f1-kube-api-access-bggzv\") pod \"redhat-operators-7wmkz\" (UID: \"fa50b182-ab70-46d6-92d8-6329c532d5f1\") " pod="openshift-marketplace/redhat-operators-7wmkz" Dec 02 19:08:18 crc kubenswrapper[4878]: I1202 19:08:18.771833 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7wmkz" Dec 02 19:08:19 crc kubenswrapper[4878]: I1202 19:08:19.049047 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n6fnb"] Dec 02 19:08:19 crc kubenswrapper[4878]: I1202 19:08:19.052006 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n6fnb" Dec 02 19:08:19 crc kubenswrapper[4878]: I1202 19:08:19.076478 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n6fnb"] Dec 02 19:08:19 crc kubenswrapper[4878]: I1202 19:08:19.093818 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd70cd0e-1a96-4983-a614-34113a9416f2-utilities\") pod \"community-operators-n6fnb\" (UID: \"cd70cd0e-1a96-4983-a614-34113a9416f2\") " pod="openshift-marketplace/community-operators-n6fnb" Dec 02 19:08:19 crc kubenswrapper[4878]: I1202 19:08:19.093877 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcvxk\" (UniqueName: \"kubernetes.io/projected/cd70cd0e-1a96-4983-a614-34113a9416f2-kube-api-access-pcvxk\") pod \"community-operators-n6fnb\" (UID: \"cd70cd0e-1a96-4983-a614-34113a9416f2\") " pod="openshift-marketplace/community-operators-n6fnb" Dec 02 19:08:19 crc kubenswrapper[4878]: I1202 19:08:19.093964 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd70cd0e-1a96-4983-a614-34113a9416f2-catalog-content\") pod \"community-operators-n6fnb\" (UID: \"cd70cd0e-1a96-4983-a614-34113a9416f2\") " pod="openshift-marketplace/community-operators-n6fnb" Dec 02 19:08:19 crc kubenswrapper[4878]: I1202 19:08:19.196318 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd70cd0e-1a96-4983-a614-34113a9416f2-utilities\") pod \"community-operators-n6fnb\" (UID: \"cd70cd0e-1a96-4983-a614-34113a9416f2\") " pod="openshift-marketplace/community-operators-n6fnb" Dec 02 19:08:19 crc kubenswrapper[4878]: I1202 19:08:19.196373 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pcvxk\" (UniqueName: \"kubernetes.io/projected/cd70cd0e-1a96-4983-a614-34113a9416f2-kube-api-access-pcvxk\") pod \"community-operators-n6fnb\" (UID: \"cd70cd0e-1a96-4983-a614-34113a9416f2\") " pod="openshift-marketplace/community-operators-n6fnb" Dec 02 19:08:19 crc kubenswrapper[4878]: I1202 19:08:19.196454 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd70cd0e-1a96-4983-a614-34113a9416f2-catalog-content\") pod \"community-operators-n6fnb\" (UID: \"cd70cd0e-1a96-4983-a614-34113a9416f2\") " pod="openshift-marketplace/community-operators-n6fnb" Dec 02 19:08:19 crc kubenswrapper[4878]: I1202 19:08:19.196804 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd70cd0e-1a96-4983-a614-34113a9416f2-utilities\") pod \"community-operators-n6fnb\" (UID: \"cd70cd0e-1a96-4983-a614-34113a9416f2\") " pod="openshift-marketplace/community-operators-n6fnb" Dec 02 19:08:19 crc kubenswrapper[4878]: I1202 19:08:19.196921 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd70cd0e-1a96-4983-a614-34113a9416f2-catalog-content\") pod \"community-operators-n6fnb\" (UID: \"cd70cd0e-1a96-4983-a614-34113a9416f2\") " pod="openshift-marketplace/community-operators-n6fnb" Dec 02 19:08:19 crc kubenswrapper[4878]: I1202 19:08:19.216677 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcvxk\" (UniqueName: \"kubernetes.io/projected/cd70cd0e-1a96-4983-a614-34113a9416f2-kube-api-access-pcvxk\") pod \"community-operators-n6fnb\" (UID: \"cd70cd0e-1a96-4983-a614-34113a9416f2\") " pod="openshift-marketplace/community-operators-n6fnb" Dec 02 19:08:19 crc kubenswrapper[4878]: I1202 19:08:19.307902 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-7wmkz"] Dec 02 19:08:19 crc kubenswrapper[4878]: I1202 19:08:19.388864 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n6fnb" Dec 02 19:08:19 crc kubenswrapper[4878]: I1202 19:08:19.717731 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n6fnb"] Dec 02 19:08:20 crc kubenswrapper[4878]: I1202 19:08:20.060030 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6fnb" event={"ID":"cd70cd0e-1a96-4983-a614-34113a9416f2","Type":"ContainerStarted","Data":"27c1bf2235d79e4c0efcec3751a7910059ccdf39e11134b01db4961dbbe3273e"} Dec 02 19:08:20 crc kubenswrapper[4878]: I1202 19:08:20.062406 4878 generic.go:334] "Generic (PLEG): container finished" podID="fa50b182-ab70-46d6-92d8-6329c532d5f1" containerID="dea58cdea466a959dcf15a18c3bf2492683cafe7ced1900dabd60b6f0eefe46d" exitCode=0 Dec 02 19:08:20 crc kubenswrapper[4878]: I1202 19:08:20.062470 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7wmkz" event={"ID":"fa50b182-ab70-46d6-92d8-6329c532d5f1","Type":"ContainerDied","Data":"dea58cdea466a959dcf15a18c3bf2492683cafe7ced1900dabd60b6f0eefe46d"} Dec 02 19:08:20 crc kubenswrapper[4878]: I1202 19:08:20.062492 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7wmkz" event={"ID":"fa50b182-ab70-46d6-92d8-6329c532d5f1","Type":"ContainerStarted","Data":"c24cd18762c4379c254e6bdeae2fc44e5066e2456a8aa00b52c92a223584d51d"} Dec 02 19:08:20 crc kubenswrapper[4878]: I1202 19:08:20.960314 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:08:20 crc kubenswrapper[4878]: E1202 19:08:20.960941 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:08:21 crc kubenswrapper[4878]: I1202 19:08:21.080177 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7wmkz" event={"ID":"fa50b182-ab70-46d6-92d8-6329c532d5f1","Type":"ContainerStarted","Data":"1d1427613a1fe18443cc60c41820f9c9fe11be200b87655aaf61e977c2d42b0d"} Dec 02 19:08:21 crc kubenswrapper[4878]: I1202 19:08:21.084954 4878 generic.go:334] "Generic (PLEG): container finished" podID="cd70cd0e-1a96-4983-a614-34113a9416f2" containerID="9aff54053cd3e7439cc9f6a427bf57b1af2d415ada32ffa95903c98752bfe5c7" exitCode=0 Dec 02 19:08:21 crc kubenswrapper[4878]: I1202 19:08:21.085015 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6fnb" event={"ID":"cd70cd0e-1a96-4983-a614-34113a9416f2","Type":"ContainerDied","Data":"9aff54053cd3e7439cc9f6a427bf57b1af2d415ada32ffa95903c98752bfe5c7"} Dec 02 19:08:23 crc kubenswrapper[4878]: I1202 19:08:23.117653 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6fnb" event={"ID":"cd70cd0e-1a96-4983-a614-34113a9416f2","Type":"ContainerStarted","Data":"1a5861ce2ac03f84d947fc550109dc2371ddda451a9e98770f6d5580feb1524a"} Dec 02 19:08:26 crc kubenswrapper[4878]: I1202 19:08:26.169803 4878 generic.go:334] "Generic (PLEG): container finished" podID="cd70cd0e-1a96-4983-a614-34113a9416f2" containerID="1a5861ce2ac03f84d947fc550109dc2371ddda451a9e98770f6d5580feb1524a" exitCode=0 Dec 02 19:08:26 crc kubenswrapper[4878]: I1202 19:08:26.169903 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6fnb" 
event={"ID":"cd70cd0e-1a96-4983-a614-34113a9416f2","Type":"ContainerDied","Data":"1a5861ce2ac03f84d947fc550109dc2371ddda451a9e98770f6d5580feb1524a"} Dec 02 19:08:26 crc kubenswrapper[4878]: I1202 19:08:26.174213 4878 generic.go:334] "Generic (PLEG): container finished" podID="fa50b182-ab70-46d6-92d8-6329c532d5f1" containerID="1d1427613a1fe18443cc60c41820f9c9fe11be200b87655aaf61e977c2d42b0d" exitCode=0 Dec 02 19:08:26 crc kubenswrapper[4878]: I1202 19:08:26.174286 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7wmkz" event={"ID":"fa50b182-ab70-46d6-92d8-6329c532d5f1","Type":"ContainerDied","Data":"1d1427613a1fe18443cc60c41820f9c9fe11be200b87655aaf61e977c2d42b0d"} Dec 02 19:08:27 crc kubenswrapper[4878]: I1202 19:08:27.187576 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7wmkz" event={"ID":"fa50b182-ab70-46d6-92d8-6329c532d5f1","Type":"ContainerStarted","Data":"3491f389053f73a8b4ee549c389dc70ef3e94cdb070503cce931a436d5b37f63"} Dec 02 19:08:27 crc kubenswrapper[4878]: I1202 19:08:27.218703 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7wmkz" podStartSLOduration=2.684115245 podStartE2EDuration="9.218686145s" podCreationTimestamp="2025-12-02 19:08:18 +0000 UTC" firstStartedPulling="2025-12-02 19:08:20.064152023 +0000 UTC m=+3209.753770904" lastFinishedPulling="2025-12-02 19:08:26.598722923 +0000 UTC m=+3216.288341804" observedRunningTime="2025-12-02 19:08:27.213962647 +0000 UTC m=+3216.903581548" watchObservedRunningTime="2025-12-02 19:08:27.218686145 +0000 UTC m=+3216.908305026" Dec 02 19:08:28 crc kubenswrapper[4878]: I1202 19:08:28.200699 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6fnb" event={"ID":"cd70cd0e-1a96-4983-a614-34113a9416f2","Type":"ContainerStarted","Data":"6842c29df92d99f7afd7b297a2157e5691ba99317a8312643c523d44d52df9c0"} 
Dec 02 19:08:28 crc kubenswrapper[4878]: I1202 19:08:28.222136 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n6fnb" podStartSLOduration=3.47991612 podStartE2EDuration="9.222114099s" podCreationTimestamp="2025-12-02 19:08:19 +0000 UTC" firstStartedPulling="2025-12-02 19:08:21.093771215 +0000 UTC m=+3210.783390096" lastFinishedPulling="2025-12-02 19:08:26.835969194 +0000 UTC m=+3216.525588075" observedRunningTime="2025-12-02 19:08:28.219949381 +0000 UTC m=+3217.909568282" watchObservedRunningTime="2025-12-02 19:08:28.222114099 +0000 UTC m=+3217.911732990" Dec 02 19:08:28 crc kubenswrapper[4878]: I1202 19:08:28.773023 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7wmkz" Dec 02 19:08:28 crc kubenswrapper[4878]: I1202 19:08:28.773361 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7wmkz" Dec 02 19:08:29 crc kubenswrapper[4878]: I1202 19:08:29.389599 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n6fnb" Dec 02 19:08:29 crc kubenswrapper[4878]: I1202 19:08:29.389888 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n6fnb" Dec 02 19:08:29 crc kubenswrapper[4878]: I1202 19:08:29.824545 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7wmkz" podUID="fa50b182-ab70-46d6-92d8-6329c532d5f1" containerName="registry-server" probeResult="failure" output=< Dec 02 19:08:29 crc kubenswrapper[4878]: timeout: failed to connect service ":50051" within 1s Dec 02 19:08:29 crc kubenswrapper[4878]: > Dec 02 19:08:30 crc kubenswrapper[4878]: I1202 19:08:30.475579 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-n6fnb" 
podUID="cd70cd0e-1a96-4983-a614-34113a9416f2" containerName="registry-server" probeResult="failure" output=< Dec 02 19:08:30 crc kubenswrapper[4878]: timeout: failed to connect service ":50051" within 1s Dec 02 19:08:30 crc kubenswrapper[4878]: > Dec 02 19:08:32 crc kubenswrapper[4878]: I1202 19:08:32.938848 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:08:32 crc kubenswrapper[4878]: E1202 19:08:32.939257 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:08:38 crc kubenswrapper[4878]: I1202 19:08:38.874779 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7wmkz" Dec 02 19:08:38 crc kubenswrapper[4878]: I1202 19:08:38.961325 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7wmkz" Dec 02 19:08:39 crc kubenswrapper[4878]: I1202 19:08:39.128120 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7wmkz"] Dec 02 19:08:39 crc kubenswrapper[4878]: I1202 19:08:39.464812 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n6fnb" Dec 02 19:08:39 crc kubenswrapper[4878]: I1202 19:08:39.525548 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n6fnb" Dec 02 19:08:40 crc kubenswrapper[4878]: I1202 19:08:40.347224 4878 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-7wmkz" podUID="fa50b182-ab70-46d6-92d8-6329c532d5f1" containerName="registry-server" containerID="cri-o://3491f389053f73a8b4ee549c389dc70ef3e94cdb070503cce931a436d5b37f63" gracePeriod=2 Dec 02 19:08:40 crc kubenswrapper[4878]: I1202 19:08:40.874572 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7wmkz" Dec 02 19:08:40 crc kubenswrapper[4878]: I1202 19:08:40.993346 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa50b182-ab70-46d6-92d8-6329c532d5f1-utilities\") pod \"fa50b182-ab70-46d6-92d8-6329c532d5f1\" (UID: \"fa50b182-ab70-46d6-92d8-6329c532d5f1\") " Dec 02 19:08:40 crc kubenswrapper[4878]: I1202 19:08:40.993413 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bggzv\" (UniqueName: \"kubernetes.io/projected/fa50b182-ab70-46d6-92d8-6329c532d5f1-kube-api-access-bggzv\") pod \"fa50b182-ab70-46d6-92d8-6329c532d5f1\" (UID: \"fa50b182-ab70-46d6-92d8-6329c532d5f1\") " Dec 02 19:08:40 crc kubenswrapper[4878]: I1202 19:08:40.993486 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa50b182-ab70-46d6-92d8-6329c532d5f1-catalog-content\") pod \"fa50b182-ab70-46d6-92d8-6329c532d5f1\" (UID: \"fa50b182-ab70-46d6-92d8-6329c532d5f1\") " Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.003329 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa50b182-ab70-46d6-92d8-6329c532d5f1-utilities" (OuterVolumeSpecName: "utilities") pod "fa50b182-ab70-46d6-92d8-6329c532d5f1" (UID: "fa50b182-ab70-46d6-92d8-6329c532d5f1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.022495 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa50b182-ab70-46d6-92d8-6329c532d5f1-kube-api-access-bggzv" (OuterVolumeSpecName: "kube-api-access-bggzv") pod "fa50b182-ab70-46d6-92d8-6329c532d5f1" (UID: "fa50b182-ab70-46d6-92d8-6329c532d5f1"). InnerVolumeSpecName "kube-api-access-bggzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.099078 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa50b182-ab70-46d6-92d8-6329c532d5f1-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.099169 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bggzv\" (UniqueName: \"kubernetes.io/projected/fa50b182-ab70-46d6-92d8-6329c532d5f1-kube-api-access-bggzv\") on node \"crc\" DevicePath \"\"" Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.130948 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa50b182-ab70-46d6-92d8-6329c532d5f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa50b182-ab70-46d6-92d8-6329c532d5f1" (UID: "fa50b182-ab70-46d6-92d8-6329c532d5f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.201595 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa50b182-ab70-46d6-92d8-6329c532d5f1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.364305 4878 generic.go:334] "Generic (PLEG): container finished" podID="fa50b182-ab70-46d6-92d8-6329c532d5f1" containerID="3491f389053f73a8b4ee549c389dc70ef3e94cdb070503cce931a436d5b37f63" exitCode=0 Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.364522 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7wmkz" event={"ID":"fa50b182-ab70-46d6-92d8-6329c532d5f1","Type":"ContainerDied","Data":"3491f389053f73a8b4ee549c389dc70ef3e94cdb070503cce931a436d5b37f63"} Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.365173 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7wmkz" event={"ID":"fa50b182-ab70-46d6-92d8-6329c532d5f1","Type":"ContainerDied","Data":"c24cd18762c4379c254e6bdeae2fc44e5066e2456a8aa00b52c92a223584d51d"} Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.365229 4878 scope.go:117] "RemoveContainer" containerID="3491f389053f73a8b4ee549c389dc70ef3e94cdb070503cce931a436d5b37f63" Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.364757 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7wmkz" Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.410534 4878 scope.go:117] "RemoveContainer" containerID="1d1427613a1fe18443cc60c41820f9c9fe11be200b87655aaf61e977c2d42b0d" Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.444216 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7wmkz"] Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.454816 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7wmkz"] Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.455573 4878 scope.go:117] "RemoveContainer" containerID="dea58cdea466a959dcf15a18c3bf2492683cafe7ced1900dabd60b6f0eefe46d" Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.549611 4878 scope.go:117] "RemoveContainer" containerID="3491f389053f73a8b4ee549c389dc70ef3e94cdb070503cce931a436d5b37f63" Dec 02 19:08:41 crc kubenswrapper[4878]: E1202 19:08:41.552368 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3491f389053f73a8b4ee549c389dc70ef3e94cdb070503cce931a436d5b37f63\": container with ID starting with 3491f389053f73a8b4ee549c389dc70ef3e94cdb070503cce931a436d5b37f63 not found: ID does not exist" containerID="3491f389053f73a8b4ee549c389dc70ef3e94cdb070503cce931a436d5b37f63" Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.552425 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3491f389053f73a8b4ee549c389dc70ef3e94cdb070503cce931a436d5b37f63"} err="failed to get container status \"3491f389053f73a8b4ee549c389dc70ef3e94cdb070503cce931a436d5b37f63\": rpc error: code = NotFound desc = could not find container \"3491f389053f73a8b4ee549c389dc70ef3e94cdb070503cce931a436d5b37f63\": container with ID starting with 3491f389053f73a8b4ee549c389dc70ef3e94cdb070503cce931a436d5b37f63 not found: ID does 
not exist" Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.552462 4878 scope.go:117] "RemoveContainer" containerID="1d1427613a1fe18443cc60c41820f9c9fe11be200b87655aaf61e977c2d42b0d" Dec 02 19:08:41 crc kubenswrapper[4878]: E1202 19:08:41.553091 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d1427613a1fe18443cc60c41820f9c9fe11be200b87655aaf61e977c2d42b0d\": container with ID starting with 1d1427613a1fe18443cc60c41820f9c9fe11be200b87655aaf61e977c2d42b0d not found: ID does not exist" containerID="1d1427613a1fe18443cc60c41820f9c9fe11be200b87655aaf61e977c2d42b0d" Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.553151 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1427613a1fe18443cc60c41820f9c9fe11be200b87655aaf61e977c2d42b0d"} err="failed to get container status \"1d1427613a1fe18443cc60c41820f9c9fe11be200b87655aaf61e977c2d42b0d\": rpc error: code = NotFound desc = could not find container \"1d1427613a1fe18443cc60c41820f9c9fe11be200b87655aaf61e977c2d42b0d\": container with ID starting with 1d1427613a1fe18443cc60c41820f9c9fe11be200b87655aaf61e977c2d42b0d not found: ID does not exist" Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.553201 4878 scope.go:117] "RemoveContainer" containerID="dea58cdea466a959dcf15a18c3bf2492683cafe7ced1900dabd60b6f0eefe46d" Dec 02 19:08:41 crc kubenswrapper[4878]: E1202 19:08:41.554017 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dea58cdea466a959dcf15a18c3bf2492683cafe7ced1900dabd60b6f0eefe46d\": container with ID starting with dea58cdea466a959dcf15a18c3bf2492683cafe7ced1900dabd60b6f0eefe46d not found: ID does not exist" containerID="dea58cdea466a959dcf15a18c3bf2492683cafe7ced1900dabd60b6f0eefe46d" Dec 02 19:08:41 crc kubenswrapper[4878]: I1202 19:08:41.554092 4878 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dea58cdea466a959dcf15a18c3bf2492683cafe7ced1900dabd60b6f0eefe46d"} err="failed to get container status \"dea58cdea466a959dcf15a18c3bf2492683cafe7ced1900dabd60b6f0eefe46d\": rpc error: code = NotFound desc = could not find container \"dea58cdea466a959dcf15a18c3bf2492683cafe7ced1900dabd60b6f0eefe46d\": container with ID starting with dea58cdea466a959dcf15a18c3bf2492683cafe7ced1900dabd60b6f0eefe46d not found: ID does not exist" Dec 02 19:08:42 crc kubenswrapper[4878]: I1202 19:08:42.126977 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n6fnb"] Dec 02 19:08:42 crc kubenswrapper[4878]: I1202 19:08:42.127297 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n6fnb" podUID="cd70cd0e-1a96-4983-a614-34113a9416f2" containerName="registry-server" containerID="cri-o://6842c29df92d99f7afd7b297a2157e5691ba99317a8312643c523d44d52df9c0" gracePeriod=2 Dec 02 19:08:42 crc kubenswrapper[4878]: I1202 19:08:42.379292 4878 generic.go:334] "Generic (PLEG): container finished" podID="cd70cd0e-1a96-4983-a614-34113a9416f2" containerID="6842c29df92d99f7afd7b297a2157e5691ba99317a8312643c523d44d52df9c0" exitCode=0 Dec 02 19:08:42 crc kubenswrapper[4878]: I1202 19:08:42.379349 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6fnb" event={"ID":"cd70cd0e-1a96-4983-a614-34113a9416f2","Type":"ContainerDied","Data":"6842c29df92d99f7afd7b297a2157e5691ba99317a8312643c523d44d52df9c0"} Dec 02 19:08:42 crc kubenswrapper[4878]: I1202 19:08:42.673321 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n6fnb" Dec 02 19:08:42 crc kubenswrapper[4878]: I1202 19:08:42.850997 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcvxk\" (UniqueName: \"kubernetes.io/projected/cd70cd0e-1a96-4983-a614-34113a9416f2-kube-api-access-pcvxk\") pod \"cd70cd0e-1a96-4983-a614-34113a9416f2\" (UID: \"cd70cd0e-1a96-4983-a614-34113a9416f2\") " Dec 02 19:08:42 crc kubenswrapper[4878]: I1202 19:08:42.851186 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd70cd0e-1a96-4983-a614-34113a9416f2-catalog-content\") pod \"cd70cd0e-1a96-4983-a614-34113a9416f2\" (UID: \"cd70cd0e-1a96-4983-a614-34113a9416f2\") " Dec 02 19:08:42 crc kubenswrapper[4878]: I1202 19:08:42.851305 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd70cd0e-1a96-4983-a614-34113a9416f2-utilities\") pod \"cd70cd0e-1a96-4983-a614-34113a9416f2\" (UID: \"cd70cd0e-1a96-4983-a614-34113a9416f2\") " Dec 02 19:08:42 crc kubenswrapper[4878]: I1202 19:08:42.852178 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd70cd0e-1a96-4983-a614-34113a9416f2-utilities" (OuterVolumeSpecName: "utilities") pod "cd70cd0e-1a96-4983-a614-34113a9416f2" (UID: "cd70cd0e-1a96-4983-a614-34113a9416f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:08:42 crc kubenswrapper[4878]: I1202 19:08:42.856968 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70cd0e-1a96-4983-a614-34113a9416f2-kube-api-access-pcvxk" (OuterVolumeSpecName: "kube-api-access-pcvxk") pod "cd70cd0e-1a96-4983-a614-34113a9416f2" (UID: "cd70cd0e-1a96-4983-a614-34113a9416f2"). InnerVolumeSpecName "kube-api-access-pcvxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:08:42 crc kubenswrapper[4878]: I1202 19:08:42.909863 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd70cd0e-1a96-4983-a614-34113a9416f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd70cd0e-1a96-4983-a614-34113a9416f2" (UID: "cd70cd0e-1a96-4983-a614-34113a9416f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:08:42 crc kubenswrapper[4878]: I1202 19:08:42.952383 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa50b182-ab70-46d6-92d8-6329c532d5f1" path="/var/lib/kubelet/pods/fa50b182-ab70-46d6-92d8-6329c532d5f1/volumes" Dec 02 19:08:42 crc kubenswrapper[4878]: I1202 19:08:42.954258 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd70cd0e-1a96-4983-a614-34113a9416f2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:08:42 crc kubenswrapper[4878]: I1202 19:08:42.954375 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd70cd0e-1a96-4983-a614-34113a9416f2-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:08:42 crc kubenswrapper[4878]: I1202 19:08:42.954434 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcvxk\" (UniqueName: \"kubernetes.io/projected/cd70cd0e-1a96-4983-a614-34113a9416f2-kube-api-access-pcvxk\") on node \"crc\" DevicePath \"\"" Dec 02 19:08:43 crc kubenswrapper[4878]: I1202 19:08:43.402900 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6fnb" event={"ID":"cd70cd0e-1a96-4983-a614-34113a9416f2","Type":"ContainerDied","Data":"27c1bf2235d79e4c0efcec3751a7910059ccdf39e11134b01db4961dbbe3273e"} Dec 02 19:08:43 crc kubenswrapper[4878]: I1202 19:08:43.403180 4878 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-n6fnb" Dec 02 19:08:43 crc kubenswrapper[4878]: I1202 19:08:43.403380 4878 scope.go:117] "RemoveContainer" containerID="6842c29df92d99f7afd7b297a2157e5691ba99317a8312643c523d44d52df9c0" Dec 02 19:08:43 crc kubenswrapper[4878]: I1202 19:08:43.434409 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n6fnb"] Dec 02 19:08:43 crc kubenswrapper[4878]: I1202 19:08:43.442198 4878 scope.go:117] "RemoveContainer" containerID="1a5861ce2ac03f84d947fc550109dc2371ddda451a9e98770f6d5580feb1524a" Dec 02 19:08:43 crc kubenswrapper[4878]: I1202 19:08:43.448468 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n6fnb"] Dec 02 19:08:43 crc kubenswrapper[4878]: I1202 19:08:43.475294 4878 scope.go:117] "RemoveContainer" containerID="9aff54053cd3e7439cc9f6a427bf57b1af2d415ada32ffa95903c98752bfe5c7" Dec 02 19:08:44 crc kubenswrapper[4878]: I1202 19:08:44.971493 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70cd0e-1a96-4983-a614-34113a9416f2" path="/var/lib/kubelet/pods/cd70cd0e-1a96-4983-a614-34113a9416f2/volumes" Dec 02 19:08:45 crc kubenswrapper[4878]: E1202 19:08:45.500139 4878 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.159:48306->38.102.83.159:41745: read tcp 38.102.83.159:48306->38.102.83.159:41745: read: connection reset by peer Dec 02 19:08:45 crc kubenswrapper[4878]: E1202 19:08:45.500533 4878 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.159:48306->38.102.83.159:41745: write tcp 38.102.83.159:48306->38.102.83.159:41745: write: broken pipe Dec 02 19:08:45 crc kubenswrapper[4878]: I1202 19:08:45.939130 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:08:45 crc kubenswrapper[4878]: E1202 19:08:45.939422 4878 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:08:56 crc kubenswrapper[4878]: I1202 19:08:56.938952 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:08:56 crc kubenswrapper[4878]: E1202 19:08:56.940019 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:09:07 crc kubenswrapper[4878]: I1202 19:09:07.938584 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:09:07 crc kubenswrapper[4878]: E1202 19:09:07.939359 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:09:19 crc kubenswrapper[4878]: I1202 19:09:19.938933 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:09:19 crc kubenswrapper[4878]: E1202 
19:09:19.939915 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:09:32 crc kubenswrapper[4878]: I1202 19:09:32.937837 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:09:32 crc kubenswrapper[4878]: E1202 19:09:32.938684 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:09:47 crc kubenswrapper[4878]: I1202 19:09:47.938485 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:09:47 crc kubenswrapper[4878]: E1202 19:09:47.939465 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:10:00 crc kubenswrapper[4878]: I1202 19:10:00.946225 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:10:00 crc 
kubenswrapper[4878]: E1202 19:10:00.947008 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:10:15 crc kubenswrapper[4878]: I1202 19:10:15.939525 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:10:15 crc kubenswrapper[4878]: E1202 19:10:15.940343 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:10:28 crc kubenswrapper[4878]: I1202 19:10:28.938039 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:10:28 crc kubenswrapper[4878]: E1202 19:10:28.939169 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:10:42 crc kubenswrapper[4878]: I1202 19:10:42.938897 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 
02 19:10:42 crc kubenswrapper[4878]: E1202 19:10:42.939953 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:10:57 crc kubenswrapper[4878]: I1202 19:10:57.938486 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:10:57 crc kubenswrapper[4878]: E1202 19:10:57.939318 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:11:08 crc kubenswrapper[4878]: I1202 19:11:08.937805 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:11:08 crc kubenswrapper[4878]: E1202 19:11:08.938551 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:11:19 crc kubenswrapper[4878]: I1202 19:11:19.937507 4878 scope.go:117] "RemoveContainer" 
containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:11:19 crc kubenswrapper[4878]: E1202 19:11:19.938432 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:11:34 crc kubenswrapper[4878]: I1202 19:11:34.937584 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:11:34 crc kubenswrapper[4878]: E1202 19:11:34.938391 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:11:49 crc kubenswrapper[4878]: I1202 19:11:49.939055 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:11:49 crc kubenswrapper[4878]: E1202 19:11:49.940121 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:11:50 crc kubenswrapper[4878]: I1202 19:11:50.807024 4878 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-clpsh"] Dec 02 19:11:50 crc kubenswrapper[4878]: E1202 19:11:50.807672 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa50b182-ab70-46d6-92d8-6329c532d5f1" containerName="registry-server" Dec 02 19:11:50 crc kubenswrapper[4878]: I1202 19:11:50.807693 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa50b182-ab70-46d6-92d8-6329c532d5f1" containerName="registry-server" Dec 02 19:11:50 crc kubenswrapper[4878]: E1202 19:11:50.807709 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd70cd0e-1a96-4983-a614-34113a9416f2" containerName="extract-utilities" Dec 02 19:11:50 crc kubenswrapper[4878]: I1202 19:11:50.807718 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd70cd0e-1a96-4983-a614-34113a9416f2" containerName="extract-utilities" Dec 02 19:11:50 crc kubenswrapper[4878]: E1202 19:11:50.807754 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa50b182-ab70-46d6-92d8-6329c532d5f1" containerName="extract-content" Dec 02 19:11:50 crc kubenswrapper[4878]: I1202 19:11:50.807766 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa50b182-ab70-46d6-92d8-6329c532d5f1" containerName="extract-content" Dec 02 19:11:50 crc kubenswrapper[4878]: E1202 19:11:50.807780 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd70cd0e-1a96-4983-a614-34113a9416f2" containerName="extract-content" Dec 02 19:11:50 crc kubenswrapper[4878]: I1202 19:11:50.807788 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd70cd0e-1a96-4983-a614-34113a9416f2" containerName="extract-content" Dec 02 19:11:50 crc kubenswrapper[4878]: E1202 19:11:50.807807 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa50b182-ab70-46d6-92d8-6329c532d5f1" containerName="extract-utilities" Dec 02 19:11:50 crc kubenswrapper[4878]: I1202 19:11:50.807817 4878 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="fa50b182-ab70-46d6-92d8-6329c532d5f1" containerName="extract-utilities" Dec 02 19:11:50 crc kubenswrapper[4878]: E1202 19:11:50.807845 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd70cd0e-1a96-4983-a614-34113a9416f2" containerName="registry-server" Dec 02 19:11:50 crc kubenswrapper[4878]: I1202 19:11:50.807853 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd70cd0e-1a96-4983-a614-34113a9416f2" containerName="registry-server" Dec 02 19:11:50 crc kubenswrapper[4878]: I1202 19:11:50.808260 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa50b182-ab70-46d6-92d8-6329c532d5f1" containerName="registry-server" Dec 02 19:11:50 crc kubenswrapper[4878]: I1202 19:11:50.808281 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd70cd0e-1a96-4983-a614-34113a9416f2" containerName="registry-server" Dec 02 19:11:50 crc kubenswrapper[4878]: I1202 19:11:50.810541 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-clpsh" Dec 02 19:11:50 crc kubenswrapper[4878]: I1202 19:11:50.828225 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-clpsh"] Dec 02 19:11:50 crc kubenswrapper[4878]: I1202 19:11:50.922104 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d06d91d5-a9b0-459a-ab18-5c7b6d31a653-utilities\") pod \"certified-operators-clpsh\" (UID: \"d06d91d5-a9b0-459a-ab18-5c7b6d31a653\") " pod="openshift-marketplace/certified-operators-clpsh" Dec 02 19:11:50 crc kubenswrapper[4878]: I1202 19:11:50.922970 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d06d91d5-a9b0-459a-ab18-5c7b6d31a653-catalog-content\") pod \"certified-operators-clpsh\" (UID: \"d06d91d5-a9b0-459a-ab18-5c7b6d31a653\") " pod="openshift-marketplace/certified-operators-clpsh" Dec 02 19:11:50 crc kubenswrapper[4878]: I1202 19:11:50.923537 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-672wv\" (UniqueName: \"kubernetes.io/projected/d06d91d5-a9b0-459a-ab18-5c7b6d31a653-kube-api-access-672wv\") pod \"certified-operators-clpsh\" (UID: \"d06d91d5-a9b0-459a-ab18-5c7b6d31a653\") " pod="openshift-marketplace/certified-operators-clpsh" Dec 02 19:11:51 crc kubenswrapper[4878]: I1202 19:11:51.027035 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d06d91d5-a9b0-459a-ab18-5c7b6d31a653-utilities\") pod \"certified-operators-clpsh\" (UID: \"d06d91d5-a9b0-459a-ab18-5c7b6d31a653\") " pod="openshift-marketplace/certified-operators-clpsh" Dec 02 19:11:51 crc kubenswrapper[4878]: I1202 19:11:51.027089 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d06d91d5-a9b0-459a-ab18-5c7b6d31a653-catalog-content\") pod \"certified-operators-clpsh\" (UID: \"d06d91d5-a9b0-459a-ab18-5c7b6d31a653\") " pod="openshift-marketplace/certified-operators-clpsh" Dec 02 19:11:51 crc kubenswrapper[4878]: I1202 19:11:51.027674 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d06d91d5-a9b0-459a-ab18-5c7b6d31a653-utilities\") pod \"certified-operators-clpsh\" (UID: \"d06d91d5-a9b0-459a-ab18-5c7b6d31a653\") " pod="openshift-marketplace/certified-operators-clpsh" Dec 02 19:11:51 crc kubenswrapper[4878]: I1202 19:11:51.027749 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d06d91d5-a9b0-459a-ab18-5c7b6d31a653-catalog-content\") pod \"certified-operators-clpsh\" (UID: \"d06d91d5-a9b0-459a-ab18-5c7b6d31a653\") " pod="openshift-marketplace/certified-operators-clpsh" Dec 02 19:11:51 crc kubenswrapper[4878]: I1202 19:11:51.027832 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-672wv\" (UniqueName: \"kubernetes.io/projected/d06d91d5-a9b0-459a-ab18-5c7b6d31a653-kube-api-access-672wv\") pod \"certified-operators-clpsh\" (UID: \"d06d91d5-a9b0-459a-ab18-5c7b6d31a653\") " pod="openshift-marketplace/certified-operators-clpsh" Dec 02 19:11:51 crc kubenswrapper[4878]: I1202 19:11:51.058352 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-672wv\" (UniqueName: \"kubernetes.io/projected/d06d91d5-a9b0-459a-ab18-5c7b6d31a653-kube-api-access-672wv\") pod \"certified-operators-clpsh\" (UID: \"d06d91d5-a9b0-459a-ab18-5c7b6d31a653\") " pod="openshift-marketplace/certified-operators-clpsh" Dec 02 19:11:51 crc kubenswrapper[4878]: I1202 19:11:51.149466 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-clpsh" Dec 02 19:11:51 crc kubenswrapper[4878]: I1202 19:11:51.704797 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-clpsh"] Dec 02 19:11:51 crc kubenswrapper[4878]: I1202 19:11:51.908068 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-clpsh" event={"ID":"d06d91d5-a9b0-459a-ab18-5c7b6d31a653","Type":"ContainerStarted","Data":"2aab1c1d4db0fd8e9a28f95037e14b62c675200a451f8a291fe4fd95030a1a2e"} Dec 02 19:11:52 crc kubenswrapper[4878]: I1202 19:11:52.923202 4878 generic.go:334] "Generic (PLEG): container finished" podID="d06d91d5-a9b0-459a-ab18-5c7b6d31a653" containerID="75438fbcaff999222cd41d8adefbe1aee9b4ddcb0716d32633327a1a0726f658" exitCode=0 Dec 02 19:11:52 crc kubenswrapper[4878]: I1202 19:11:52.923390 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-clpsh" event={"ID":"d06d91d5-a9b0-459a-ab18-5c7b6d31a653","Type":"ContainerDied","Data":"75438fbcaff999222cd41d8adefbe1aee9b4ddcb0716d32633327a1a0726f658"} Dec 02 19:11:52 crc kubenswrapper[4878]: I1202 19:11:52.926439 4878 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 19:11:53 crc kubenswrapper[4878]: I1202 19:11:53.935288 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-clpsh" event={"ID":"d06d91d5-a9b0-459a-ab18-5c7b6d31a653","Type":"ContainerStarted","Data":"ee506eac2cb13980544d43bc9ec872d2091f95a9df89d5347f5f76434f000f80"} Dec 02 19:11:54 crc kubenswrapper[4878]: I1202 19:11:54.954543 4878 generic.go:334] "Generic (PLEG): container finished" podID="d06d91d5-a9b0-459a-ab18-5c7b6d31a653" containerID="ee506eac2cb13980544d43bc9ec872d2091f95a9df89d5347f5f76434f000f80" exitCode=0 Dec 02 19:11:54 crc kubenswrapper[4878]: I1202 19:11:54.957574 4878 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-clpsh" event={"ID":"d06d91d5-a9b0-459a-ab18-5c7b6d31a653","Type":"ContainerDied","Data":"ee506eac2cb13980544d43bc9ec872d2091f95a9df89d5347f5f76434f000f80"} Dec 02 19:11:55 crc kubenswrapper[4878]: I1202 19:11:55.967279 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-clpsh" event={"ID":"d06d91d5-a9b0-459a-ab18-5c7b6d31a653","Type":"ContainerStarted","Data":"3e733ca48bf613d014cf1eacb0100fb2533a89d0ed15f1a93323c27249f405e3"} Dec 02 19:11:56 crc kubenswrapper[4878]: I1202 19:11:56.005259 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-clpsh" podStartSLOduration=3.598802531 podStartE2EDuration="6.005228325s" podCreationTimestamp="2025-12-02 19:11:50 +0000 UTC" firstStartedPulling="2025-12-02 19:11:52.926154436 +0000 UTC m=+3422.615773317" lastFinishedPulling="2025-12-02 19:11:55.33258021 +0000 UTC m=+3425.022199111" observedRunningTime="2025-12-02 19:11:55.993772568 +0000 UTC m=+3425.683391469" watchObservedRunningTime="2025-12-02 19:11:56.005228325 +0000 UTC m=+3425.694847206" Dec 02 19:12:01 crc kubenswrapper[4878]: I1202 19:12:01.150814 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-clpsh" Dec 02 19:12:01 crc kubenswrapper[4878]: I1202 19:12:01.151458 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-clpsh" Dec 02 19:12:01 crc kubenswrapper[4878]: I1202 19:12:01.214797 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-clpsh" Dec 02 19:12:02 crc kubenswrapper[4878]: I1202 19:12:02.108224 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-clpsh" Dec 02 19:12:02 crc kubenswrapper[4878]: I1202 19:12:02.187208 
4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-clpsh"] Dec 02 19:12:03 crc kubenswrapper[4878]: I1202 19:12:03.937853 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:12:04 crc kubenswrapper[4878]: I1202 19:12:04.060752 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-clpsh" podUID="d06d91d5-a9b0-459a-ab18-5c7b6d31a653" containerName="registry-server" containerID="cri-o://3e733ca48bf613d014cf1eacb0100fb2533a89d0ed15f1a93323c27249f405e3" gracePeriod=2 Dec 02 19:12:04 crc kubenswrapper[4878]: I1202 19:12:04.803612 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-clpsh" Dec 02 19:12:04 crc kubenswrapper[4878]: I1202 19:12:04.883654 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-672wv\" (UniqueName: \"kubernetes.io/projected/d06d91d5-a9b0-459a-ab18-5c7b6d31a653-kube-api-access-672wv\") pod \"d06d91d5-a9b0-459a-ab18-5c7b6d31a653\" (UID: \"d06d91d5-a9b0-459a-ab18-5c7b6d31a653\") " Dec 02 19:12:04 crc kubenswrapper[4878]: I1202 19:12:04.883757 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d06d91d5-a9b0-459a-ab18-5c7b6d31a653-catalog-content\") pod \"d06d91d5-a9b0-459a-ab18-5c7b6d31a653\" (UID: \"d06d91d5-a9b0-459a-ab18-5c7b6d31a653\") " Dec 02 19:12:04 crc kubenswrapper[4878]: I1202 19:12:04.883968 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d06d91d5-a9b0-459a-ab18-5c7b6d31a653-utilities\") pod \"d06d91d5-a9b0-459a-ab18-5c7b6d31a653\" (UID: \"d06d91d5-a9b0-459a-ab18-5c7b6d31a653\") " Dec 02 19:12:04 crc kubenswrapper[4878]: I1202 19:12:04.885383 4878 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d06d91d5-a9b0-459a-ab18-5c7b6d31a653-utilities" (OuterVolumeSpecName: "utilities") pod "d06d91d5-a9b0-459a-ab18-5c7b6d31a653" (UID: "d06d91d5-a9b0-459a-ab18-5c7b6d31a653"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:12:04 crc kubenswrapper[4878]: I1202 19:12:04.912258 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d06d91d5-a9b0-459a-ab18-5c7b6d31a653-kube-api-access-672wv" (OuterVolumeSpecName: "kube-api-access-672wv") pod "d06d91d5-a9b0-459a-ab18-5c7b6d31a653" (UID: "d06d91d5-a9b0-459a-ab18-5c7b6d31a653"). InnerVolumeSpecName "kube-api-access-672wv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:12:04 crc kubenswrapper[4878]: I1202 19:12:04.953682 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d06d91d5-a9b0-459a-ab18-5c7b6d31a653-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d06d91d5-a9b0-459a-ab18-5c7b6d31a653" (UID: "d06d91d5-a9b0-459a-ab18-5c7b6d31a653"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:12:04 crc kubenswrapper[4878]: I1202 19:12:04.988349 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-672wv\" (UniqueName: \"kubernetes.io/projected/d06d91d5-a9b0-459a-ab18-5c7b6d31a653-kube-api-access-672wv\") on node \"crc\" DevicePath \"\"" Dec 02 19:12:04 crc kubenswrapper[4878]: I1202 19:12:04.988382 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d06d91d5-a9b0-459a-ab18-5c7b6d31a653-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:12:04 crc kubenswrapper[4878]: I1202 19:12:04.988398 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d06d91d5-a9b0-459a-ab18-5c7b6d31a653-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:12:05 crc kubenswrapper[4878]: I1202 19:12:05.074820 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"5c5532413b17b93c2ef0874874fdac15f70b0bb5edef53e3a26e65d974a4719c"} Dec 02 19:12:05 crc kubenswrapper[4878]: I1202 19:12:05.080413 4878 generic.go:334] "Generic (PLEG): container finished" podID="d06d91d5-a9b0-459a-ab18-5c7b6d31a653" containerID="3e733ca48bf613d014cf1eacb0100fb2533a89d0ed15f1a93323c27249f405e3" exitCode=0 Dec 02 19:12:05 crc kubenswrapper[4878]: I1202 19:12:05.080461 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-clpsh" event={"ID":"d06d91d5-a9b0-459a-ab18-5c7b6d31a653","Type":"ContainerDied","Data":"3e733ca48bf613d014cf1eacb0100fb2533a89d0ed15f1a93323c27249f405e3"} Dec 02 19:12:05 crc kubenswrapper[4878]: I1202 19:12:05.080474 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-clpsh" Dec 02 19:12:05 crc kubenswrapper[4878]: I1202 19:12:05.080500 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-clpsh" event={"ID":"d06d91d5-a9b0-459a-ab18-5c7b6d31a653","Type":"ContainerDied","Data":"2aab1c1d4db0fd8e9a28f95037e14b62c675200a451f8a291fe4fd95030a1a2e"} Dec 02 19:12:05 crc kubenswrapper[4878]: I1202 19:12:05.080520 4878 scope.go:117] "RemoveContainer" containerID="3e733ca48bf613d014cf1eacb0100fb2533a89d0ed15f1a93323c27249f405e3" Dec 02 19:12:05 crc kubenswrapper[4878]: I1202 19:12:05.144685 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-clpsh"] Dec 02 19:12:05 crc kubenswrapper[4878]: I1202 19:12:05.161640 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-clpsh"] Dec 02 19:12:05 crc kubenswrapper[4878]: I1202 19:12:05.171347 4878 scope.go:117] "RemoveContainer" containerID="ee506eac2cb13980544d43bc9ec872d2091f95a9df89d5347f5f76434f000f80" Dec 02 19:12:05 crc kubenswrapper[4878]: I1202 19:12:05.231590 4878 scope.go:117] "RemoveContainer" containerID="75438fbcaff999222cd41d8adefbe1aee9b4ddcb0716d32633327a1a0726f658" Dec 02 19:12:05 crc kubenswrapper[4878]: I1202 19:12:05.263141 4878 scope.go:117] "RemoveContainer" containerID="3e733ca48bf613d014cf1eacb0100fb2533a89d0ed15f1a93323c27249f405e3" Dec 02 19:12:05 crc kubenswrapper[4878]: E1202 19:12:05.263629 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e733ca48bf613d014cf1eacb0100fb2533a89d0ed15f1a93323c27249f405e3\": container with ID starting with 3e733ca48bf613d014cf1eacb0100fb2533a89d0ed15f1a93323c27249f405e3 not found: ID does not exist" containerID="3e733ca48bf613d014cf1eacb0100fb2533a89d0ed15f1a93323c27249f405e3" Dec 02 19:12:05 crc kubenswrapper[4878]: I1202 19:12:05.263657 4878 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e733ca48bf613d014cf1eacb0100fb2533a89d0ed15f1a93323c27249f405e3"} err="failed to get container status \"3e733ca48bf613d014cf1eacb0100fb2533a89d0ed15f1a93323c27249f405e3\": rpc error: code = NotFound desc = could not find container \"3e733ca48bf613d014cf1eacb0100fb2533a89d0ed15f1a93323c27249f405e3\": container with ID starting with 3e733ca48bf613d014cf1eacb0100fb2533a89d0ed15f1a93323c27249f405e3 not found: ID does not exist" Dec 02 19:12:05 crc kubenswrapper[4878]: I1202 19:12:05.263676 4878 scope.go:117] "RemoveContainer" containerID="ee506eac2cb13980544d43bc9ec872d2091f95a9df89d5347f5f76434f000f80" Dec 02 19:12:05 crc kubenswrapper[4878]: E1202 19:12:05.264094 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee506eac2cb13980544d43bc9ec872d2091f95a9df89d5347f5f76434f000f80\": container with ID starting with ee506eac2cb13980544d43bc9ec872d2091f95a9df89d5347f5f76434f000f80 not found: ID does not exist" containerID="ee506eac2cb13980544d43bc9ec872d2091f95a9df89d5347f5f76434f000f80" Dec 02 19:12:05 crc kubenswrapper[4878]: I1202 19:12:05.264116 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee506eac2cb13980544d43bc9ec872d2091f95a9df89d5347f5f76434f000f80"} err="failed to get container status \"ee506eac2cb13980544d43bc9ec872d2091f95a9df89d5347f5f76434f000f80\": rpc error: code = NotFound desc = could not find container \"ee506eac2cb13980544d43bc9ec872d2091f95a9df89d5347f5f76434f000f80\": container with ID starting with ee506eac2cb13980544d43bc9ec872d2091f95a9df89d5347f5f76434f000f80 not found: ID does not exist" Dec 02 19:12:05 crc kubenswrapper[4878]: I1202 19:12:05.264128 4878 scope.go:117] "RemoveContainer" containerID="75438fbcaff999222cd41d8adefbe1aee9b4ddcb0716d32633327a1a0726f658" Dec 02 19:12:05 crc kubenswrapper[4878]: E1202 
19:12:05.264542 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75438fbcaff999222cd41d8adefbe1aee9b4ddcb0716d32633327a1a0726f658\": container with ID starting with 75438fbcaff999222cd41d8adefbe1aee9b4ddcb0716d32633327a1a0726f658 not found: ID does not exist" containerID="75438fbcaff999222cd41d8adefbe1aee9b4ddcb0716d32633327a1a0726f658" Dec 02 19:12:05 crc kubenswrapper[4878]: I1202 19:12:05.264562 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75438fbcaff999222cd41d8adefbe1aee9b4ddcb0716d32633327a1a0726f658"} err="failed to get container status \"75438fbcaff999222cd41d8adefbe1aee9b4ddcb0716d32633327a1a0726f658\": rpc error: code = NotFound desc = could not find container \"75438fbcaff999222cd41d8adefbe1aee9b4ddcb0716d32633327a1a0726f658\": container with ID starting with 75438fbcaff999222cd41d8adefbe1aee9b4ddcb0716d32633327a1a0726f658 not found: ID does not exist" Dec 02 19:12:06 crc kubenswrapper[4878]: I1202 19:12:06.962582 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d06d91d5-a9b0-459a-ab18-5c7b6d31a653" path="/var/lib/kubelet/pods/d06d91d5-a9b0-459a-ab18-5c7b6d31a653/volumes" Dec 02 19:14:23 crc kubenswrapper[4878]: I1202 19:14:23.742946 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:14:23 crc kubenswrapper[4878]: I1202 19:14:23.743678 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 02 19:14:53 crc kubenswrapper[4878]: I1202 19:14:53.742095 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:14:53 crc kubenswrapper[4878]: I1202 19:14:53.743334 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:15:00 crc kubenswrapper[4878]: I1202 19:15:00.195010 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl"] Dec 02 19:15:00 crc kubenswrapper[4878]: E1202 19:15:00.195865 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06d91d5-a9b0-459a-ab18-5c7b6d31a653" containerName="registry-server" Dec 02 19:15:00 crc kubenswrapper[4878]: I1202 19:15:00.195879 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06d91d5-a9b0-459a-ab18-5c7b6d31a653" containerName="registry-server" Dec 02 19:15:00 crc kubenswrapper[4878]: E1202 19:15:00.195908 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06d91d5-a9b0-459a-ab18-5c7b6d31a653" containerName="extract-utilities" Dec 02 19:15:00 crc kubenswrapper[4878]: I1202 19:15:00.195916 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06d91d5-a9b0-459a-ab18-5c7b6d31a653" containerName="extract-utilities" Dec 02 19:15:00 crc kubenswrapper[4878]: E1202 19:15:00.195972 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06d91d5-a9b0-459a-ab18-5c7b6d31a653" containerName="extract-content" Dec 02 19:15:00 crc kubenswrapper[4878]: I1202 
19:15:00.195978 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06d91d5-a9b0-459a-ab18-5c7b6d31a653" containerName="extract-content" Dec 02 19:15:00 crc kubenswrapper[4878]: I1202 19:15:00.196173 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="d06d91d5-a9b0-459a-ab18-5c7b6d31a653" containerName="registry-server" Dec 02 19:15:00 crc kubenswrapper[4878]: I1202 19:15:00.196992 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl" Dec 02 19:15:00 crc kubenswrapper[4878]: I1202 19:15:00.201108 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 19:15:00 crc kubenswrapper[4878]: I1202 19:15:00.201533 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 19:15:00 crc kubenswrapper[4878]: I1202 19:15:00.227552 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl"] Dec 02 19:15:00 crc kubenswrapper[4878]: I1202 19:15:00.330659 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b65c1eb-b94e-4808-a886-ebf6d4452d04-secret-volume\") pod \"collect-profiles-29411715-799pl\" (UID: \"2b65c1eb-b94e-4808-a886-ebf6d4452d04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl" Dec 02 19:15:00 crc kubenswrapper[4878]: I1202 19:15:00.330756 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8s4q\" (UniqueName: \"kubernetes.io/projected/2b65c1eb-b94e-4808-a886-ebf6d4452d04-kube-api-access-m8s4q\") pod \"collect-profiles-29411715-799pl\" (UID: \"2b65c1eb-b94e-4808-a886-ebf6d4452d04\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl" Dec 02 19:15:00 crc kubenswrapper[4878]: I1202 19:15:00.330789 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b65c1eb-b94e-4808-a886-ebf6d4452d04-config-volume\") pod \"collect-profiles-29411715-799pl\" (UID: \"2b65c1eb-b94e-4808-a886-ebf6d4452d04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl" Dec 02 19:15:00 crc kubenswrapper[4878]: I1202 19:15:00.433475 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8s4q\" (UniqueName: \"kubernetes.io/projected/2b65c1eb-b94e-4808-a886-ebf6d4452d04-kube-api-access-m8s4q\") pod \"collect-profiles-29411715-799pl\" (UID: \"2b65c1eb-b94e-4808-a886-ebf6d4452d04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl" Dec 02 19:15:00 crc kubenswrapper[4878]: I1202 19:15:00.433540 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b65c1eb-b94e-4808-a886-ebf6d4452d04-config-volume\") pod \"collect-profiles-29411715-799pl\" (UID: \"2b65c1eb-b94e-4808-a886-ebf6d4452d04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl" Dec 02 19:15:00 crc kubenswrapper[4878]: I1202 19:15:00.433705 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b65c1eb-b94e-4808-a886-ebf6d4452d04-secret-volume\") pod \"collect-profiles-29411715-799pl\" (UID: \"2b65c1eb-b94e-4808-a886-ebf6d4452d04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl" Dec 02 19:15:00 crc kubenswrapper[4878]: I1202 19:15:00.434724 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/2b65c1eb-b94e-4808-a886-ebf6d4452d04-config-volume\") pod \"collect-profiles-29411715-799pl\" (UID: \"2b65c1eb-b94e-4808-a886-ebf6d4452d04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl" Dec 02 19:15:00 crc kubenswrapper[4878]: I1202 19:15:00.441449 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b65c1eb-b94e-4808-a886-ebf6d4452d04-secret-volume\") pod \"collect-profiles-29411715-799pl\" (UID: \"2b65c1eb-b94e-4808-a886-ebf6d4452d04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl" Dec 02 19:15:00 crc kubenswrapper[4878]: I1202 19:15:00.453018 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8s4q\" (UniqueName: \"kubernetes.io/projected/2b65c1eb-b94e-4808-a886-ebf6d4452d04-kube-api-access-m8s4q\") pod \"collect-profiles-29411715-799pl\" (UID: \"2b65c1eb-b94e-4808-a886-ebf6d4452d04\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl" Dec 02 19:15:00 crc kubenswrapper[4878]: I1202 19:15:00.524567 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl" Dec 02 19:15:01 crc kubenswrapper[4878]: W1202 19:15:01.002923 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b65c1eb_b94e_4808_a886_ebf6d4452d04.slice/crio-1aaf96b314a4b623d0f7c1bd8d129471b879666f56aef1bdbd81b1ac8957deae WatchSource:0}: Error finding container 1aaf96b314a4b623d0f7c1bd8d129471b879666f56aef1bdbd81b1ac8957deae: Status 404 returned error can't find the container with id 1aaf96b314a4b623d0f7c1bd8d129471b879666f56aef1bdbd81b1ac8957deae Dec 02 19:15:01 crc kubenswrapper[4878]: I1202 19:15:01.005170 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl"] Dec 02 19:15:01 crc kubenswrapper[4878]: I1202 19:15:01.401841 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl" event={"ID":"2b65c1eb-b94e-4808-a886-ebf6d4452d04","Type":"ContainerStarted","Data":"f2b5a0d9b5a0464eef32d58a5307e1853a054205d673e8a5780d5e07f056529d"} Dec 02 19:15:01 crc kubenswrapper[4878]: I1202 19:15:01.401906 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl" event={"ID":"2b65c1eb-b94e-4808-a886-ebf6d4452d04","Type":"ContainerStarted","Data":"1aaf96b314a4b623d0f7c1bd8d129471b879666f56aef1bdbd81b1ac8957deae"} Dec 02 19:15:01 crc kubenswrapper[4878]: I1202 19:15:01.423838 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl" podStartSLOduration=1.423817863 podStartE2EDuration="1.423817863s" podCreationTimestamp="2025-12-02 19:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 
19:15:01.419948662 +0000 UTC m=+3611.109567553" watchObservedRunningTime="2025-12-02 19:15:01.423817863 +0000 UTC m=+3611.113436744" Dec 02 19:15:02 crc kubenswrapper[4878]: I1202 19:15:02.410907 4878 generic.go:334] "Generic (PLEG): container finished" podID="2b65c1eb-b94e-4808-a886-ebf6d4452d04" containerID="f2b5a0d9b5a0464eef32d58a5307e1853a054205d673e8a5780d5e07f056529d" exitCode=0 Dec 02 19:15:02 crc kubenswrapper[4878]: I1202 19:15:02.411052 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl" event={"ID":"2b65c1eb-b94e-4808-a886-ebf6d4452d04","Type":"ContainerDied","Data":"f2b5a0d9b5a0464eef32d58a5307e1853a054205d673e8a5780d5e07f056529d"} Dec 02 19:15:03 crc kubenswrapper[4878]: I1202 19:15:03.871993 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl" Dec 02 19:15:03 crc kubenswrapper[4878]: I1202 19:15:03.974580 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8s4q\" (UniqueName: \"kubernetes.io/projected/2b65c1eb-b94e-4808-a886-ebf6d4452d04-kube-api-access-m8s4q\") pod \"2b65c1eb-b94e-4808-a886-ebf6d4452d04\" (UID: \"2b65c1eb-b94e-4808-a886-ebf6d4452d04\") " Dec 02 19:15:03 crc kubenswrapper[4878]: I1202 19:15:03.974648 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b65c1eb-b94e-4808-a886-ebf6d4452d04-config-volume\") pod \"2b65c1eb-b94e-4808-a886-ebf6d4452d04\" (UID: \"2b65c1eb-b94e-4808-a886-ebf6d4452d04\") " Dec 02 19:15:03 crc kubenswrapper[4878]: I1202 19:15:03.974818 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b65c1eb-b94e-4808-a886-ebf6d4452d04-secret-volume\") pod \"2b65c1eb-b94e-4808-a886-ebf6d4452d04\" (UID: 
\"2b65c1eb-b94e-4808-a886-ebf6d4452d04\") " Dec 02 19:15:03 crc kubenswrapper[4878]: I1202 19:15:03.978774 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b65c1eb-b94e-4808-a886-ebf6d4452d04-config-volume" (OuterVolumeSpecName: "config-volume") pod "2b65c1eb-b94e-4808-a886-ebf6d4452d04" (UID: "2b65c1eb-b94e-4808-a886-ebf6d4452d04"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:15:03 crc kubenswrapper[4878]: I1202 19:15:03.983138 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b65c1eb-b94e-4808-a886-ebf6d4452d04-kube-api-access-m8s4q" (OuterVolumeSpecName: "kube-api-access-m8s4q") pod "2b65c1eb-b94e-4808-a886-ebf6d4452d04" (UID: "2b65c1eb-b94e-4808-a886-ebf6d4452d04"). InnerVolumeSpecName "kube-api-access-m8s4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:15:03 crc kubenswrapper[4878]: I1202 19:15:03.983607 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b65c1eb-b94e-4808-a886-ebf6d4452d04-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2b65c1eb-b94e-4808-a886-ebf6d4452d04" (UID: "2b65c1eb-b94e-4808-a886-ebf6d4452d04"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:15:04 crc kubenswrapper[4878]: I1202 19:15:04.078867 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8s4q\" (UniqueName: \"kubernetes.io/projected/2b65c1eb-b94e-4808-a886-ebf6d4452d04-kube-api-access-m8s4q\") on node \"crc\" DevicePath \"\"" Dec 02 19:15:04 crc kubenswrapper[4878]: I1202 19:15:04.078919 4878 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b65c1eb-b94e-4808-a886-ebf6d4452d04-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 19:15:04 crc kubenswrapper[4878]: I1202 19:15:04.078933 4878 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b65c1eb-b94e-4808-a886-ebf6d4452d04-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 19:15:04 crc kubenswrapper[4878]: I1202 19:15:04.434642 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl" event={"ID":"2b65c1eb-b94e-4808-a886-ebf6d4452d04","Type":"ContainerDied","Data":"1aaf96b314a4b623d0f7c1bd8d129471b879666f56aef1bdbd81b1ac8957deae"} Dec 02 19:15:04 crc kubenswrapper[4878]: I1202 19:15:04.434681 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1aaf96b314a4b623d0f7c1bd8d129471b879666f56aef1bdbd81b1ac8957deae" Dec 02 19:15:04 crc kubenswrapper[4878]: I1202 19:15:04.434720 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl" Dec 02 19:15:04 crc kubenswrapper[4878]: I1202 19:15:04.507611 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg"] Dec 02 19:15:04 crc kubenswrapper[4878]: I1202 19:15:04.522134 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411670-2bvmg"] Dec 02 19:15:04 crc kubenswrapper[4878]: I1202 19:15:04.966080 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06cccc00-e8b4-4afd-9a11-17c4ad2b3a72" path="/var/lib/kubelet/pods/06cccc00-e8b4-4afd-9a11-17c4ad2b3a72/volumes" Dec 02 19:15:23 crc kubenswrapper[4878]: I1202 19:15:23.742766 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:15:23 crc kubenswrapper[4878]: I1202 19:15:23.744274 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:15:23 crc kubenswrapper[4878]: I1202 19:15:23.744327 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 19:15:23 crc kubenswrapper[4878]: I1202 19:15:23.745655 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c5532413b17b93c2ef0874874fdac15f70b0bb5edef53e3a26e65d974a4719c"} 
pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 19:15:23 crc kubenswrapper[4878]: I1202 19:15:23.745723 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://5c5532413b17b93c2ef0874874fdac15f70b0bb5edef53e3a26e65d974a4719c" gracePeriod=600 Dec 02 19:15:24 crc kubenswrapper[4878]: I1202 19:15:24.765647 4878 generic.go:334] "Generic (PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" containerID="5c5532413b17b93c2ef0874874fdac15f70b0bb5edef53e3a26e65d974a4719c" exitCode=0 Dec 02 19:15:24 crc kubenswrapper[4878]: I1202 19:15:24.765727 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"5c5532413b17b93c2ef0874874fdac15f70b0bb5edef53e3a26e65d974a4719c"} Dec 02 19:15:24 crc kubenswrapper[4878]: I1202 19:15:24.766332 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc"} Dec 02 19:15:24 crc kubenswrapper[4878]: I1202 19:15:24.766354 4878 scope.go:117] "RemoveContainer" containerID="2c1d6775ced63b143fe053972111afa3f0a0304e51da8744f39480de69f166b9" Dec 02 19:16:04 crc kubenswrapper[4878]: I1202 19:16:04.883504 4878 scope.go:117] "RemoveContainer" containerID="ce97aa90ac3b941a39d3d0acd25361726db6928174291f6db5987db6446e59e8" Dec 02 19:17:53 crc kubenswrapper[4878]: I1202 19:17:53.742128 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:17:53 crc kubenswrapper[4878]: I1202 19:17:53.742824 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:18:23 crc kubenswrapper[4878]: I1202 19:18:23.742774 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:18:23 crc kubenswrapper[4878]: I1202 19:18:23.743566 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:18:39 crc kubenswrapper[4878]: I1202 19:18:39.064889 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ptcmv"] Dec 02 19:18:39 crc kubenswrapper[4878]: E1202 19:18:39.066123 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b65c1eb-b94e-4808-a886-ebf6d4452d04" containerName="collect-profiles" Dec 02 19:18:39 crc kubenswrapper[4878]: I1202 19:18:39.066140 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b65c1eb-b94e-4808-a886-ebf6d4452d04" containerName="collect-profiles" Dec 02 19:18:39 crc kubenswrapper[4878]: I1202 19:18:39.066433 4878 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2b65c1eb-b94e-4808-a886-ebf6d4452d04" containerName="collect-profiles" Dec 02 19:18:39 crc kubenswrapper[4878]: I1202 19:18:39.068624 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ptcmv" Dec 02 19:18:39 crc kubenswrapper[4878]: I1202 19:18:39.076581 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ptcmv"] Dec 02 19:18:39 crc kubenswrapper[4878]: I1202 19:18:39.192532 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9efe9152-e844-43d7-abe8-2ba9edb42f8b-utilities\") pod \"community-operators-ptcmv\" (UID: \"9efe9152-e844-43d7-abe8-2ba9edb42f8b\") " pod="openshift-marketplace/community-operators-ptcmv" Dec 02 19:18:39 crc kubenswrapper[4878]: I1202 19:18:39.192857 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6k9n\" (UniqueName: \"kubernetes.io/projected/9efe9152-e844-43d7-abe8-2ba9edb42f8b-kube-api-access-v6k9n\") pod \"community-operators-ptcmv\" (UID: \"9efe9152-e844-43d7-abe8-2ba9edb42f8b\") " pod="openshift-marketplace/community-operators-ptcmv" Dec 02 19:18:39 crc kubenswrapper[4878]: I1202 19:18:39.193115 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9efe9152-e844-43d7-abe8-2ba9edb42f8b-catalog-content\") pod \"community-operators-ptcmv\" (UID: \"9efe9152-e844-43d7-abe8-2ba9edb42f8b\") " pod="openshift-marketplace/community-operators-ptcmv" Dec 02 19:18:39 crc kubenswrapper[4878]: I1202 19:18:39.295329 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9efe9152-e844-43d7-abe8-2ba9edb42f8b-utilities\") 
pod \"community-operators-ptcmv\" (UID: \"9efe9152-e844-43d7-abe8-2ba9edb42f8b\") " pod="openshift-marketplace/community-operators-ptcmv" Dec 02 19:18:39 crc kubenswrapper[4878]: I1202 19:18:39.295395 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6k9n\" (UniqueName: \"kubernetes.io/projected/9efe9152-e844-43d7-abe8-2ba9edb42f8b-kube-api-access-v6k9n\") pod \"community-operators-ptcmv\" (UID: \"9efe9152-e844-43d7-abe8-2ba9edb42f8b\") " pod="openshift-marketplace/community-operators-ptcmv" Dec 02 19:18:39 crc kubenswrapper[4878]: I1202 19:18:39.295498 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9efe9152-e844-43d7-abe8-2ba9edb42f8b-catalog-content\") pod \"community-operators-ptcmv\" (UID: \"9efe9152-e844-43d7-abe8-2ba9edb42f8b\") " pod="openshift-marketplace/community-operators-ptcmv" Dec 02 19:18:39 crc kubenswrapper[4878]: I1202 19:18:39.295949 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9efe9152-e844-43d7-abe8-2ba9edb42f8b-utilities\") pod \"community-operators-ptcmv\" (UID: \"9efe9152-e844-43d7-abe8-2ba9edb42f8b\") " pod="openshift-marketplace/community-operators-ptcmv" Dec 02 19:18:39 crc kubenswrapper[4878]: I1202 19:18:39.296012 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9efe9152-e844-43d7-abe8-2ba9edb42f8b-catalog-content\") pod \"community-operators-ptcmv\" (UID: \"9efe9152-e844-43d7-abe8-2ba9edb42f8b\") " pod="openshift-marketplace/community-operators-ptcmv" Dec 02 19:18:39 crc kubenswrapper[4878]: I1202 19:18:39.321039 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6k9n\" (UniqueName: \"kubernetes.io/projected/9efe9152-e844-43d7-abe8-2ba9edb42f8b-kube-api-access-v6k9n\") pod 
\"community-operators-ptcmv\" (UID: \"9efe9152-e844-43d7-abe8-2ba9edb42f8b\") " pod="openshift-marketplace/community-operators-ptcmv" Dec 02 19:18:39 crc kubenswrapper[4878]: I1202 19:18:39.389605 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ptcmv" Dec 02 19:18:39 crc kubenswrapper[4878]: I1202 19:18:39.992653 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ptcmv"] Dec 02 19:18:40 crc kubenswrapper[4878]: I1202 19:18:40.222360 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ptcmv" event={"ID":"9efe9152-e844-43d7-abe8-2ba9edb42f8b","Type":"ContainerStarted","Data":"f45d75e7c8fb69d1e4b1f33ffead2b7099c5b101706da35308ba9409b03a33c2"} Dec 02 19:18:41 crc kubenswrapper[4878]: I1202 19:18:41.237812 4878 generic.go:334] "Generic (PLEG): container finished" podID="9efe9152-e844-43d7-abe8-2ba9edb42f8b" containerID="80ae607369de9d65f725103a3b4af0636874269f434ff8acde6fac53531ba57e" exitCode=0 Dec 02 19:18:41 crc kubenswrapper[4878]: I1202 19:18:41.238005 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ptcmv" event={"ID":"9efe9152-e844-43d7-abe8-2ba9edb42f8b","Type":"ContainerDied","Data":"80ae607369de9d65f725103a3b4af0636874269f434ff8acde6fac53531ba57e"} Dec 02 19:18:41 crc kubenswrapper[4878]: I1202 19:18:41.243321 4878 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 19:18:41 crc kubenswrapper[4878]: E1202 19:18:41.291974 4878 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.159:33348->38.102.83.159:41745: read tcp 38.102.83.159:33348->38.102.83.159:41745: read: connection reset by peer Dec 02 19:18:43 crc kubenswrapper[4878]: I1202 19:18:43.260648 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-ptcmv" event={"ID":"9efe9152-e844-43d7-abe8-2ba9edb42f8b","Type":"ContainerStarted","Data":"0c8c8ff599a27518f0d67eda762f31ad532eefe00151bd10fe0196d1c52c2ce2"} Dec 02 19:18:44 crc kubenswrapper[4878]: I1202 19:18:44.273205 4878 generic.go:334] "Generic (PLEG): container finished" podID="9efe9152-e844-43d7-abe8-2ba9edb42f8b" containerID="0c8c8ff599a27518f0d67eda762f31ad532eefe00151bd10fe0196d1c52c2ce2" exitCode=0 Dec 02 19:18:44 crc kubenswrapper[4878]: I1202 19:18:44.273284 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ptcmv" event={"ID":"9efe9152-e844-43d7-abe8-2ba9edb42f8b","Type":"ContainerDied","Data":"0c8c8ff599a27518f0d67eda762f31ad532eefe00151bd10fe0196d1c52c2ce2"} Dec 02 19:18:45 crc kubenswrapper[4878]: I1202 19:18:45.291913 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ptcmv" event={"ID":"9efe9152-e844-43d7-abe8-2ba9edb42f8b","Type":"ContainerStarted","Data":"0e75d4da8b0101aea8af2b7123c2b65ea7e7d41b00a082e1d1ac32487d36fcd9"} Dec 02 19:18:45 crc kubenswrapper[4878]: I1202 19:18:45.314522 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ptcmv" podStartSLOduration=2.837718263 podStartE2EDuration="6.314502167s" podCreationTimestamp="2025-12-02 19:18:39 +0000 UTC" firstStartedPulling="2025-12-02 19:18:41.243079266 +0000 UTC m=+3830.932698137" lastFinishedPulling="2025-12-02 19:18:44.71986316 +0000 UTC m=+3834.409482041" observedRunningTime="2025-12-02 19:18:45.310931187 +0000 UTC m=+3835.000550068" watchObservedRunningTime="2025-12-02 19:18:45.314502167 +0000 UTC m=+3835.004121048" Dec 02 19:18:49 crc kubenswrapper[4878]: I1202 19:18:49.390996 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ptcmv" Dec 02 19:18:49 crc kubenswrapper[4878]: I1202 19:18:49.391544 
4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ptcmv" Dec 02 19:18:49 crc kubenswrapper[4878]: I1202 19:18:49.447399 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ptcmv" Dec 02 19:18:50 crc kubenswrapper[4878]: I1202 19:18:50.408259 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ptcmv" Dec 02 19:18:50 crc kubenswrapper[4878]: I1202 19:18:50.478587 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ptcmv"] Dec 02 19:18:52 crc kubenswrapper[4878]: I1202 19:18:52.368767 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ptcmv" podUID="9efe9152-e844-43d7-abe8-2ba9edb42f8b" containerName="registry-server" containerID="cri-o://0e75d4da8b0101aea8af2b7123c2b65ea7e7d41b00a082e1d1ac32487d36fcd9" gracePeriod=2 Dec 02 19:18:52 crc kubenswrapper[4878]: I1202 19:18:52.998854 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ptcmv" Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.078088 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9efe9152-e844-43d7-abe8-2ba9edb42f8b-utilities\") pod \"9efe9152-e844-43d7-abe8-2ba9edb42f8b\" (UID: \"9efe9152-e844-43d7-abe8-2ba9edb42f8b\") " Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.078184 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6k9n\" (UniqueName: \"kubernetes.io/projected/9efe9152-e844-43d7-abe8-2ba9edb42f8b-kube-api-access-v6k9n\") pod \"9efe9152-e844-43d7-abe8-2ba9edb42f8b\" (UID: \"9efe9152-e844-43d7-abe8-2ba9edb42f8b\") " Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.078455 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9efe9152-e844-43d7-abe8-2ba9edb42f8b-catalog-content\") pod \"9efe9152-e844-43d7-abe8-2ba9edb42f8b\" (UID: \"9efe9152-e844-43d7-abe8-2ba9edb42f8b\") " Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.079445 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9efe9152-e844-43d7-abe8-2ba9edb42f8b-utilities" (OuterVolumeSpecName: "utilities") pod "9efe9152-e844-43d7-abe8-2ba9edb42f8b" (UID: "9efe9152-e844-43d7-abe8-2ba9edb42f8b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.079935 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9efe9152-e844-43d7-abe8-2ba9edb42f8b-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.087644 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9efe9152-e844-43d7-abe8-2ba9edb42f8b-kube-api-access-v6k9n" (OuterVolumeSpecName: "kube-api-access-v6k9n") pod "9efe9152-e844-43d7-abe8-2ba9edb42f8b" (UID: "9efe9152-e844-43d7-abe8-2ba9edb42f8b"). InnerVolumeSpecName "kube-api-access-v6k9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.142495 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9efe9152-e844-43d7-abe8-2ba9edb42f8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9efe9152-e844-43d7-abe8-2ba9edb42f8b" (UID: "9efe9152-e844-43d7-abe8-2ba9edb42f8b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.183188 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6k9n\" (UniqueName: \"kubernetes.io/projected/9efe9152-e844-43d7-abe8-2ba9edb42f8b-kube-api-access-v6k9n\") on node \"crc\" DevicePath \"\"" Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.183274 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9efe9152-e844-43d7-abe8-2ba9edb42f8b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.388264 4878 generic.go:334] "Generic (PLEG): container finished" podID="9efe9152-e844-43d7-abe8-2ba9edb42f8b" containerID="0e75d4da8b0101aea8af2b7123c2b65ea7e7d41b00a082e1d1ac32487d36fcd9" exitCode=0 Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.388387 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ptcmv" Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.388363 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ptcmv" event={"ID":"9efe9152-e844-43d7-abe8-2ba9edb42f8b","Type":"ContainerDied","Data":"0e75d4da8b0101aea8af2b7123c2b65ea7e7d41b00a082e1d1ac32487d36fcd9"} Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.391119 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ptcmv" event={"ID":"9efe9152-e844-43d7-abe8-2ba9edb42f8b","Type":"ContainerDied","Data":"f45d75e7c8fb69d1e4b1f33ffead2b7099c5b101706da35308ba9409b03a33c2"} Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.391374 4878 scope.go:117] "RemoveContainer" containerID="0e75d4da8b0101aea8af2b7123c2b65ea7e7d41b00a082e1d1ac32487d36fcd9" Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.450062 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-ptcmv"] Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.452202 4878 scope.go:117] "RemoveContainer" containerID="0c8c8ff599a27518f0d67eda762f31ad532eefe00151bd10fe0196d1c52c2ce2" Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.467219 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ptcmv"] Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.487334 4878 scope.go:117] "RemoveContainer" containerID="80ae607369de9d65f725103a3b4af0636874269f434ff8acde6fac53531ba57e" Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.560165 4878 scope.go:117] "RemoveContainer" containerID="0e75d4da8b0101aea8af2b7123c2b65ea7e7d41b00a082e1d1ac32487d36fcd9" Dec 02 19:18:53 crc kubenswrapper[4878]: E1202 19:18:53.560751 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e75d4da8b0101aea8af2b7123c2b65ea7e7d41b00a082e1d1ac32487d36fcd9\": container with ID starting with 0e75d4da8b0101aea8af2b7123c2b65ea7e7d41b00a082e1d1ac32487d36fcd9 not found: ID does not exist" containerID="0e75d4da8b0101aea8af2b7123c2b65ea7e7d41b00a082e1d1ac32487d36fcd9" Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.560819 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e75d4da8b0101aea8af2b7123c2b65ea7e7d41b00a082e1d1ac32487d36fcd9"} err="failed to get container status \"0e75d4da8b0101aea8af2b7123c2b65ea7e7d41b00a082e1d1ac32487d36fcd9\": rpc error: code = NotFound desc = could not find container \"0e75d4da8b0101aea8af2b7123c2b65ea7e7d41b00a082e1d1ac32487d36fcd9\": container with ID starting with 0e75d4da8b0101aea8af2b7123c2b65ea7e7d41b00a082e1d1ac32487d36fcd9 not found: ID does not exist" Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.560859 4878 scope.go:117] "RemoveContainer" 
containerID="0c8c8ff599a27518f0d67eda762f31ad532eefe00151bd10fe0196d1c52c2ce2" Dec 02 19:18:53 crc kubenswrapper[4878]: E1202 19:18:53.561425 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c8c8ff599a27518f0d67eda762f31ad532eefe00151bd10fe0196d1c52c2ce2\": container with ID starting with 0c8c8ff599a27518f0d67eda762f31ad532eefe00151bd10fe0196d1c52c2ce2 not found: ID does not exist" containerID="0c8c8ff599a27518f0d67eda762f31ad532eefe00151bd10fe0196d1c52c2ce2" Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.561477 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c8c8ff599a27518f0d67eda762f31ad532eefe00151bd10fe0196d1c52c2ce2"} err="failed to get container status \"0c8c8ff599a27518f0d67eda762f31ad532eefe00151bd10fe0196d1c52c2ce2\": rpc error: code = NotFound desc = could not find container \"0c8c8ff599a27518f0d67eda762f31ad532eefe00151bd10fe0196d1c52c2ce2\": container with ID starting with 0c8c8ff599a27518f0d67eda762f31ad532eefe00151bd10fe0196d1c52c2ce2 not found: ID does not exist" Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.561499 4878 scope.go:117] "RemoveContainer" containerID="80ae607369de9d65f725103a3b4af0636874269f434ff8acde6fac53531ba57e" Dec 02 19:18:53 crc kubenswrapper[4878]: E1202 19:18:53.562170 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80ae607369de9d65f725103a3b4af0636874269f434ff8acde6fac53531ba57e\": container with ID starting with 80ae607369de9d65f725103a3b4af0636874269f434ff8acde6fac53531ba57e not found: ID does not exist" containerID="80ae607369de9d65f725103a3b4af0636874269f434ff8acde6fac53531ba57e" Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.562218 4878 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"80ae607369de9d65f725103a3b4af0636874269f434ff8acde6fac53531ba57e"} err="failed to get container status \"80ae607369de9d65f725103a3b4af0636874269f434ff8acde6fac53531ba57e\": rpc error: code = NotFound desc = could not find container \"80ae607369de9d65f725103a3b4af0636874269f434ff8acde6fac53531ba57e\": container with ID starting with 80ae607369de9d65f725103a3b4af0636874269f434ff8acde6fac53531ba57e not found: ID does not exist" Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.742341 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.742426 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.742489 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.743775 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc"} pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 19:18:53 crc kubenswrapper[4878]: I1202 19:18:53.743887 4878 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" gracePeriod=600 Dec 02 19:18:53 crc kubenswrapper[4878]: E1202 19:18:53.877776 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:18:54 crc kubenswrapper[4878]: I1202 19:18:54.416894 4878 generic.go:334] "Generic (PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" exitCode=0 Dec 02 19:18:54 crc kubenswrapper[4878]: I1202 19:18:54.416972 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc"} Dec 02 19:18:54 crc kubenswrapper[4878]: I1202 19:18:54.417381 4878 scope.go:117] "RemoveContainer" containerID="5c5532413b17b93c2ef0874874fdac15f70b0bb5edef53e3a26e65d974a4719c" Dec 02 19:18:54 crc kubenswrapper[4878]: I1202 19:18:54.418490 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:18:54 crc kubenswrapper[4878]: E1202 19:18:54.418990 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:18:54 crc kubenswrapper[4878]: I1202 19:18:54.957526 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9efe9152-e844-43d7-abe8-2ba9edb42f8b" path="/var/lib/kubelet/pods/9efe9152-e844-43d7-abe8-2ba9edb42f8b/volumes" Dec 02 19:19:05 crc kubenswrapper[4878]: I1202 19:19:05.938179 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:19:05 crc kubenswrapper[4878]: E1202 19:19:05.939120 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:19:17 crc kubenswrapper[4878]: I1202 19:19:17.469650 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cmjbj"] Dec 02 19:19:17 crc kubenswrapper[4878]: E1202 19:19:17.470764 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9efe9152-e844-43d7-abe8-2ba9edb42f8b" containerName="extract-content" Dec 02 19:19:17 crc kubenswrapper[4878]: I1202 19:19:17.470780 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="9efe9152-e844-43d7-abe8-2ba9edb42f8b" containerName="extract-content" Dec 02 19:19:17 crc kubenswrapper[4878]: E1202 19:19:17.470806 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9efe9152-e844-43d7-abe8-2ba9edb42f8b" containerName="extract-utilities" Dec 02 19:19:17 crc kubenswrapper[4878]: I1202 19:19:17.470812 4878 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="9efe9152-e844-43d7-abe8-2ba9edb42f8b" containerName="extract-utilities" Dec 02 19:19:17 crc kubenswrapper[4878]: E1202 19:19:17.470840 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9efe9152-e844-43d7-abe8-2ba9edb42f8b" containerName="registry-server" Dec 02 19:19:17 crc kubenswrapper[4878]: I1202 19:19:17.470846 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="9efe9152-e844-43d7-abe8-2ba9edb42f8b" containerName="registry-server" Dec 02 19:19:17 crc kubenswrapper[4878]: I1202 19:19:17.471304 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="9efe9152-e844-43d7-abe8-2ba9edb42f8b" containerName="registry-server" Dec 02 19:19:17 crc kubenswrapper[4878]: I1202 19:19:17.473389 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cmjbj" Dec 02 19:19:17 crc kubenswrapper[4878]: I1202 19:19:17.495527 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cmjbj"] Dec 02 19:19:17 crc kubenswrapper[4878]: I1202 19:19:17.584086 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8685abf9-4df0-4a6a-aa28-c39cc8b0cda1-catalog-content\") pod \"redhat-operators-cmjbj\" (UID: \"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1\") " pod="openshift-marketplace/redhat-operators-cmjbj" Dec 02 19:19:17 crc kubenswrapper[4878]: I1202 19:19:17.584158 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8685abf9-4df0-4a6a-aa28-c39cc8b0cda1-utilities\") pod \"redhat-operators-cmjbj\" (UID: \"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1\") " pod="openshift-marketplace/redhat-operators-cmjbj" Dec 02 19:19:17 crc kubenswrapper[4878]: I1202 19:19:17.584302 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqd4b\" (UniqueName: \"kubernetes.io/projected/8685abf9-4df0-4a6a-aa28-c39cc8b0cda1-kube-api-access-lqd4b\") pod \"redhat-operators-cmjbj\" (UID: \"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1\") " pod="openshift-marketplace/redhat-operators-cmjbj" Dec 02 19:19:17 crc kubenswrapper[4878]: I1202 19:19:17.686673 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8685abf9-4df0-4a6a-aa28-c39cc8b0cda1-catalog-content\") pod \"redhat-operators-cmjbj\" (UID: \"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1\") " pod="openshift-marketplace/redhat-operators-cmjbj" Dec 02 19:19:17 crc kubenswrapper[4878]: I1202 19:19:17.686768 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8685abf9-4df0-4a6a-aa28-c39cc8b0cda1-utilities\") pod \"redhat-operators-cmjbj\" (UID: \"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1\") " pod="openshift-marketplace/redhat-operators-cmjbj" Dec 02 19:19:17 crc kubenswrapper[4878]: I1202 19:19:17.686863 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqd4b\" (UniqueName: \"kubernetes.io/projected/8685abf9-4df0-4a6a-aa28-c39cc8b0cda1-kube-api-access-lqd4b\") pod \"redhat-operators-cmjbj\" (UID: \"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1\") " pod="openshift-marketplace/redhat-operators-cmjbj" Dec 02 19:19:17 crc kubenswrapper[4878]: I1202 19:19:17.687139 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8685abf9-4df0-4a6a-aa28-c39cc8b0cda1-catalog-content\") pod \"redhat-operators-cmjbj\" (UID: \"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1\") " pod="openshift-marketplace/redhat-operators-cmjbj" Dec 02 19:19:17 crc kubenswrapper[4878]: I1202 19:19:17.687447 4878 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8685abf9-4df0-4a6a-aa28-c39cc8b0cda1-utilities\") pod \"redhat-operators-cmjbj\" (UID: \"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1\") " pod="openshift-marketplace/redhat-operators-cmjbj" Dec 02 19:19:17 crc kubenswrapper[4878]: I1202 19:19:17.710870 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqd4b\" (UniqueName: \"kubernetes.io/projected/8685abf9-4df0-4a6a-aa28-c39cc8b0cda1-kube-api-access-lqd4b\") pod \"redhat-operators-cmjbj\" (UID: \"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1\") " pod="openshift-marketplace/redhat-operators-cmjbj" Dec 02 19:19:17 crc kubenswrapper[4878]: I1202 19:19:17.796813 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cmjbj" Dec 02 19:19:18 crc kubenswrapper[4878]: I1202 19:19:18.317181 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cmjbj"] Dec 02 19:19:18 crc kubenswrapper[4878]: W1202 19:19:18.320917 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8685abf9_4df0_4a6a_aa28_c39cc8b0cda1.slice/crio-c1890cc971820d7e01dd9b4da425739e036f00d2fe3b9f38c30082aebeaa6c08 WatchSource:0}: Error finding container c1890cc971820d7e01dd9b4da425739e036f00d2fe3b9f38c30082aebeaa6c08: Status 404 returned error can't find the container with id c1890cc971820d7e01dd9b4da425739e036f00d2fe3b9f38c30082aebeaa6c08 Dec 02 19:19:18 crc kubenswrapper[4878]: I1202 19:19:18.710393 4878 generic.go:334] "Generic (PLEG): container finished" podID="8685abf9-4df0-4a6a-aa28-c39cc8b0cda1" containerID="d234544fc8162049f9eaeb8503f58127e91b5009664f93806e2a8d6745c1b1de" exitCode=0 Dec 02 19:19:18 crc kubenswrapper[4878]: I1202 19:19:18.710625 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cmjbj" 
event={"ID":"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1","Type":"ContainerDied","Data":"d234544fc8162049f9eaeb8503f58127e91b5009664f93806e2a8d6745c1b1de"} Dec 02 19:19:18 crc kubenswrapper[4878]: I1202 19:19:18.710650 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cmjbj" event={"ID":"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1","Type":"ContainerStarted","Data":"c1890cc971820d7e01dd9b4da425739e036f00d2fe3b9f38c30082aebeaa6c08"} Dec 02 19:19:19 crc kubenswrapper[4878]: I1202 19:19:19.938393 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:19:19 crc kubenswrapper[4878]: E1202 19:19:19.939058 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:19:20 crc kubenswrapper[4878]: I1202 19:19:20.738158 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cmjbj" event={"ID":"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1","Type":"ContainerStarted","Data":"d0df9911a4c4184c75e5ba7a48f1ab928d5377996a7f851a38cc0bdc2e1b4674"} Dec 02 19:19:23 crc kubenswrapper[4878]: I1202 19:19:23.788644 4878 generic.go:334] "Generic (PLEG): container finished" podID="8685abf9-4df0-4a6a-aa28-c39cc8b0cda1" containerID="d0df9911a4c4184c75e5ba7a48f1ab928d5377996a7f851a38cc0bdc2e1b4674" exitCode=0 Dec 02 19:19:23 crc kubenswrapper[4878]: I1202 19:19:23.788729 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cmjbj" 
event={"ID":"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1","Type":"ContainerDied","Data":"d0df9911a4c4184c75e5ba7a48f1ab928d5377996a7f851a38cc0bdc2e1b4674"} Dec 02 19:19:24 crc kubenswrapper[4878]: I1202 19:19:24.806550 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cmjbj" event={"ID":"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1","Type":"ContainerStarted","Data":"931367b1eb67db681a8884a5b928d6776b85518d04d9cd5d763cb01c91f0f55c"} Dec 02 19:19:24 crc kubenswrapper[4878]: I1202 19:19:24.847684 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cmjbj" podStartSLOduration=2.314479887 podStartE2EDuration="7.847657101s" podCreationTimestamp="2025-12-02 19:19:17 +0000 UTC" firstStartedPulling="2025-12-02 19:19:18.712794719 +0000 UTC m=+3868.402413600" lastFinishedPulling="2025-12-02 19:19:24.245971903 +0000 UTC m=+3873.935590814" observedRunningTime="2025-12-02 19:19:24.839983252 +0000 UTC m=+3874.529602153" watchObservedRunningTime="2025-12-02 19:19:24.847657101 +0000 UTC m=+3874.537275982" Dec 02 19:19:27 crc kubenswrapper[4878]: I1202 19:19:27.797423 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cmjbj" Dec 02 19:19:27 crc kubenswrapper[4878]: I1202 19:19:27.797676 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cmjbj" Dec 02 19:19:28 crc kubenswrapper[4878]: I1202 19:19:28.851137 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cmjbj" podUID="8685abf9-4df0-4a6a-aa28-c39cc8b0cda1" containerName="registry-server" probeResult="failure" output=< Dec 02 19:19:28 crc kubenswrapper[4878]: timeout: failed to connect service ":50051" within 1s Dec 02 19:19:28 crc kubenswrapper[4878]: > Dec 02 19:19:32 crc kubenswrapper[4878]: I1202 19:19:32.938863 4878 scope.go:117] "RemoveContainer" 
containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:19:32 crc kubenswrapper[4878]: E1202 19:19:32.941582 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:19:37 crc kubenswrapper[4878]: I1202 19:19:37.853788 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cmjbj" Dec 02 19:19:37 crc kubenswrapper[4878]: I1202 19:19:37.926973 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cmjbj" Dec 02 19:19:38 crc kubenswrapper[4878]: I1202 19:19:38.103432 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cmjbj"] Dec 02 19:19:38 crc kubenswrapper[4878]: I1202 19:19:38.973593 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cmjbj" podUID="8685abf9-4df0-4a6a-aa28-c39cc8b0cda1" containerName="registry-server" containerID="cri-o://931367b1eb67db681a8884a5b928d6776b85518d04d9cd5d763cb01c91f0f55c" gracePeriod=2 Dec 02 19:19:39 crc kubenswrapper[4878]: I1202 19:19:39.557948 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cmjbj" Dec 02 19:19:39 crc kubenswrapper[4878]: I1202 19:19:39.620715 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8685abf9-4df0-4a6a-aa28-c39cc8b0cda1-utilities\") pod \"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1\" (UID: \"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1\") " Dec 02 19:19:39 crc kubenswrapper[4878]: I1202 19:19:39.620992 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqd4b\" (UniqueName: \"kubernetes.io/projected/8685abf9-4df0-4a6a-aa28-c39cc8b0cda1-kube-api-access-lqd4b\") pod \"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1\" (UID: \"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1\") " Dec 02 19:19:39 crc kubenswrapper[4878]: I1202 19:19:39.621031 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8685abf9-4df0-4a6a-aa28-c39cc8b0cda1-catalog-content\") pod \"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1\" (UID: \"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1\") " Dec 02 19:19:39 crc kubenswrapper[4878]: I1202 19:19:39.622487 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8685abf9-4df0-4a6a-aa28-c39cc8b0cda1-utilities" (OuterVolumeSpecName: "utilities") pod "8685abf9-4df0-4a6a-aa28-c39cc8b0cda1" (UID: "8685abf9-4df0-4a6a-aa28-c39cc8b0cda1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:19:39 crc kubenswrapper[4878]: I1202 19:19:39.629532 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8685abf9-4df0-4a6a-aa28-c39cc8b0cda1-kube-api-access-lqd4b" (OuterVolumeSpecName: "kube-api-access-lqd4b") pod "8685abf9-4df0-4a6a-aa28-c39cc8b0cda1" (UID: "8685abf9-4df0-4a6a-aa28-c39cc8b0cda1"). InnerVolumeSpecName "kube-api-access-lqd4b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:19:39 crc kubenswrapper[4878]: I1202 19:19:39.718705 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8685abf9-4df0-4a6a-aa28-c39cc8b0cda1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8685abf9-4df0-4a6a-aa28-c39cc8b0cda1" (UID: "8685abf9-4df0-4a6a-aa28-c39cc8b0cda1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:19:39 crc kubenswrapper[4878]: I1202 19:19:39.723074 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8685abf9-4df0-4a6a-aa28-c39cc8b0cda1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:19:39 crc kubenswrapper[4878]: I1202 19:19:39.723106 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8685abf9-4df0-4a6a-aa28-c39cc8b0cda1-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:19:39 crc kubenswrapper[4878]: I1202 19:19:39.723116 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqd4b\" (UniqueName: \"kubernetes.io/projected/8685abf9-4df0-4a6a-aa28-c39cc8b0cda1-kube-api-access-lqd4b\") on node \"crc\" DevicePath \"\"" Dec 02 19:19:39 crc kubenswrapper[4878]: I1202 19:19:39.989894 4878 generic.go:334] "Generic (PLEG): container finished" podID="8685abf9-4df0-4a6a-aa28-c39cc8b0cda1" containerID="931367b1eb67db681a8884a5b928d6776b85518d04d9cd5d763cb01c91f0f55c" exitCode=0 Dec 02 19:19:39 crc kubenswrapper[4878]: I1202 19:19:39.989960 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cmjbj" event={"ID":"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1","Type":"ContainerDied","Data":"931367b1eb67db681a8884a5b928d6776b85518d04d9cd5d763cb01c91f0f55c"} Dec 02 19:19:39 crc kubenswrapper[4878]: I1202 19:19:39.989989 4878 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cmjbj" Dec 02 19:19:39 crc kubenswrapper[4878]: I1202 19:19:39.990011 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cmjbj" event={"ID":"8685abf9-4df0-4a6a-aa28-c39cc8b0cda1","Type":"ContainerDied","Data":"c1890cc971820d7e01dd9b4da425739e036f00d2fe3b9f38c30082aebeaa6c08"} Dec 02 19:19:39 crc kubenswrapper[4878]: I1202 19:19:39.990072 4878 scope.go:117] "RemoveContainer" containerID="931367b1eb67db681a8884a5b928d6776b85518d04d9cd5d763cb01c91f0f55c" Dec 02 19:19:40 crc kubenswrapper[4878]: I1202 19:19:40.036932 4878 scope.go:117] "RemoveContainer" containerID="d0df9911a4c4184c75e5ba7a48f1ab928d5377996a7f851a38cc0bdc2e1b4674" Dec 02 19:19:40 crc kubenswrapper[4878]: I1202 19:19:40.063558 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cmjbj"] Dec 02 19:19:40 crc kubenswrapper[4878]: I1202 19:19:40.079307 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cmjbj"] Dec 02 19:19:40 crc kubenswrapper[4878]: I1202 19:19:40.100904 4878 scope.go:117] "RemoveContainer" containerID="d234544fc8162049f9eaeb8503f58127e91b5009664f93806e2a8d6745c1b1de" Dec 02 19:19:40 crc kubenswrapper[4878]: I1202 19:19:40.156881 4878 scope.go:117] "RemoveContainer" containerID="931367b1eb67db681a8884a5b928d6776b85518d04d9cd5d763cb01c91f0f55c" Dec 02 19:19:40 crc kubenswrapper[4878]: E1202 19:19:40.157951 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"931367b1eb67db681a8884a5b928d6776b85518d04d9cd5d763cb01c91f0f55c\": container with ID starting with 931367b1eb67db681a8884a5b928d6776b85518d04d9cd5d763cb01c91f0f55c not found: ID does not exist" containerID="931367b1eb67db681a8884a5b928d6776b85518d04d9cd5d763cb01c91f0f55c" Dec 02 19:19:40 crc kubenswrapper[4878]: I1202 19:19:40.157990 4878 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"931367b1eb67db681a8884a5b928d6776b85518d04d9cd5d763cb01c91f0f55c"} err="failed to get container status \"931367b1eb67db681a8884a5b928d6776b85518d04d9cd5d763cb01c91f0f55c\": rpc error: code = NotFound desc = could not find container \"931367b1eb67db681a8884a5b928d6776b85518d04d9cd5d763cb01c91f0f55c\": container with ID starting with 931367b1eb67db681a8884a5b928d6776b85518d04d9cd5d763cb01c91f0f55c not found: ID does not exist" Dec 02 19:19:40 crc kubenswrapper[4878]: I1202 19:19:40.158019 4878 scope.go:117] "RemoveContainer" containerID="d0df9911a4c4184c75e5ba7a48f1ab928d5377996a7f851a38cc0bdc2e1b4674" Dec 02 19:19:40 crc kubenswrapper[4878]: E1202 19:19:40.158483 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0df9911a4c4184c75e5ba7a48f1ab928d5377996a7f851a38cc0bdc2e1b4674\": container with ID starting with d0df9911a4c4184c75e5ba7a48f1ab928d5377996a7f851a38cc0bdc2e1b4674 not found: ID does not exist" containerID="d0df9911a4c4184c75e5ba7a48f1ab928d5377996a7f851a38cc0bdc2e1b4674" Dec 02 19:19:40 crc kubenswrapper[4878]: I1202 19:19:40.158510 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0df9911a4c4184c75e5ba7a48f1ab928d5377996a7f851a38cc0bdc2e1b4674"} err="failed to get container status \"d0df9911a4c4184c75e5ba7a48f1ab928d5377996a7f851a38cc0bdc2e1b4674\": rpc error: code = NotFound desc = could not find container \"d0df9911a4c4184c75e5ba7a48f1ab928d5377996a7f851a38cc0bdc2e1b4674\": container with ID starting with d0df9911a4c4184c75e5ba7a48f1ab928d5377996a7f851a38cc0bdc2e1b4674 not found: ID does not exist" Dec 02 19:19:40 crc kubenswrapper[4878]: I1202 19:19:40.158527 4878 scope.go:117] "RemoveContainer" containerID="d234544fc8162049f9eaeb8503f58127e91b5009664f93806e2a8d6745c1b1de" Dec 02 19:19:40 crc kubenswrapper[4878]: E1202 
19:19:40.158772 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d234544fc8162049f9eaeb8503f58127e91b5009664f93806e2a8d6745c1b1de\": container with ID starting with d234544fc8162049f9eaeb8503f58127e91b5009664f93806e2a8d6745c1b1de not found: ID does not exist" containerID="d234544fc8162049f9eaeb8503f58127e91b5009664f93806e2a8d6745c1b1de" Dec 02 19:19:40 crc kubenswrapper[4878]: I1202 19:19:40.158794 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d234544fc8162049f9eaeb8503f58127e91b5009664f93806e2a8d6745c1b1de"} err="failed to get container status \"d234544fc8162049f9eaeb8503f58127e91b5009664f93806e2a8d6745c1b1de\": rpc error: code = NotFound desc = could not find container \"d234544fc8162049f9eaeb8503f58127e91b5009664f93806e2a8d6745c1b1de\": container with ID starting with d234544fc8162049f9eaeb8503f58127e91b5009664f93806e2a8d6745c1b1de not found: ID does not exist" Dec 02 19:19:40 crc kubenswrapper[4878]: I1202 19:19:40.956445 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8685abf9-4df0-4a6a-aa28-c39cc8b0cda1" path="/var/lib/kubelet/pods/8685abf9-4df0-4a6a-aa28-c39cc8b0cda1/volumes" Dec 02 19:19:44 crc kubenswrapper[4878]: I1202 19:19:44.938052 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:19:44 crc kubenswrapper[4878]: E1202 19:19:44.938926 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:19:56 crc kubenswrapper[4878]: I1202 19:19:56.937670 
4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:19:56 crc kubenswrapper[4878]: E1202 19:19:56.938472 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:20:10 crc kubenswrapper[4878]: I1202 19:20:10.946974 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:20:10 crc kubenswrapper[4878]: E1202 19:20:10.947739 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:20:22 crc kubenswrapper[4878]: I1202 19:20:22.938813 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:20:22 crc kubenswrapper[4878]: E1202 19:20:22.939746 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:20:35 crc kubenswrapper[4878]: I1202 
19:20:35.937832 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:20:35 crc kubenswrapper[4878]: E1202 19:20:35.939508 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:20:50 crc kubenswrapper[4878]: I1202 19:20:50.960528 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:20:50 crc kubenswrapper[4878]: E1202 19:20:50.962071 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:21:01 crc kubenswrapper[4878]: I1202 19:21:01.939826 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:21:01 crc kubenswrapper[4878]: E1202 19:21:01.940537 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:21:07 crc 
kubenswrapper[4878]: I1202 19:21:07.451508 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j9f6s"] Dec 02 19:21:07 crc kubenswrapper[4878]: E1202 19:21:07.452744 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8685abf9-4df0-4a6a-aa28-c39cc8b0cda1" containerName="extract-utilities" Dec 02 19:21:07 crc kubenswrapper[4878]: I1202 19:21:07.452768 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8685abf9-4df0-4a6a-aa28-c39cc8b0cda1" containerName="extract-utilities" Dec 02 19:21:07 crc kubenswrapper[4878]: E1202 19:21:07.452808 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8685abf9-4df0-4a6a-aa28-c39cc8b0cda1" containerName="extract-content" Dec 02 19:21:07 crc kubenswrapper[4878]: I1202 19:21:07.452816 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8685abf9-4df0-4a6a-aa28-c39cc8b0cda1" containerName="extract-content" Dec 02 19:21:07 crc kubenswrapper[4878]: E1202 19:21:07.452869 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8685abf9-4df0-4a6a-aa28-c39cc8b0cda1" containerName="registry-server" Dec 02 19:21:07 crc kubenswrapper[4878]: I1202 19:21:07.452878 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8685abf9-4df0-4a6a-aa28-c39cc8b0cda1" containerName="registry-server" Dec 02 19:21:07 crc kubenswrapper[4878]: I1202 19:21:07.453225 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="8685abf9-4df0-4a6a-aa28-c39cc8b0cda1" containerName="registry-server" Dec 02 19:21:07 crc kubenswrapper[4878]: I1202 19:21:07.455701 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9f6s" Dec 02 19:21:07 crc kubenswrapper[4878]: I1202 19:21:07.471166 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9f6s"] Dec 02 19:21:07 crc kubenswrapper[4878]: I1202 19:21:07.479867 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llsnr\" (UniqueName: \"kubernetes.io/projected/8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0-kube-api-access-llsnr\") pod \"redhat-marketplace-j9f6s\" (UID: \"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0\") " pod="openshift-marketplace/redhat-marketplace-j9f6s" Dec 02 19:21:07 crc kubenswrapper[4878]: I1202 19:21:07.479971 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0-utilities\") pod \"redhat-marketplace-j9f6s\" (UID: \"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0\") " pod="openshift-marketplace/redhat-marketplace-j9f6s" Dec 02 19:21:07 crc kubenswrapper[4878]: I1202 19:21:07.480109 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0-catalog-content\") pod \"redhat-marketplace-j9f6s\" (UID: \"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0\") " pod="openshift-marketplace/redhat-marketplace-j9f6s" Dec 02 19:21:07 crc kubenswrapper[4878]: I1202 19:21:07.581709 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llsnr\" (UniqueName: \"kubernetes.io/projected/8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0-kube-api-access-llsnr\") pod \"redhat-marketplace-j9f6s\" (UID: \"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0\") " pod="openshift-marketplace/redhat-marketplace-j9f6s" Dec 02 19:21:07 crc kubenswrapper[4878]: I1202 19:21:07.581829 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0-utilities\") pod \"redhat-marketplace-j9f6s\" (UID: \"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0\") " pod="openshift-marketplace/redhat-marketplace-j9f6s" Dec 02 19:21:07 crc kubenswrapper[4878]: I1202 19:21:07.581948 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0-catalog-content\") pod \"redhat-marketplace-j9f6s\" (UID: \"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0\") " pod="openshift-marketplace/redhat-marketplace-j9f6s" Dec 02 19:21:07 crc kubenswrapper[4878]: I1202 19:21:07.582508 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0-utilities\") pod \"redhat-marketplace-j9f6s\" (UID: \"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0\") " pod="openshift-marketplace/redhat-marketplace-j9f6s" Dec 02 19:21:07 crc kubenswrapper[4878]: I1202 19:21:07.582586 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0-catalog-content\") pod \"redhat-marketplace-j9f6s\" (UID: \"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0\") " pod="openshift-marketplace/redhat-marketplace-j9f6s" Dec 02 19:21:07 crc kubenswrapper[4878]: I1202 19:21:07.606171 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llsnr\" (UniqueName: \"kubernetes.io/projected/8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0-kube-api-access-llsnr\") pod \"redhat-marketplace-j9f6s\" (UID: \"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0\") " pod="openshift-marketplace/redhat-marketplace-j9f6s" Dec 02 19:21:07 crc kubenswrapper[4878]: I1202 19:21:07.786964 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9f6s" Dec 02 19:21:08 crc kubenswrapper[4878]: I1202 19:21:08.300617 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9f6s"] Dec 02 19:21:09 crc kubenswrapper[4878]: I1202 19:21:09.135627 4878 generic.go:334] "Generic (PLEG): container finished" podID="8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0" containerID="c66e51caa047c1c3f6edfba5ffa1fd51e2a55c323d7ef2754428585f8836aa31" exitCode=0 Dec 02 19:21:09 crc kubenswrapper[4878]: I1202 19:21:09.135693 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9f6s" event={"ID":"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0","Type":"ContainerDied","Data":"c66e51caa047c1c3f6edfba5ffa1fd51e2a55c323d7ef2754428585f8836aa31"} Dec 02 19:21:09 crc kubenswrapper[4878]: I1202 19:21:09.137069 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9f6s" event={"ID":"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0","Type":"ContainerStarted","Data":"67b37560b4d3c993b05f492ced3dfda92aa2abe250de8f951fabe26053ddb4ab"} Dec 02 19:21:11 crc kubenswrapper[4878]: I1202 19:21:11.165156 4878 generic.go:334] "Generic (PLEG): container finished" podID="8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0" containerID="fc4803b89c1dfdd887b66c90de96b3cfd216e4d00a49bc9e99957c6c5ff939d7" exitCode=0 Dec 02 19:21:11 crc kubenswrapper[4878]: I1202 19:21:11.165213 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9f6s" event={"ID":"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0","Type":"ContainerDied","Data":"fc4803b89c1dfdd887b66c90de96b3cfd216e4d00a49bc9e99957c6c5ff939d7"} Dec 02 19:21:12 crc kubenswrapper[4878]: I1202 19:21:12.179007 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9f6s" 
event={"ID":"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0","Type":"ContainerStarted","Data":"b5382e28ee0a8d829935b1339d76ad5939f679917b8f7d5977c4e863b6924128"} Dec 02 19:21:12 crc kubenswrapper[4878]: I1202 19:21:12.209858 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j9f6s" podStartSLOduration=2.68056058 podStartE2EDuration="5.209835497s" podCreationTimestamp="2025-12-02 19:21:07 +0000 UTC" firstStartedPulling="2025-12-02 19:21:09.138078845 +0000 UTC m=+3978.827697726" lastFinishedPulling="2025-12-02 19:21:11.667353762 +0000 UTC m=+3981.356972643" observedRunningTime="2025-12-02 19:21:12.195319685 +0000 UTC m=+3981.884938586" watchObservedRunningTime="2025-12-02 19:21:12.209835497 +0000 UTC m=+3981.899454388" Dec 02 19:21:14 crc kubenswrapper[4878]: I1202 19:21:14.937663 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:21:14 crc kubenswrapper[4878]: E1202 19:21:14.938477 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:21:17 crc kubenswrapper[4878]: I1202 19:21:17.787566 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j9f6s" Dec 02 19:21:17 crc kubenswrapper[4878]: I1202 19:21:17.788282 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j9f6s" Dec 02 19:21:17 crc kubenswrapper[4878]: I1202 19:21:17.869336 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-j9f6s" Dec 02 19:21:18 crc kubenswrapper[4878]: I1202 19:21:18.338778 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j9f6s" Dec 02 19:21:18 crc kubenswrapper[4878]: I1202 19:21:18.402156 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9f6s"] Dec 02 19:21:20 crc kubenswrapper[4878]: I1202 19:21:20.296104 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j9f6s" podUID="8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0" containerName="registry-server" containerID="cri-o://b5382e28ee0a8d829935b1339d76ad5939f679917b8f7d5977c4e863b6924128" gracePeriod=2 Dec 02 19:21:20 crc kubenswrapper[4878]: I1202 19:21:20.909023 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9f6s" Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.010888 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0-utilities\") pod \"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0\" (UID: \"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0\") " Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.011266 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0-catalog-content\") pod \"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0\" (UID: \"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0\") " Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.011323 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llsnr\" (UniqueName: \"kubernetes.io/projected/8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0-kube-api-access-llsnr\") pod 
\"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0\" (UID: \"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0\") " Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.012033 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0-utilities" (OuterVolumeSpecName: "utilities") pod "8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0" (UID: "8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.015193 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.020547 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0-kube-api-access-llsnr" (OuterVolumeSpecName: "kube-api-access-llsnr") pod "8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0" (UID: "8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0"). InnerVolumeSpecName "kube-api-access-llsnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.033575 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0" (UID: "8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.117355 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.117391 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llsnr\" (UniqueName: \"kubernetes.io/projected/8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0-kube-api-access-llsnr\") on node \"crc\" DevicePath \"\"" Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.308289 4878 generic.go:334] "Generic (PLEG): container finished" podID="8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0" containerID="b5382e28ee0a8d829935b1339d76ad5939f679917b8f7d5977c4e863b6924128" exitCode=0 Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.308329 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9f6s" event={"ID":"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0","Type":"ContainerDied","Data":"b5382e28ee0a8d829935b1339d76ad5939f679917b8f7d5977c4e863b6924128"} Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.308353 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9f6s" Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.308365 4878 scope.go:117] "RemoveContainer" containerID="b5382e28ee0a8d829935b1339d76ad5939f679917b8f7d5977c4e863b6924128" Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.308355 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9f6s" event={"ID":"8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0","Type":"ContainerDied","Data":"67b37560b4d3c993b05f492ced3dfda92aa2abe250de8f951fabe26053ddb4ab"} Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.329643 4878 scope.go:117] "RemoveContainer" containerID="fc4803b89c1dfdd887b66c90de96b3cfd216e4d00a49bc9e99957c6c5ff939d7" Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.344491 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9f6s"] Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.354642 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9f6s"] Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.379219 4878 scope.go:117] "RemoveContainer" containerID="c66e51caa047c1c3f6edfba5ffa1fd51e2a55c323d7ef2754428585f8836aa31" Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.435223 4878 scope.go:117] "RemoveContainer" containerID="b5382e28ee0a8d829935b1339d76ad5939f679917b8f7d5977c4e863b6924128" Dec 02 19:21:21 crc kubenswrapper[4878]: E1202 19:21:21.435645 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5382e28ee0a8d829935b1339d76ad5939f679917b8f7d5977c4e863b6924128\": container with ID starting with b5382e28ee0a8d829935b1339d76ad5939f679917b8f7d5977c4e863b6924128 not found: ID does not exist" containerID="b5382e28ee0a8d829935b1339d76ad5939f679917b8f7d5977c4e863b6924128" Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.435675 4878 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5382e28ee0a8d829935b1339d76ad5939f679917b8f7d5977c4e863b6924128"} err="failed to get container status \"b5382e28ee0a8d829935b1339d76ad5939f679917b8f7d5977c4e863b6924128\": rpc error: code = NotFound desc = could not find container \"b5382e28ee0a8d829935b1339d76ad5939f679917b8f7d5977c4e863b6924128\": container with ID starting with b5382e28ee0a8d829935b1339d76ad5939f679917b8f7d5977c4e863b6924128 not found: ID does not exist" Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.435699 4878 scope.go:117] "RemoveContainer" containerID="fc4803b89c1dfdd887b66c90de96b3cfd216e4d00a49bc9e99957c6c5ff939d7" Dec 02 19:21:21 crc kubenswrapper[4878]: E1202 19:21:21.436076 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc4803b89c1dfdd887b66c90de96b3cfd216e4d00a49bc9e99957c6c5ff939d7\": container with ID starting with fc4803b89c1dfdd887b66c90de96b3cfd216e4d00a49bc9e99957c6c5ff939d7 not found: ID does not exist" containerID="fc4803b89c1dfdd887b66c90de96b3cfd216e4d00a49bc9e99957c6c5ff939d7" Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.436129 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc4803b89c1dfdd887b66c90de96b3cfd216e4d00a49bc9e99957c6c5ff939d7"} err="failed to get container status \"fc4803b89c1dfdd887b66c90de96b3cfd216e4d00a49bc9e99957c6c5ff939d7\": rpc error: code = NotFound desc = could not find container \"fc4803b89c1dfdd887b66c90de96b3cfd216e4d00a49bc9e99957c6c5ff939d7\": container with ID starting with fc4803b89c1dfdd887b66c90de96b3cfd216e4d00a49bc9e99957c6c5ff939d7 not found: ID does not exist" Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.436165 4878 scope.go:117] "RemoveContainer" containerID="c66e51caa047c1c3f6edfba5ffa1fd51e2a55c323d7ef2754428585f8836aa31" Dec 02 19:21:21 crc kubenswrapper[4878]: E1202 
19:21:21.436462 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c66e51caa047c1c3f6edfba5ffa1fd51e2a55c323d7ef2754428585f8836aa31\": container with ID starting with c66e51caa047c1c3f6edfba5ffa1fd51e2a55c323d7ef2754428585f8836aa31 not found: ID does not exist" containerID="c66e51caa047c1c3f6edfba5ffa1fd51e2a55c323d7ef2754428585f8836aa31" Dec 02 19:21:21 crc kubenswrapper[4878]: I1202 19:21:21.436493 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c66e51caa047c1c3f6edfba5ffa1fd51e2a55c323d7ef2754428585f8836aa31"} err="failed to get container status \"c66e51caa047c1c3f6edfba5ffa1fd51e2a55c323d7ef2754428585f8836aa31\": rpc error: code = NotFound desc = could not find container \"c66e51caa047c1c3f6edfba5ffa1fd51e2a55c323d7ef2754428585f8836aa31\": container with ID starting with c66e51caa047c1c3f6edfba5ffa1fd51e2a55c323d7ef2754428585f8836aa31 not found: ID does not exist" Dec 02 19:21:22 crc kubenswrapper[4878]: I1202 19:21:22.954910 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0" path="/var/lib/kubelet/pods/8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0/volumes" Dec 02 19:21:28 crc kubenswrapper[4878]: I1202 19:21:28.938293 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:21:28 crc kubenswrapper[4878]: E1202 19:21:28.939102 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:21:41 crc kubenswrapper[4878]: I1202 19:21:41.938478 
4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:21:41 crc kubenswrapper[4878]: E1202 19:21:41.939429 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:21:53 crc kubenswrapper[4878]: I1202 19:21:53.938222 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:21:53 crc kubenswrapper[4878]: E1202 19:21:53.939189 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:22:06 crc kubenswrapper[4878]: I1202 19:22:06.938167 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:22:06 crc kubenswrapper[4878]: E1202 19:22:06.939878 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:22:19 crc kubenswrapper[4878]: I1202 
19:22:19.938341 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:22:19 crc kubenswrapper[4878]: E1202 19:22:19.939293 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:22:33 crc kubenswrapper[4878]: I1202 19:22:33.938956 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:22:33 crc kubenswrapper[4878]: E1202 19:22:33.940033 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:22:46 crc kubenswrapper[4878]: I1202 19:22:46.938391 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:22:46 crc kubenswrapper[4878]: E1202 19:22:46.939044 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:23:01 crc 
kubenswrapper[4878]: I1202 19:23:01.938216 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:23:01 crc kubenswrapper[4878]: E1202 19:23:01.939027 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:23:03 crc kubenswrapper[4878]: I1202 19:23:03.922046 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h4pr7"] Dec 02 19:23:03 crc kubenswrapper[4878]: E1202 19:23:03.923153 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0" containerName="extract-content" Dec 02 19:23:03 crc kubenswrapper[4878]: I1202 19:23:03.923173 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0" containerName="extract-content" Dec 02 19:23:03 crc kubenswrapper[4878]: E1202 19:23:03.923197 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0" containerName="registry-server" Dec 02 19:23:03 crc kubenswrapper[4878]: I1202 19:23:03.923205 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0" containerName="registry-server" Dec 02 19:23:03 crc kubenswrapper[4878]: E1202 19:23:03.923264 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0" containerName="extract-utilities" Dec 02 19:23:03 crc kubenswrapper[4878]: I1202 19:23:03.923273 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0" 
containerName="extract-utilities" Dec 02 19:23:03 crc kubenswrapper[4878]: I1202 19:23:03.923543 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="8493e9f7-e9b9-4fa7-8567-8dc95cb08fd0" containerName="registry-server" Dec 02 19:23:03 crc kubenswrapper[4878]: I1202 19:23:03.925890 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h4pr7" Dec 02 19:23:03 crc kubenswrapper[4878]: I1202 19:23:03.935081 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h4pr7"] Dec 02 19:23:03 crc kubenswrapper[4878]: I1202 19:23:03.942380 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51fde784-46c2-4c2e-ab50-c19a6ff326e5-utilities\") pod \"certified-operators-h4pr7\" (UID: \"51fde784-46c2-4c2e-ab50-c19a6ff326e5\") " pod="openshift-marketplace/certified-operators-h4pr7" Dec 02 19:23:03 crc kubenswrapper[4878]: I1202 19:23:03.943089 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51fde784-46c2-4c2e-ab50-c19a6ff326e5-catalog-content\") pod \"certified-operators-h4pr7\" (UID: \"51fde784-46c2-4c2e-ab50-c19a6ff326e5\") " pod="openshift-marketplace/certified-operators-h4pr7" Dec 02 19:23:03 crc kubenswrapper[4878]: I1202 19:23:03.943863 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tb4f\" (UniqueName: \"kubernetes.io/projected/51fde784-46c2-4c2e-ab50-c19a6ff326e5-kube-api-access-6tb4f\") pod \"certified-operators-h4pr7\" (UID: \"51fde784-46c2-4c2e-ab50-c19a6ff326e5\") " pod="openshift-marketplace/certified-operators-h4pr7" Dec 02 19:23:04 crc kubenswrapper[4878]: I1202 19:23:04.045745 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51fde784-46c2-4c2e-ab50-c19a6ff326e5-catalog-content\") pod \"certified-operators-h4pr7\" (UID: \"51fde784-46c2-4c2e-ab50-c19a6ff326e5\") " pod="openshift-marketplace/certified-operators-h4pr7" Dec 02 19:23:04 crc kubenswrapper[4878]: I1202 19:23:04.045905 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tb4f\" (UniqueName: \"kubernetes.io/projected/51fde784-46c2-4c2e-ab50-c19a6ff326e5-kube-api-access-6tb4f\") pod \"certified-operators-h4pr7\" (UID: \"51fde784-46c2-4c2e-ab50-c19a6ff326e5\") " pod="openshift-marketplace/certified-operators-h4pr7" Dec 02 19:23:04 crc kubenswrapper[4878]: I1202 19:23:04.046056 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51fde784-46c2-4c2e-ab50-c19a6ff326e5-utilities\") pod \"certified-operators-h4pr7\" (UID: \"51fde784-46c2-4c2e-ab50-c19a6ff326e5\") " pod="openshift-marketplace/certified-operators-h4pr7" Dec 02 19:23:04 crc kubenswrapper[4878]: I1202 19:23:04.047432 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51fde784-46c2-4c2e-ab50-c19a6ff326e5-catalog-content\") pod \"certified-operators-h4pr7\" (UID: \"51fde784-46c2-4c2e-ab50-c19a6ff326e5\") " pod="openshift-marketplace/certified-operators-h4pr7" Dec 02 19:23:04 crc kubenswrapper[4878]: I1202 19:23:04.047505 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51fde784-46c2-4c2e-ab50-c19a6ff326e5-utilities\") pod \"certified-operators-h4pr7\" (UID: \"51fde784-46c2-4c2e-ab50-c19a6ff326e5\") " pod="openshift-marketplace/certified-operators-h4pr7" Dec 02 19:23:04 crc kubenswrapper[4878]: I1202 19:23:04.271558 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tb4f\" (UniqueName: 
\"kubernetes.io/projected/51fde784-46c2-4c2e-ab50-c19a6ff326e5-kube-api-access-6tb4f\") pod \"certified-operators-h4pr7\" (UID: \"51fde784-46c2-4c2e-ab50-c19a6ff326e5\") " pod="openshift-marketplace/certified-operators-h4pr7" Dec 02 19:23:04 crc kubenswrapper[4878]: I1202 19:23:04.289785 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h4pr7" Dec 02 19:23:04 crc kubenswrapper[4878]: I1202 19:23:04.920348 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h4pr7"] Dec 02 19:23:04 crc kubenswrapper[4878]: W1202 19:23:04.923518 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51fde784_46c2_4c2e_ab50_c19a6ff326e5.slice/crio-5512191e179fc7052c814bb626168ae0d7a6d52b7960110335c0fc2a9483b923 WatchSource:0}: Error finding container 5512191e179fc7052c814bb626168ae0d7a6d52b7960110335c0fc2a9483b923: Status 404 returned error can't find the container with id 5512191e179fc7052c814bb626168ae0d7a6d52b7960110335c0fc2a9483b923 Dec 02 19:23:05 crc kubenswrapper[4878]: I1202 19:23:05.727197 4878 generic.go:334] "Generic (PLEG): container finished" podID="51fde784-46c2-4c2e-ab50-c19a6ff326e5" containerID="f0c757623efe13d352eb55107ad201414bd3e438c0689aa02c583f231efd50c7" exitCode=0 Dec 02 19:23:05 crc kubenswrapper[4878]: I1202 19:23:05.727431 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h4pr7" event={"ID":"51fde784-46c2-4c2e-ab50-c19a6ff326e5","Type":"ContainerDied","Data":"f0c757623efe13d352eb55107ad201414bd3e438c0689aa02c583f231efd50c7"} Dec 02 19:23:05 crc kubenswrapper[4878]: I1202 19:23:05.728584 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h4pr7" 
event={"ID":"51fde784-46c2-4c2e-ab50-c19a6ff326e5","Type":"ContainerStarted","Data":"5512191e179fc7052c814bb626168ae0d7a6d52b7960110335c0fc2a9483b923"} Dec 02 19:23:07 crc kubenswrapper[4878]: I1202 19:23:07.759429 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h4pr7" event={"ID":"51fde784-46c2-4c2e-ab50-c19a6ff326e5","Type":"ContainerStarted","Data":"7f844f6d86efadfc37a1b067f9cf970bb7da6133e31fa556ac3d1bc040e8d3aa"} Dec 02 19:23:08 crc kubenswrapper[4878]: I1202 19:23:08.775828 4878 generic.go:334] "Generic (PLEG): container finished" podID="51fde784-46c2-4c2e-ab50-c19a6ff326e5" containerID="7f844f6d86efadfc37a1b067f9cf970bb7da6133e31fa556ac3d1bc040e8d3aa" exitCode=0 Dec 02 19:23:08 crc kubenswrapper[4878]: I1202 19:23:08.775925 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h4pr7" event={"ID":"51fde784-46c2-4c2e-ab50-c19a6ff326e5","Type":"ContainerDied","Data":"7f844f6d86efadfc37a1b067f9cf970bb7da6133e31fa556ac3d1bc040e8d3aa"} Dec 02 19:23:09 crc kubenswrapper[4878]: I1202 19:23:09.791260 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h4pr7" event={"ID":"51fde784-46c2-4c2e-ab50-c19a6ff326e5","Type":"ContainerStarted","Data":"326c23ef1c893be8d9ea1db973abeaff2dd3cd1faa003ce39c80ffc54d34db1d"} Dec 02 19:23:09 crc kubenswrapper[4878]: I1202 19:23:09.817033 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h4pr7" podStartSLOduration=3.082828033 podStartE2EDuration="6.817004572s" podCreationTimestamp="2025-12-02 19:23:03 +0000 UTC" firstStartedPulling="2025-12-02 19:23:05.731613435 +0000 UTC m=+4095.421232316" lastFinishedPulling="2025-12-02 19:23:09.465789954 +0000 UTC m=+4099.155408855" observedRunningTime="2025-12-02 19:23:09.812194002 +0000 UTC m=+4099.501812923" watchObservedRunningTime="2025-12-02 19:23:09.817004572 +0000 UTC 
m=+4099.506623493" Dec 02 19:23:14 crc kubenswrapper[4878]: I1202 19:23:14.290084 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h4pr7" Dec 02 19:23:14 crc kubenswrapper[4878]: I1202 19:23:14.290879 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h4pr7" Dec 02 19:23:14 crc kubenswrapper[4878]: I1202 19:23:14.369067 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h4pr7" Dec 02 19:23:14 crc kubenswrapper[4878]: I1202 19:23:14.887656 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h4pr7" Dec 02 19:23:14 crc kubenswrapper[4878]: I1202 19:23:14.955216 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h4pr7"] Dec 02 19:23:15 crc kubenswrapper[4878]: I1202 19:23:15.939189 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:23:15 crc kubenswrapper[4878]: E1202 19:23:15.940333 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:23:16 crc kubenswrapper[4878]: I1202 19:23:16.882160 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h4pr7" podUID="51fde784-46c2-4c2e-ab50-c19a6ff326e5" containerName="registry-server" containerID="cri-o://326c23ef1c893be8d9ea1db973abeaff2dd3cd1faa003ce39c80ffc54d34db1d" gracePeriod=2 Dec 
02 19:23:16 crc kubenswrapper[4878]: E1202 19:23:16.960534 4878 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51fde784_46c2_4c2e_ab50_c19a6ff326e5.slice/crio-326c23ef1c893be8d9ea1db973abeaff2dd3cd1faa003ce39c80ffc54d34db1d.scope\": RecentStats: unable to find data in memory cache]" Dec 02 19:23:17 crc kubenswrapper[4878]: I1202 19:23:17.415913 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h4pr7" Dec 02 19:23:17 crc kubenswrapper[4878]: I1202 19:23:17.482757 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tb4f\" (UniqueName: \"kubernetes.io/projected/51fde784-46c2-4c2e-ab50-c19a6ff326e5-kube-api-access-6tb4f\") pod \"51fde784-46c2-4c2e-ab50-c19a6ff326e5\" (UID: \"51fde784-46c2-4c2e-ab50-c19a6ff326e5\") " Dec 02 19:23:17 crc kubenswrapper[4878]: I1202 19:23:17.482914 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51fde784-46c2-4c2e-ab50-c19a6ff326e5-catalog-content\") pod \"51fde784-46c2-4c2e-ab50-c19a6ff326e5\" (UID: \"51fde784-46c2-4c2e-ab50-c19a6ff326e5\") " Dec 02 19:23:17 crc kubenswrapper[4878]: I1202 19:23:17.482974 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51fde784-46c2-4c2e-ab50-c19a6ff326e5-utilities\") pod \"51fde784-46c2-4c2e-ab50-c19a6ff326e5\" (UID: \"51fde784-46c2-4c2e-ab50-c19a6ff326e5\") " Dec 02 19:23:17 crc kubenswrapper[4878]: I1202 19:23:17.483945 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51fde784-46c2-4c2e-ab50-c19a6ff326e5-utilities" (OuterVolumeSpecName: "utilities") pod "51fde784-46c2-4c2e-ab50-c19a6ff326e5" (UID: 
"51fde784-46c2-4c2e-ab50-c19a6ff326e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:23:17 crc kubenswrapper[4878]: I1202 19:23:17.490488 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51fde784-46c2-4c2e-ab50-c19a6ff326e5-kube-api-access-6tb4f" (OuterVolumeSpecName: "kube-api-access-6tb4f") pod "51fde784-46c2-4c2e-ab50-c19a6ff326e5" (UID: "51fde784-46c2-4c2e-ab50-c19a6ff326e5"). InnerVolumeSpecName "kube-api-access-6tb4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:23:17 crc kubenswrapper[4878]: I1202 19:23:17.552058 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51fde784-46c2-4c2e-ab50-c19a6ff326e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51fde784-46c2-4c2e-ab50-c19a6ff326e5" (UID: "51fde784-46c2-4c2e-ab50-c19a6ff326e5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:23:17 crc kubenswrapper[4878]: I1202 19:23:17.585484 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tb4f\" (UniqueName: \"kubernetes.io/projected/51fde784-46c2-4c2e-ab50-c19a6ff326e5-kube-api-access-6tb4f\") on node \"crc\" DevicePath \"\"" Dec 02 19:23:17 crc kubenswrapper[4878]: I1202 19:23:17.585517 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51fde784-46c2-4c2e-ab50-c19a6ff326e5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:23:17 crc kubenswrapper[4878]: I1202 19:23:17.585527 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51fde784-46c2-4c2e-ab50-c19a6ff326e5-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:23:17 crc kubenswrapper[4878]: I1202 19:23:17.898945 4878 generic.go:334] "Generic (PLEG): container finished" 
podID="51fde784-46c2-4c2e-ab50-c19a6ff326e5" containerID="326c23ef1c893be8d9ea1db973abeaff2dd3cd1faa003ce39c80ffc54d34db1d" exitCode=0 Dec 02 19:23:17 crc kubenswrapper[4878]: I1202 19:23:17.899020 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h4pr7" Dec 02 19:23:17 crc kubenswrapper[4878]: I1202 19:23:17.899054 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h4pr7" event={"ID":"51fde784-46c2-4c2e-ab50-c19a6ff326e5","Type":"ContainerDied","Data":"326c23ef1c893be8d9ea1db973abeaff2dd3cd1faa003ce39c80ffc54d34db1d"} Dec 02 19:23:17 crc kubenswrapper[4878]: I1202 19:23:17.899454 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h4pr7" event={"ID":"51fde784-46c2-4c2e-ab50-c19a6ff326e5","Type":"ContainerDied","Data":"5512191e179fc7052c814bb626168ae0d7a6d52b7960110335c0fc2a9483b923"} Dec 02 19:23:17 crc kubenswrapper[4878]: I1202 19:23:17.899479 4878 scope.go:117] "RemoveContainer" containerID="326c23ef1c893be8d9ea1db973abeaff2dd3cd1faa003ce39c80ffc54d34db1d" Dec 02 19:23:17 crc kubenswrapper[4878]: I1202 19:23:17.941906 4878 scope.go:117] "RemoveContainer" containerID="7f844f6d86efadfc37a1b067f9cf970bb7da6133e31fa556ac3d1bc040e8d3aa" Dec 02 19:23:17 crc kubenswrapper[4878]: I1202 19:23:17.951605 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h4pr7"] Dec 02 19:23:17 crc kubenswrapper[4878]: I1202 19:23:17.966857 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h4pr7"] Dec 02 19:23:18 crc kubenswrapper[4878]: I1202 19:23:18.302354 4878 scope.go:117] "RemoveContainer" containerID="f0c757623efe13d352eb55107ad201414bd3e438c0689aa02c583f231efd50c7" Dec 02 19:23:18 crc kubenswrapper[4878]: I1202 19:23:18.370407 4878 scope.go:117] "RemoveContainer" 
containerID="326c23ef1c893be8d9ea1db973abeaff2dd3cd1faa003ce39c80ffc54d34db1d" Dec 02 19:23:18 crc kubenswrapper[4878]: E1202 19:23:18.371131 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"326c23ef1c893be8d9ea1db973abeaff2dd3cd1faa003ce39c80ffc54d34db1d\": container with ID starting with 326c23ef1c893be8d9ea1db973abeaff2dd3cd1faa003ce39c80ffc54d34db1d not found: ID does not exist" containerID="326c23ef1c893be8d9ea1db973abeaff2dd3cd1faa003ce39c80ffc54d34db1d" Dec 02 19:23:18 crc kubenswrapper[4878]: I1202 19:23:18.371181 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"326c23ef1c893be8d9ea1db973abeaff2dd3cd1faa003ce39c80ffc54d34db1d"} err="failed to get container status \"326c23ef1c893be8d9ea1db973abeaff2dd3cd1faa003ce39c80ffc54d34db1d\": rpc error: code = NotFound desc = could not find container \"326c23ef1c893be8d9ea1db973abeaff2dd3cd1faa003ce39c80ffc54d34db1d\": container with ID starting with 326c23ef1c893be8d9ea1db973abeaff2dd3cd1faa003ce39c80ffc54d34db1d not found: ID does not exist" Dec 02 19:23:18 crc kubenswrapper[4878]: I1202 19:23:18.371213 4878 scope.go:117] "RemoveContainer" containerID="7f844f6d86efadfc37a1b067f9cf970bb7da6133e31fa556ac3d1bc040e8d3aa" Dec 02 19:23:18 crc kubenswrapper[4878]: E1202 19:23:18.371744 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f844f6d86efadfc37a1b067f9cf970bb7da6133e31fa556ac3d1bc040e8d3aa\": container with ID starting with 7f844f6d86efadfc37a1b067f9cf970bb7da6133e31fa556ac3d1bc040e8d3aa not found: ID does not exist" containerID="7f844f6d86efadfc37a1b067f9cf970bb7da6133e31fa556ac3d1bc040e8d3aa" Dec 02 19:23:18 crc kubenswrapper[4878]: I1202 19:23:18.371783 4878 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7f844f6d86efadfc37a1b067f9cf970bb7da6133e31fa556ac3d1bc040e8d3aa"} err="failed to get container status \"7f844f6d86efadfc37a1b067f9cf970bb7da6133e31fa556ac3d1bc040e8d3aa\": rpc error: code = NotFound desc = could not find container \"7f844f6d86efadfc37a1b067f9cf970bb7da6133e31fa556ac3d1bc040e8d3aa\": container with ID starting with 7f844f6d86efadfc37a1b067f9cf970bb7da6133e31fa556ac3d1bc040e8d3aa not found: ID does not exist" Dec 02 19:23:18 crc kubenswrapper[4878]: I1202 19:23:18.371814 4878 scope.go:117] "RemoveContainer" containerID="f0c757623efe13d352eb55107ad201414bd3e438c0689aa02c583f231efd50c7" Dec 02 19:23:18 crc kubenswrapper[4878]: E1202 19:23:18.372249 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0c757623efe13d352eb55107ad201414bd3e438c0689aa02c583f231efd50c7\": container with ID starting with f0c757623efe13d352eb55107ad201414bd3e438c0689aa02c583f231efd50c7 not found: ID does not exist" containerID="f0c757623efe13d352eb55107ad201414bd3e438c0689aa02c583f231efd50c7" Dec 02 19:23:18 crc kubenswrapper[4878]: I1202 19:23:18.372279 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0c757623efe13d352eb55107ad201414bd3e438c0689aa02c583f231efd50c7"} err="failed to get container status \"f0c757623efe13d352eb55107ad201414bd3e438c0689aa02c583f231efd50c7\": rpc error: code = NotFound desc = could not find container \"f0c757623efe13d352eb55107ad201414bd3e438c0689aa02c583f231efd50c7\": container with ID starting with f0c757623efe13d352eb55107ad201414bd3e438c0689aa02c583f231efd50c7 not found: ID does not exist" Dec 02 19:23:18 crc kubenswrapper[4878]: I1202 19:23:18.968077 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51fde784-46c2-4c2e-ab50-c19a6ff326e5" path="/var/lib/kubelet/pods/51fde784-46c2-4c2e-ab50-c19a6ff326e5/volumes" Dec 02 19:23:26 crc kubenswrapper[4878]: I1202 
19:23:26.939396 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:23:26 crc kubenswrapper[4878]: E1202 19:23:26.940747 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:23:38 crc kubenswrapper[4878]: I1202 19:23:38.938168 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:23:38 crc kubenswrapper[4878]: E1202 19:23:38.938879 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:23:52 crc kubenswrapper[4878]: I1202 19:23:52.938417 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:23:52 crc kubenswrapper[4878]: E1202 19:23:52.941186 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:24:05 crc 
kubenswrapper[4878]: I1202 19:24:05.939208 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:24:06 crc kubenswrapper[4878]: I1202 19:24:06.552696 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"cf6c6e19bfc472b390886743cf5b1b51ea9d0fa8608ca59cf167649361a49ada"} Dec 02 19:24:25 crc kubenswrapper[4878]: E1202 19:24:25.746532 4878 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.159:57024->38.102.83.159:41745: write tcp 38.102.83.159:57024->38.102.83.159:41745: write: connection reset by peer Dec 02 19:26:23 crc kubenswrapper[4878]: I1202 19:26:23.741827 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:26:23 crc kubenswrapper[4878]: I1202 19:26:23.743196 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:26:53 crc kubenswrapper[4878]: I1202 19:26:53.742611 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:26:53 crc kubenswrapper[4878]: I1202 19:26:53.743220 4878 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:26:57 crc kubenswrapper[4878]: E1202 19:26:57.225077 4878 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.159:40724->38.102.83.159:41745: write tcp 38.102.83.159:40724->38.102.83.159:41745: write: broken pipe Dec 02 19:27:23 crc kubenswrapper[4878]: I1202 19:27:23.742219 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:27:23 crc kubenswrapper[4878]: I1202 19:27:23.742842 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:27:23 crc kubenswrapper[4878]: I1202 19:27:23.742922 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 19:27:23 crc kubenswrapper[4878]: I1202 19:27:23.743836 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf6c6e19bfc472b390886743cf5b1b51ea9d0fa8608ca59cf167649361a49ada"} pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 19:27:23 crc kubenswrapper[4878]: I1202 19:27:23.743901 4878 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://cf6c6e19bfc472b390886743cf5b1b51ea9d0fa8608ca59cf167649361a49ada" gracePeriod=600 Dec 02 19:27:24 crc kubenswrapper[4878]: I1202 19:27:24.036356 4878 generic.go:334] "Generic (PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" containerID="cf6c6e19bfc472b390886743cf5b1b51ea9d0fa8608ca59cf167649361a49ada" exitCode=0 Dec 02 19:27:24 crc kubenswrapper[4878]: I1202 19:27:24.036418 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"cf6c6e19bfc472b390886743cf5b1b51ea9d0fa8608ca59cf167649361a49ada"} Dec 02 19:27:24 crc kubenswrapper[4878]: I1202 19:27:24.036705 4878 scope.go:117] "RemoveContainer" containerID="d151984d226064a6d811053b3885fcef5deaaa82a1fbea901deb67d730d38bcc" Dec 02 19:27:25 crc kubenswrapper[4878]: I1202 19:27:25.048732 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074"} Dec 02 19:28:50 crc kubenswrapper[4878]: I1202 19:28:50.753758 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2rn8d"] Dec 02 19:28:50 crc kubenswrapper[4878]: E1202 19:28:50.754836 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51fde784-46c2-4c2e-ab50-c19a6ff326e5" containerName="extract-content" Dec 02 19:28:50 crc kubenswrapper[4878]: I1202 19:28:50.754850 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="51fde784-46c2-4c2e-ab50-c19a6ff326e5" containerName="extract-content" 
Dec 02 19:28:50 crc kubenswrapper[4878]: E1202 19:28:50.754891 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51fde784-46c2-4c2e-ab50-c19a6ff326e5" containerName="registry-server" Dec 02 19:28:50 crc kubenswrapper[4878]: I1202 19:28:50.754897 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="51fde784-46c2-4c2e-ab50-c19a6ff326e5" containerName="registry-server" Dec 02 19:28:50 crc kubenswrapper[4878]: E1202 19:28:50.754928 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51fde784-46c2-4c2e-ab50-c19a6ff326e5" containerName="extract-utilities" Dec 02 19:28:50 crc kubenswrapper[4878]: I1202 19:28:50.754935 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="51fde784-46c2-4c2e-ab50-c19a6ff326e5" containerName="extract-utilities" Dec 02 19:28:50 crc kubenswrapper[4878]: I1202 19:28:50.755186 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="51fde784-46c2-4c2e-ab50-c19a6ff326e5" containerName="registry-server" Dec 02 19:28:50 crc kubenswrapper[4878]: I1202 19:28:50.757292 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2rn8d" Dec 02 19:28:50 crc kubenswrapper[4878]: I1202 19:28:50.789387 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2rn8d"] Dec 02 19:28:50 crc kubenswrapper[4878]: I1202 19:28:50.887520 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5-catalog-content\") pod \"community-operators-2rn8d\" (UID: \"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5\") " pod="openshift-marketplace/community-operators-2rn8d" Dec 02 19:28:50 crc kubenswrapper[4878]: I1202 19:28:50.887818 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwbfc\" (UniqueName: \"kubernetes.io/projected/bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5-kube-api-access-lwbfc\") pod \"community-operators-2rn8d\" (UID: \"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5\") " pod="openshift-marketplace/community-operators-2rn8d" Dec 02 19:28:50 crc kubenswrapper[4878]: I1202 19:28:50.887976 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5-utilities\") pod \"community-operators-2rn8d\" (UID: \"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5\") " pod="openshift-marketplace/community-operators-2rn8d" Dec 02 19:28:50 crc kubenswrapper[4878]: I1202 19:28:50.990196 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5-catalog-content\") pod \"community-operators-2rn8d\" (UID: \"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5\") " pod="openshift-marketplace/community-operators-2rn8d" Dec 02 19:28:50 crc kubenswrapper[4878]: I1202 19:28:50.990284 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lwbfc\" (UniqueName: \"kubernetes.io/projected/bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5-kube-api-access-lwbfc\") pod \"community-operators-2rn8d\" (UID: \"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5\") " pod="openshift-marketplace/community-operators-2rn8d" Dec 02 19:28:50 crc kubenswrapper[4878]: I1202 19:28:50.990356 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5-utilities\") pod \"community-operators-2rn8d\" (UID: \"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5\") " pod="openshift-marketplace/community-operators-2rn8d" Dec 02 19:28:50 crc kubenswrapper[4878]: I1202 19:28:50.990790 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5-catalog-content\") pod \"community-operators-2rn8d\" (UID: \"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5\") " pod="openshift-marketplace/community-operators-2rn8d" Dec 02 19:28:50 crc kubenswrapper[4878]: I1202 19:28:50.990893 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5-utilities\") pod \"community-operators-2rn8d\" (UID: \"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5\") " pod="openshift-marketplace/community-operators-2rn8d" Dec 02 19:28:51 crc kubenswrapper[4878]: I1202 19:28:51.043592 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwbfc\" (UniqueName: \"kubernetes.io/projected/bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5-kube-api-access-lwbfc\") pod \"community-operators-2rn8d\" (UID: \"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5\") " pod="openshift-marketplace/community-operators-2rn8d" Dec 02 19:28:51 crc kubenswrapper[4878]: I1202 19:28:51.088930 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2rn8d" Dec 02 19:28:51 crc kubenswrapper[4878]: I1202 19:28:51.570470 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2rn8d"] Dec 02 19:28:52 crc kubenswrapper[4878]: I1202 19:28:52.220776 4878 generic.go:334] "Generic (PLEG): container finished" podID="bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5" containerID="9be9c9145d6c13054c4e76b4f7865e03b39cf62e3f7509832c3317f49bb3d5da" exitCode=0 Dec 02 19:28:52 crc kubenswrapper[4878]: I1202 19:28:52.221016 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rn8d" event={"ID":"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5","Type":"ContainerDied","Data":"9be9c9145d6c13054c4e76b4f7865e03b39cf62e3f7509832c3317f49bb3d5da"} Dec 02 19:28:52 crc kubenswrapper[4878]: I1202 19:28:52.221043 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rn8d" event={"ID":"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5","Type":"ContainerStarted","Data":"07e52286833baf3db700cb161ee9e7a9cab39cd9b9c3ae44c3860035bdc9bf14"} Dec 02 19:28:52 crc kubenswrapper[4878]: I1202 19:28:52.222907 4878 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 19:28:54 crc kubenswrapper[4878]: I1202 19:28:54.242798 4878 generic.go:334] "Generic (PLEG): container finished" podID="bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5" containerID="657611343ab34a904d17fd511090d4b63b4bd525d7a6317e724eb298e18021f6" exitCode=0 Dec 02 19:28:54 crc kubenswrapper[4878]: I1202 19:28:54.242893 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rn8d" event={"ID":"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5","Type":"ContainerDied","Data":"657611343ab34a904d17fd511090d4b63b4bd525d7a6317e724eb298e18021f6"} Dec 02 19:28:55 crc kubenswrapper[4878]: I1202 19:28:55.267287 4878 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-2rn8d" event={"ID":"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5","Type":"ContainerStarted","Data":"5dca014c3db428ebcfae94d16217d4953503ab074be59aa96f00bfc95db0a770"} Dec 02 19:28:55 crc kubenswrapper[4878]: I1202 19:28:55.290594 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2rn8d" podStartSLOduration=2.6980328829999998 podStartE2EDuration="5.290575871s" podCreationTimestamp="2025-12-02 19:28:50 +0000 UTC" firstStartedPulling="2025-12-02 19:28:52.222608863 +0000 UTC m=+4441.912227744" lastFinishedPulling="2025-12-02 19:28:54.815151841 +0000 UTC m=+4444.504770732" observedRunningTime="2025-12-02 19:28:55.287499295 +0000 UTC m=+4444.977118186" watchObservedRunningTime="2025-12-02 19:28:55.290575871 +0000 UTC m=+4444.980194752" Dec 02 19:29:01 crc kubenswrapper[4878]: I1202 19:29:01.089992 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2rn8d" Dec 02 19:29:01 crc kubenswrapper[4878]: I1202 19:29:01.090654 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2rn8d" Dec 02 19:29:01 crc kubenswrapper[4878]: I1202 19:29:01.144777 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2rn8d" Dec 02 19:29:01 crc kubenswrapper[4878]: I1202 19:29:01.424678 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2rn8d" Dec 02 19:29:01 crc kubenswrapper[4878]: I1202 19:29:01.470556 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2rn8d"] Dec 02 19:29:03 crc kubenswrapper[4878]: I1202 19:29:03.366253 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2rn8d" 
podUID="bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5" containerName="registry-server" containerID="cri-o://5dca014c3db428ebcfae94d16217d4953503ab074be59aa96f00bfc95db0a770" gracePeriod=2 Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.035722 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2rn8d" Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.143744 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5-utilities\") pod \"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5\" (UID: \"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5\") " Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.143988 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwbfc\" (UniqueName: \"kubernetes.io/projected/bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5-kube-api-access-lwbfc\") pod \"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5\" (UID: \"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5\") " Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.144051 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5-catalog-content\") pod \"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5\" (UID: \"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5\") " Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.147930 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5-utilities" (OuterVolumeSpecName: "utilities") pod "bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5" (UID: "bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.161158 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5-kube-api-access-lwbfc" (OuterVolumeSpecName: "kube-api-access-lwbfc") pod "bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5" (UID: "bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5"). InnerVolumeSpecName "kube-api-access-lwbfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.197761 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5" (UID: "bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.247010 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.247047 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwbfc\" (UniqueName: \"kubernetes.io/projected/bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5-kube-api-access-lwbfc\") on node \"crc\" DevicePath \"\"" Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.247062 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.385181 4878 generic.go:334] "Generic (PLEG): container finished" podID="bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5" 
containerID="5dca014c3db428ebcfae94d16217d4953503ab074be59aa96f00bfc95db0a770" exitCode=0 Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.385605 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rn8d" event={"ID":"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5","Type":"ContainerDied","Data":"5dca014c3db428ebcfae94d16217d4953503ab074be59aa96f00bfc95db0a770"} Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.385644 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rn8d" event={"ID":"bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5","Type":"ContainerDied","Data":"07e52286833baf3db700cb161ee9e7a9cab39cd9b9c3ae44c3860035bdc9bf14"} Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.385673 4878 scope.go:117] "RemoveContainer" containerID="5dca014c3db428ebcfae94d16217d4953503ab074be59aa96f00bfc95db0a770" Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.385912 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2rn8d" Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.427439 4878 scope.go:117] "RemoveContainer" containerID="657611343ab34a904d17fd511090d4b63b4bd525d7a6317e724eb298e18021f6" Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.437556 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2rn8d"] Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.449911 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2rn8d"] Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.467226 4878 scope.go:117] "RemoveContainer" containerID="9be9c9145d6c13054c4e76b4f7865e03b39cf62e3f7509832c3317f49bb3d5da" Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.514935 4878 scope.go:117] "RemoveContainer" containerID="5dca014c3db428ebcfae94d16217d4953503ab074be59aa96f00bfc95db0a770" Dec 02 19:29:04 crc kubenswrapper[4878]: E1202 19:29:04.515942 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dca014c3db428ebcfae94d16217d4953503ab074be59aa96f00bfc95db0a770\": container with ID starting with 5dca014c3db428ebcfae94d16217d4953503ab074be59aa96f00bfc95db0a770 not found: ID does not exist" containerID="5dca014c3db428ebcfae94d16217d4953503ab074be59aa96f00bfc95db0a770" Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.515972 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dca014c3db428ebcfae94d16217d4953503ab074be59aa96f00bfc95db0a770"} err="failed to get container status \"5dca014c3db428ebcfae94d16217d4953503ab074be59aa96f00bfc95db0a770\": rpc error: code = NotFound desc = could not find container \"5dca014c3db428ebcfae94d16217d4953503ab074be59aa96f00bfc95db0a770\": container with ID starting with 5dca014c3db428ebcfae94d16217d4953503ab074be59aa96f00bfc95db0a770 not 
found: ID does not exist" Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.515992 4878 scope.go:117] "RemoveContainer" containerID="657611343ab34a904d17fd511090d4b63b4bd525d7a6317e724eb298e18021f6" Dec 02 19:29:04 crc kubenswrapper[4878]: E1202 19:29:04.520736 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"657611343ab34a904d17fd511090d4b63b4bd525d7a6317e724eb298e18021f6\": container with ID starting with 657611343ab34a904d17fd511090d4b63b4bd525d7a6317e724eb298e18021f6 not found: ID does not exist" containerID="657611343ab34a904d17fd511090d4b63b4bd525d7a6317e724eb298e18021f6" Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.520969 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"657611343ab34a904d17fd511090d4b63b4bd525d7a6317e724eb298e18021f6"} err="failed to get container status \"657611343ab34a904d17fd511090d4b63b4bd525d7a6317e724eb298e18021f6\": rpc error: code = NotFound desc = could not find container \"657611343ab34a904d17fd511090d4b63b4bd525d7a6317e724eb298e18021f6\": container with ID starting with 657611343ab34a904d17fd511090d4b63b4bd525d7a6317e724eb298e18021f6 not found: ID does not exist" Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.520995 4878 scope.go:117] "RemoveContainer" containerID="9be9c9145d6c13054c4e76b4f7865e03b39cf62e3f7509832c3317f49bb3d5da" Dec 02 19:29:04 crc kubenswrapper[4878]: E1202 19:29:04.521684 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9be9c9145d6c13054c4e76b4f7865e03b39cf62e3f7509832c3317f49bb3d5da\": container with ID starting with 9be9c9145d6c13054c4e76b4f7865e03b39cf62e3f7509832c3317f49bb3d5da not found: ID does not exist" containerID="9be9c9145d6c13054c4e76b4f7865e03b39cf62e3f7509832c3317f49bb3d5da" Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.521741 4878 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9be9c9145d6c13054c4e76b4f7865e03b39cf62e3f7509832c3317f49bb3d5da"} err="failed to get container status \"9be9c9145d6c13054c4e76b4f7865e03b39cf62e3f7509832c3317f49bb3d5da\": rpc error: code = NotFound desc = could not find container \"9be9c9145d6c13054c4e76b4f7865e03b39cf62e3f7509832c3317f49bb3d5da\": container with ID starting with 9be9c9145d6c13054c4e76b4f7865e03b39cf62e3f7509832c3317f49bb3d5da not found: ID does not exist" Dec 02 19:29:04 crc kubenswrapper[4878]: I1202 19:29:04.952870 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5" path="/var/lib/kubelet/pods/bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5/volumes" Dec 02 19:29:53 crc kubenswrapper[4878]: I1202 19:29:53.742604 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:29:53 crc kubenswrapper[4878]: I1202 19:29:53.743124 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:30:00 crc kubenswrapper[4878]: I1202 19:30:00.175297 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411730-pwz8h"] Dec 02 19:30:00 crc kubenswrapper[4878]: E1202 19:30:00.176717 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5" containerName="extract-utilities" Dec 02 19:30:00 crc kubenswrapper[4878]: I1202 19:30:00.176741 4878 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5" containerName="extract-utilities" Dec 02 19:30:00 crc kubenswrapper[4878]: E1202 19:30:00.176801 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5" containerName="extract-content" Dec 02 19:30:00 crc kubenswrapper[4878]: I1202 19:30:00.176813 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5" containerName="extract-content" Dec 02 19:30:00 crc kubenswrapper[4878]: E1202 19:30:00.176859 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5" containerName="registry-server" Dec 02 19:30:00 crc kubenswrapper[4878]: I1202 19:30:00.176872 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5" containerName="registry-server" Dec 02 19:30:00 crc kubenswrapper[4878]: I1202 19:30:00.177469 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd1964fd-ce7b-4e12-a700-eaca9ad7c0f5" containerName="registry-server" Dec 02 19:30:00 crc kubenswrapper[4878]: I1202 19:30:00.178889 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411730-pwz8h" Dec 02 19:30:00 crc kubenswrapper[4878]: I1202 19:30:00.180938 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 19:30:00 crc kubenswrapper[4878]: I1202 19:30:00.181281 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 19:30:00 crc kubenswrapper[4878]: I1202 19:30:00.192123 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411730-pwz8h"] Dec 02 19:30:00 crc kubenswrapper[4878]: I1202 19:30:00.303785 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8c98134-d2b5-478c-aa51-bba90e582526-config-volume\") pod \"collect-profiles-29411730-pwz8h\" (UID: \"a8c98134-d2b5-478c-aa51-bba90e582526\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411730-pwz8h" Dec 02 19:30:00 crc kubenswrapper[4878]: I1202 19:30:00.303932 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8c98134-d2b5-478c-aa51-bba90e582526-secret-volume\") pod \"collect-profiles-29411730-pwz8h\" (UID: \"a8c98134-d2b5-478c-aa51-bba90e582526\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411730-pwz8h" Dec 02 19:30:00 crc kubenswrapper[4878]: I1202 19:30:00.303992 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv7dp\" (UniqueName: \"kubernetes.io/projected/a8c98134-d2b5-478c-aa51-bba90e582526-kube-api-access-zv7dp\") pod \"collect-profiles-29411730-pwz8h\" (UID: \"a8c98134-d2b5-478c-aa51-bba90e582526\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411730-pwz8h" Dec 02 19:30:00 crc kubenswrapper[4878]: I1202 19:30:00.406292 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8c98134-d2b5-478c-aa51-bba90e582526-secret-volume\") pod \"collect-profiles-29411730-pwz8h\" (UID: \"a8c98134-d2b5-478c-aa51-bba90e582526\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411730-pwz8h" Dec 02 19:30:00 crc kubenswrapper[4878]: I1202 19:30:00.406579 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv7dp\" (UniqueName: \"kubernetes.io/projected/a8c98134-d2b5-478c-aa51-bba90e582526-kube-api-access-zv7dp\") pod \"collect-profiles-29411730-pwz8h\" (UID: \"a8c98134-d2b5-478c-aa51-bba90e582526\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411730-pwz8h" Dec 02 19:30:00 crc kubenswrapper[4878]: I1202 19:30:00.406741 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8c98134-d2b5-478c-aa51-bba90e582526-config-volume\") pod \"collect-profiles-29411730-pwz8h\" (UID: \"a8c98134-d2b5-478c-aa51-bba90e582526\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411730-pwz8h" Dec 02 19:30:00 crc kubenswrapper[4878]: I1202 19:30:00.407939 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8c98134-d2b5-478c-aa51-bba90e582526-config-volume\") pod \"collect-profiles-29411730-pwz8h\" (UID: \"a8c98134-d2b5-478c-aa51-bba90e582526\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411730-pwz8h" Dec 02 19:30:00 crc kubenswrapper[4878]: I1202 19:30:00.412645 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a8c98134-d2b5-478c-aa51-bba90e582526-secret-volume\") pod \"collect-profiles-29411730-pwz8h\" (UID: \"a8c98134-d2b5-478c-aa51-bba90e582526\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411730-pwz8h" Dec 02 19:30:00 crc kubenswrapper[4878]: I1202 19:30:00.425979 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv7dp\" (UniqueName: \"kubernetes.io/projected/a8c98134-d2b5-478c-aa51-bba90e582526-kube-api-access-zv7dp\") pod \"collect-profiles-29411730-pwz8h\" (UID: \"a8c98134-d2b5-478c-aa51-bba90e582526\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411730-pwz8h" Dec 02 19:30:00 crc kubenswrapper[4878]: I1202 19:30:00.512413 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411730-pwz8h" Dec 02 19:30:00 crc kubenswrapper[4878]: I1202 19:30:00.952301 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411730-pwz8h"] Dec 02 19:30:02 crc kubenswrapper[4878]: I1202 19:30:02.129021 4878 generic.go:334] "Generic (PLEG): container finished" podID="a8c98134-d2b5-478c-aa51-bba90e582526" containerID="20459449b0a0d0032e8cfa3aa1fdc0d8eaed2c4595972fe7c91965860fdabb9b" exitCode=0 Dec 02 19:30:02 crc kubenswrapper[4878]: I1202 19:30:02.129183 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411730-pwz8h" event={"ID":"a8c98134-d2b5-478c-aa51-bba90e582526","Type":"ContainerDied","Data":"20459449b0a0d0032e8cfa3aa1fdc0d8eaed2c4595972fe7c91965860fdabb9b"} Dec 02 19:30:02 crc kubenswrapper[4878]: I1202 19:30:02.129629 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411730-pwz8h" 
event={"ID":"a8c98134-d2b5-478c-aa51-bba90e582526","Type":"ContainerStarted","Data":"88d6de55dc5167f2f7577da79a0af860473525a9d7b468412e8c9a7fce939a42"} Dec 02 19:30:03 crc kubenswrapper[4878]: I1202 19:30:03.583068 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411730-pwz8h" Dec 02 19:30:03 crc kubenswrapper[4878]: I1202 19:30:03.782012 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8c98134-d2b5-478c-aa51-bba90e582526-config-volume\") pod \"a8c98134-d2b5-478c-aa51-bba90e582526\" (UID: \"a8c98134-d2b5-478c-aa51-bba90e582526\") " Dec 02 19:30:03 crc kubenswrapper[4878]: I1202 19:30:03.782199 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv7dp\" (UniqueName: \"kubernetes.io/projected/a8c98134-d2b5-478c-aa51-bba90e582526-kube-api-access-zv7dp\") pod \"a8c98134-d2b5-478c-aa51-bba90e582526\" (UID: \"a8c98134-d2b5-478c-aa51-bba90e582526\") " Dec 02 19:30:03 crc kubenswrapper[4878]: I1202 19:30:03.782259 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8c98134-d2b5-478c-aa51-bba90e582526-secret-volume\") pod \"a8c98134-d2b5-478c-aa51-bba90e582526\" (UID: \"a8c98134-d2b5-478c-aa51-bba90e582526\") " Dec 02 19:30:03 crc kubenswrapper[4878]: I1202 19:30:03.783172 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8c98134-d2b5-478c-aa51-bba90e582526-config-volume" (OuterVolumeSpecName: "config-volume") pod "a8c98134-d2b5-478c-aa51-bba90e582526" (UID: "a8c98134-d2b5-478c-aa51-bba90e582526"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:30:03 crc kubenswrapper[4878]: I1202 19:30:03.788018 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c98134-d2b5-478c-aa51-bba90e582526-kube-api-access-zv7dp" (OuterVolumeSpecName: "kube-api-access-zv7dp") pod "a8c98134-d2b5-478c-aa51-bba90e582526" (UID: "a8c98134-d2b5-478c-aa51-bba90e582526"). InnerVolumeSpecName "kube-api-access-zv7dp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:30:03 crc kubenswrapper[4878]: I1202 19:30:03.792359 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c98134-d2b5-478c-aa51-bba90e582526-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a8c98134-d2b5-478c-aa51-bba90e582526" (UID: "a8c98134-d2b5-478c-aa51-bba90e582526"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:30:03 crc kubenswrapper[4878]: I1202 19:30:03.884502 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv7dp\" (UniqueName: \"kubernetes.io/projected/a8c98134-d2b5-478c-aa51-bba90e582526-kube-api-access-zv7dp\") on node \"crc\" DevicePath \"\"" Dec 02 19:30:03 crc kubenswrapper[4878]: I1202 19:30:03.884538 4878 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8c98134-d2b5-478c-aa51-bba90e582526-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 19:30:03 crc kubenswrapper[4878]: I1202 19:30:03.884550 4878 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8c98134-d2b5-478c-aa51-bba90e582526-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 19:30:04 crc kubenswrapper[4878]: I1202 19:30:04.152424 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411730-pwz8h" 
event={"ID":"a8c98134-d2b5-478c-aa51-bba90e582526","Type":"ContainerDied","Data":"88d6de55dc5167f2f7577da79a0af860473525a9d7b468412e8c9a7fce939a42"} Dec 02 19:30:04 crc kubenswrapper[4878]: I1202 19:30:04.152486 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88d6de55dc5167f2f7577da79a0af860473525a9d7b468412e8c9a7fce939a42" Dec 02 19:30:04 crc kubenswrapper[4878]: I1202 19:30:04.152493 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411730-pwz8h" Dec 02 19:30:04 crc kubenswrapper[4878]: I1202 19:30:04.672769 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq"] Dec 02 19:30:04 crc kubenswrapper[4878]: I1202 19:30:04.691365 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411685-jtggq"] Dec 02 19:30:04 crc kubenswrapper[4878]: I1202 19:30:04.957170 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="166b1c46-c3d9-42b9-a1f3-6925d96a6a03" path="/var/lib/kubelet/pods/166b1c46-c3d9-42b9-a1f3-6925d96a6a03/volumes" Dec 02 19:30:05 crc kubenswrapper[4878]: I1202 19:30:05.407268 4878 scope.go:117] "RemoveContainer" containerID="9f9f3d63a16ce958842e1b771f17538e6a32ebe40dd005098b0f8d9a71db00d9" Dec 02 19:30:21 crc kubenswrapper[4878]: I1202 19:30:21.454018 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zm2ws"] Dec 02 19:30:21 crc kubenswrapper[4878]: E1202 19:30:21.455686 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c98134-d2b5-478c-aa51-bba90e582526" containerName="collect-profiles" Dec 02 19:30:21 crc kubenswrapper[4878]: I1202 19:30:21.455718 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c98134-d2b5-478c-aa51-bba90e582526" containerName="collect-profiles" Dec 02 19:30:21 crc 
kubenswrapper[4878]: I1202 19:30:21.456274 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c98134-d2b5-478c-aa51-bba90e582526" containerName="collect-profiles" Dec 02 19:30:21 crc kubenswrapper[4878]: I1202 19:30:21.460005 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zm2ws" Dec 02 19:30:21 crc kubenswrapper[4878]: I1202 19:30:21.471590 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zm2ws"] Dec 02 19:30:21 crc kubenswrapper[4878]: I1202 19:30:21.535049 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaf9f44a-51be-4fbf-ba3b-35677a65e29b-utilities\") pod \"redhat-operators-zm2ws\" (UID: \"aaf9f44a-51be-4fbf-ba3b-35677a65e29b\") " pod="openshift-marketplace/redhat-operators-zm2ws" Dec 02 19:30:21 crc kubenswrapper[4878]: I1202 19:30:21.535152 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnmgd\" (UniqueName: \"kubernetes.io/projected/aaf9f44a-51be-4fbf-ba3b-35677a65e29b-kube-api-access-tnmgd\") pod \"redhat-operators-zm2ws\" (UID: \"aaf9f44a-51be-4fbf-ba3b-35677a65e29b\") " pod="openshift-marketplace/redhat-operators-zm2ws" Dec 02 19:30:21 crc kubenswrapper[4878]: I1202 19:30:21.535222 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaf9f44a-51be-4fbf-ba3b-35677a65e29b-catalog-content\") pod \"redhat-operators-zm2ws\" (UID: \"aaf9f44a-51be-4fbf-ba3b-35677a65e29b\") " pod="openshift-marketplace/redhat-operators-zm2ws" Dec 02 19:30:21 crc kubenswrapper[4878]: I1202 19:30:21.638082 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/aaf9f44a-51be-4fbf-ba3b-35677a65e29b-utilities\") pod \"redhat-operators-zm2ws\" (UID: \"aaf9f44a-51be-4fbf-ba3b-35677a65e29b\") " pod="openshift-marketplace/redhat-operators-zm2ws" Dec 02 19:30:21 crc kubenswrapper[4878]: I1202 19:30:21.638164 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnmgd\" (UniqueName: \"kubernetes.io/projected/aaf9f44a-51be-4fbf-ba3b-35677a65e29b-kube-api-access-tnmgd\") pod \"redhat-operators-zm2ws\" (UID: \"aaf9f44a-51be-4fbf-ba3b-35677a65e29b\") " pod="openshift-marketplace/redhat-operators-zm2ws" Dec 02 19:30:21 crc kubenswrapper[4878]: I1202 19:30:21.638211 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaf9f44a-51be-4fbf-ba3b-35677a65e29b-catalog-content\") pod \"redhat-operators-zm2ws\" (UID: \"aaf9f44a-51be-4fbf-ba3b-35677a65e29b\") " pod="openshift-marketplace/redhat-operators-zm2ws" Dec 02 19:30:21 crc kubenswrapper[4878]: I1202 19:30:21.638705 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaf9f44a-51be-4fbf-ba3b-35677a65e29b-utilities\") pod \"redhat-operators-zm2ws\" (UID: \"aaf9f44a-51be-4fbf-ba3b-35677a65e29b\") " pod="openshift-marketplace/redhat-operators-zm2ws" Dec 02 19:30:21 crc kubenswrapper[4878]: I1202 19:30:21.638853 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaf9f44a-51be-4fbf-ba3b-35677a65e29b-catalog-content\") pod \"redhat-operators-zm2ws\" (UID: \"aaf9f44a-51be-4fbf-ba3b-35677a65e29b\") " pod="openshift-marketplace/redhat-operators-zm2ws" Dec 02 19:30:21 crc kubenswrapper[4878]: I1202 19:30:21.668557 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnmgd\" (UniqueName: 
\"kubernetes.io/projected/aaf9f44a-51be-4fbf-ba3b-35677a65e29b-kube-api-access-tnmgd\") pod \"redhat-operators-zm2ws\" (UID: \"aaf9f44a-51be-4fbf-ba3b-35677a65e29b\") " pod="openshift-marketplace/redhat-operators-zm2ws" Dec 02 19:30:21 crc kubenswrapper[4878]: I1202 19:30:21.804802 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zm2ws" Dec 02 19:30:22 crc kubenswrapper[4878]: I1202 19:30:22.321377 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zm2ws"] Dec 02 19:30:22 crc kubenswrapper[4878]: I1202 19:30:22.363703 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm2ws" event={"ID":"aaf9f44a-51be-4fbf-ba3b-35677a65e29b","Type":"ContainerStarted","Data":"92129514198c96dbfe2959013945daf8cd12a49acad32ae7bf05156668d7eec2"} Dec 02 19:30:23 crc kubenswrapper[4878]: I1202 19:30:23.381863 4878 generic.go:334] "Generic (PLEG): container finished" podID="aaf9f44a-51be-4fbf-ba3b-35677a65e29b" containerID="da00a780f78a81785c367c3ff85602b60f814f0357fb53d35efa00691f05e3dd" exitCode=0 Dec 02 19:30:23 crc kubenswrapper[4878]: I1202 19:30:23.381979 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm2ws" event={"ID":"aaf9f44a-51be-4fbf-ba3b-35677a65e29b","Type":"ContainerDied","Data":"da00a780f78a81785c367c3ff85602b60f814f0357fb53d35efa00691f05e3dd"} Dec 02 19:30:23 crc kubenswrapper[4878]: I1202 19:30:23.742515 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:30:23 crc kubenswrapper[4878]: I1202 19:30:23.742600 4878 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:30:25 crc kubenswrapper[4878]: I1202 19:30:25.410941 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm2ws" event={"ID":"aaf9f44a-51be-4fbf-ba3b-35677a65e29b","Type":"ContainerStarted","Data":"8e87ab0f7f6d76a11fa3691b72aacef8947acbe20ce9f760d13c046c6ab026d5"} Dec 02 19:30:28 crc kubenswrapper[4878]: I1202 19:30:28.445720 4878 generic.go:334] "Generic (PLEG): container finished" podID="aaf9f44a-51be-4fbf-ba3b-35677a65e29b" containerID="8e87ab0f7f6d76a11fa3691b72aacef8947acbe20ce9f760d13c046c6ab026d5" exitCode=0 Dec 02 19:30:28 crc kubenswrapper[4878]: I1202 19:30:28.445794 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm2ws" event={"ID":"aaf9f44a-51be-4fbf-ba3b-35677a65e29b","Type":"ContainerDied","Data":"8e87ab0f7f6d76a11fa3691b72aacef8947acbe20ce9f760d13c046c6ab026d5"} Dec 02 19:30:30 crc kubenswrapper[4878]: I1202 19:30:30.469228 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm2ws" event={"ID":"aaf9f44a-51be-4fbf-ba3b-35677a65e29b","Type":"ContainerStarted","Data":"749b63297a8a3297a279ee12370f3bfb66d1c1d48a3cc97175c8c69a872b7939"} Dec 02 19:30:30 crc kubenswrapper[4878]: I1202 19:30:30.496282 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zm2ws" podStartSLOduration=3.642744122 podStartE2EDuration="9.496259237s" podCreationTimestamp="2025-12-02 19:30:21 +0000 UTC" firstStartedPulling="2025-12-02 19:30:23.385756525 +0000 UTC m=+4533.075375406" lastFinishedPulling="2025-12-02 19:30:29.23927164 +0000 UTC m=+4538.928890521" observedRunningTime="2025-12-02 
19:30:30.484955013 +0000 UTC m=+4540.174573924" watchObservedRunningTime="2025-12-02 19:30:30.496259237 +0000 UTC m=+4540.185878108" Dec 02 19:30:31 crc kubenswrapper[4878]: I1202 19:30:31.805772 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zm2ws" Dec 02 19:30:31 crc kubenswrapper[4878]: I1202 19:30:31.806087 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zm2ws" Dec 02 19:30:32 crc kubenswrapper[4878]: I1202 19:30:32.879667 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zm2ws" podUID="aaf9f44a-51be-4fbf-ba3b-35677a65e29b" containerName="registry-server" probeResult="failure" output=< Dec 02 19:30:32 crc kubenswrapper[4878]: timeout: failed to connect service ":50051" within 1s Dec 02 19:30:32 crc kubenswrapper[4878]: > Dec 02 19:30:41 crc kubenswrapper[4878]: I1202 19:30:41.852942 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zm2ws" Dec 02 19:30:41 crc kubenswrapper[4878]: I1202 19:30:41.901050 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zm2ws" Dec 02 19:30:42 crc kubenswrapper[4878]: I1202 19:30:42.095425 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zm2ws"] Dec 02 19:30:43 crc kubenswrapper[4878]: I1202 19:30:43.615719 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zm2ws" podUID="aaf9f44a-51be-4fbf-ba3b-35677a65e29b" containerName="registry-server" containerID="cri-o://749b63297a8a3297a279ee12370f3bfb66d1c1d48a3cc97175c8c69a872b7939" gracePeriod=2 Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.139511 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zm2ws" Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.193824 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnmgd\" (UniqueName: \"kubernetes.io/projected/aaf9f44a-51be-4fbf-ba3b-35677a65e29b-kube-api-access-tnmgd\") pod \"aaf9f44a-51be-4fbf-ba3b-35677a65e29b\" (UID: \"aaf9f44a-51be-4fbf-ba3b-35677a65e29b\") " Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.193893 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaf9f44a-51be-4fbf-ba3b-35677a65e29b-catalog-content\") pod \"aaf9f44a-51be-4fbf-ba3b-35677a65e29b\" (UID: \"aaf9f44a-51be-4fbf-ba3b-35677a65e29b\") " Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.194028 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaf9f44a-51be-4fbf-ba3b-35677a65e29b-utilities\") pod \"aaf9f44a-51be-4fbf-ba3b-35677a65e29b\" (UID: \"aaf9f44a-51be-4fbf-ba3b-35677a65e29b\") " Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.195626 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaf9f44a-51be-4fbf-ba3b-35677a65e29b-utilities" (OuterVolumeSpecName: "utilities") pod "aaf9f44a-51be-4fbf-ba3b-35677a65e29b" (UID: "aaf9f44a-51be-4fbf-ba3b-35677a65e29b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.201695 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaf9f44a-51be-4fbf-ba3b-35677a65e29b-kube-api-access-tnmgd" (OuterVolumeSpecName: "kube-api-access-tnmgd") pod "aaf9f44a-51be-4fbf-ba3b-35677a65e29b" (UID: "aaf9f44a-51be-4fbf-ba3b-35677a65e29b"). InnerVolumeSpecName "kube-api-access-tnmgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.297213 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnmgd\" (UniqueName: \"kubernetes.io/projected/aaf9f44a-51be-4fbf-ba3b-35677a65e29b-kube-api-access-tnmgd\") on node \"crc\" DevicePath \"\"" Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.297469 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaf9f44a-51be-4fbf-ba3b-35677a65e29b-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.308699 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaf9f44a-51be-4fbf-ba3b-35677a65e29b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aaf9f44a-51be-4fbf-ba3b-35677a65e29b" (UID: "aaf9f44a-51be-4fbf-ba3b-35677a65e29b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.399519 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaf9f44a-51be-4fbf-ba3b-35677a65e29b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.628992 4878 generic.go:334] "Generic (PLEG): container finished" podID="aaf9f44a-51be-4fbf-ba3b-35677a65e29b" containerID="749b63297a8a3297a279ee12370f3bfb66d1c1d48a3cc97175c8c69a872b7939" exitCode=0 Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.629052 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zm2ws" Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.629053 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm2ws" event={"ID":"aaf9f44a-51be-4fbf-ba3b-35677a65e29b","Type":"ContainerDied","Data":"749b63297a8a3297a279ee12370f3bfb66d1c1d48a3cc97175c8c69a872b7939"} Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.630546 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm2ws" event={"ID":"aaf9f44a-51be-4fbf-ba3b-35677a65e29b","Type":"ContainerDied","Data":"92129514198c96dbfe2959013945daf8cd12a49acad32ae7bf05156668d7eec2"} Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.630572 4878 scope.go:117] "RemoveContainer" containerID="749b63297a8a3297a279ee12370f3bfb66d1c1d48a3cc97175c8c69a872b7939" Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.666635 4878 scope.go:117] "RemoveContainer" containerID="8e87ab0f7f6d76a11fa3691b72aacef8947acbe20ce9f760d13c046c6ab026d5" Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.714656 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zm2ws"] Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.720805 4878 scope.go:117] "RemoveContainer" containerID="da00a780f78a81785c367c3ff85602b60f814f0357fb53d35efa00691f05e3dd" Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.736585 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zm2ws"] Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.776324 4878 scope.go:117] "RemoveContainer" containerID="749b63297a8a3297a279ee12370f3bfb66d1c1d48a3cc97175c8c69a872b7939" Dec 02 19:30:44 crc kubenswrapper[4878]: E1202 19:30:44.776657 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"749b63297a8a3297a279ee12370f3bfb66d1c1d48a3cc97175c8c69a872b7939\": container with ID starting with 749b63297a8a3297a279ee12370f3bfb66d1c1d48a3cc97175c8c69a872b7939 not found: ID does not exist" containerID="749b63297a8a3297a279ee12370f3bfb66d1c1d48a3cc97175c8c69a872b7939" Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.776696 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749b63297a8a3297a279ee12370f3bfb66d1c1d48a3cc97175c8c69a872b7939"} err="failed to get container status \"749b63297a8a3297a279ee12370f3bfb66d1c1d48a3cc97175c8c69a872b7939\": rpc error: code = NotFound desc = could not find container \"749b63297a8a3297a279ee12370f3bfb66d1c1d48a3cc97175c8c69a872b7939\": container with ID starting with 749b63297a8a3297a279ee12370f3bfb66d1c1d48a3cc97175c8c69a872b7939 not found: ID does not exist" Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.776716 4878 scope.go:117] "RemoveContainer" containerID="8e87ab0f7f6d76a11fa3691b72aacef8947acbe20ce9f760d13c046c6ab026d5" Dec 02 19:30:44 crc kubenswrapper[4878]: E1202 19:30:44.777081 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e87ab0f7f6d76a11fa3691b72aacef8947acbe20ce9f760d13c046c6ab026d5\": container with ID starting with 8e87ab0f7f6d76a11fa3691b72aacef8947acbe20ce9f760d13c046c6ab026d5 not found: ID does not exist" containerID="8e87ab0f7f6d76a11fa3691b72aacef8947acbe20ce9f760d13c046c6ab026d5" Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.777099 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e87ab0f7f6d76a11fa3691b72aacef8947acbe20ce9f760d13c046c6ab026d5"} err="failed to get container status \"8e87ab0f7f6d76a11fa3691b72aacef8947acbe20ce9f760d13c046c6ab026d5\": rpc error: code = NotFound desc = could not find container \"8e87ab0f7f6d76a11fa3691b72aacef8947acbe20ce9f760d13c046c6ab026d5\": container with ID 
starting with 8e87ab0f7f6d76a11fa3691b72aacef8947acbe20ce9f760d13c046c6ab026d5 not found: ID does not exist" Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.777110 4878 scope.go:117] "RemoveContainer" containerID="da00a780f78a81785c367c3ff85602b60f814f0357fb53d35efa00691f05e3dd" Dec 02 19:30:44 crc kubenswrapper[4878]: E1202 19:30:44.777474 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da00a780f78a81785c367c3ff85602b60f814f0357fb53d35efa00691f05e3dd\": container with ID starting with da00a780f78a81785c367c3ff85602b60f814f0357fb53d35efa00691f05e3dd not found: ID does not exist" containerID="da00a780f78a81785c367c3ff85602b60f814f0357fb53d35efa00691f05e3dd" Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.777548 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da00a780f78a81785c367c3ff85602b60f814f0357fb53d35efa00691f05e3dd"} err="failed to get container status \"da00a780f78a81785c367c3ff85602b60f814f0357fb53d35efa00691f05e3dd\": rpc error: code = NotFound desc = could not find container \"da00a780f78a81785c367c3ff85602b60f814f0357fb53d35efa00691f05e3dd\": container with ID starting with da00a780f78a81785c367c3ff85602b60f814f0357fb53d35efa00691f05e3dd not found: ID does not exist" Dec 02 19:30:44 crc kubenswrapper[4878]: I1202 19:30:44.954103 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaf9f44a-51be-4fbf-ba3b-35677a65e29b" path="/var/lib/kubelet/pods/aaf9f44a-51be-4fbf-ba3b-35677a65e29b/volumes" Dec 02 19:30:53 crc kubenswrapper[4878]: I1202 19:30:53.742650 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:30:53 crc kubenswrapper[4878]: I1202 
19:30:53.743175 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:30:53 crc kubenswrapper[4878]: I1202 19:30:53.743222 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 19:30:53 crc kubenswrapper[4878]: I1202 19:30:53.744071 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074"} pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 19:30:53 crc kubenswrapper[4878]: I1202 19:30:53.744118 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074" gracePeriod=600 Dec 02 19:30:53 crc kubenswrapper[4878]: E1202 19:30:53.879034 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:30:54 crc kubenswrapper[4878]: I1202 19:30:54.745090 4878 generic.go:334] "Generic (PLEG): container finished" 
podID="723bfeea-9234-4d2a-8492-747dc974d044" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074" exitCode=0 Dec 02 19:30:54 crc kubenswrapper[4878]: I1202 19:30:54.745136 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074"} Dec 02 19:30:54 crc kubenswrapper[4878]: I1202 19:30:54.745176 4878 scope.go:117] "RemoveContainer" containerID="cf6c6e19bfc472b390886743cf5b1b51ea9d0fa8608ca59cf167649361a49ada" Dec 02 19:30:54 crc kubenswrapper[4878]: I1202 19:30:54.746439 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074" Dec 02 19:30:54 crc kubenswrapper[4878]: E1202 19:30:54.747321 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:31:07 crc kubenswrapper[4878]: I1202 19:31:07.938346 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074" Dec 02 19:31:07 crc kubenswrapper[4878]: E1202 19:31:07.939286 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 
19:31:21 crc kubenswrapper[4878]: I1202 19:31:21.938482 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074" Dec 02 19:31:21 crc kubenswrapper[4878]: E1202 19:31:21.939422 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:31:36 crc kubenswrapper[4878]: I1202 19:31:36.939973 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074" Dec 02 19:31:36 crc kubenswrapper[4878]: E1202 19:31:36.940783 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:31:41 crc kubenswrapper[4878]: I1202 19:31:41.729886 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-75xz6"] Dec 02 19:31:41 crc kubenswrapper[4878]: E1202 19:31:41.730901 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaf9f44a-51be-4fbf-ba3b-35677a65e29b" containerName="extract-content" Dec 02 19:31:41 crc kubenswrapper[4878]: I1202 19:31:41.730917 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf9f44a-51be-4fbf-ba3b-35677a65e29b" containerName="extract-content" Dec 02 19:31:41 crc kubenswrapper[4878]: E1202 19:31:41.730954 4878 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="aaf9f44a-51be-4fbf-ba3b-35677a65e29b" containerName="extract-utilities" Dec 02 19:31:41 crc kubenswrapper[4878]: I1202 19:31:41.730963 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf9f44a-51be-4fbf-ba3b-35677a65e29b" containerName="extract-utilities" Dec 02 19:31:41 crc kubenswrapper[4878]: E1202 19:31:41.730986 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaf9f44a-51be-4fbf-ba3b-35677a65e29b" containerName="registry-server" Dec 02 19:31:41 crc kubenswrapper[4878]: I1202 19:31:41.730995 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf9f44a-51be-4fbf-ba3b-35677a65e29b" containerName="registry-server" Dec 02 19:31:41 crc kubenswrapper[4878]: I1202 19:31:41.731297 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaf9f44a-51be-4fbf-ba3b-35677a65e29b" containerName="registry-server" Dec 02 19:31:41 crc kubenswrapper[4878]: I1202 19:31:41.735033 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-75xz6" Dec 02 19:31:41 crc kubenswrapper[4878]: I1202 19:31:41.757113 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-75xz6"] Dec 02 19:31:41 crc kubenswrapper[4878]: I1202 19:31:41.860869 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bcd5fd9-bb04-42bd-b97f-04cb287b51c2-catalog-content\") pod \"redhat-marketplace-75xz6\" (UID: \"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2\") " pod="openshift-marketplace/redhat-marketplace-75xz6" Dec 02 19:31:41 crc kubenswrapper[4878]: I1202 19:31:41.861185 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6r99\" (UniqueName: \"kubernetes.io/projected/5bcd5fd9-bb04-42bd-b97f-04cb287b51c2-kube-api-access-h6r99\") pod \"redhat-marketplace-75xz6\" (UID: \"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2\") " pod="openshift-marketplace/redhat-marketplace-75xz6" Dec 02 19:31:41 crc kubenswrapper[4878]: I1202 19:31:41.861663 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bcd5fd9-bb04-42bd-b97f-04cb287b51c2-utilities\") pod \"redhat-marketplace-75xz6\" (UID: \"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2\") " pod="openshift-marketplace/redhat-marketplace-75xz6" Dec 02 19:31:41 crc kubenswrapper[4878]: I1202 19:31:41.967043 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6r99\" (UniqueName: \"kubernetes.io/projected/5bcd5fd9-bb04-42bd-b97f-04cb287b51c2-kube-api-access-h6r99\") pod \"redhat-marketplace-75xz6\" (UID: \"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2\") " pod="openshift-marketplace/redhat-marketplace-75xz6" Dec 02 19:31:41 crc kubenswrapper[4878]: I1202 19:31:41.967481 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bcd5fd9-bb04-42bd-b97f-04cb287b51c2-utilities\") pod \"redhat-marketplace-75xz6\" (UID: \"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2\") " pod="openshift-marketplace/redhat-marketplace-75xz6" Dec 02 19:31:41 crc kubenswrapper[4878]: I1202 19:31:41.967981 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bcd5fd9-bb04-42bd-b97f-04cb287b51c2-utilities\") pod \"redhat-marketplace-75xz6\" (UID: \"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2\") " pod="openshift-marketplace/redhat-marketplace-75xz6" Dec 02 19:31:41 crc kubenswrapper[4878]: I1202 19:31:41.968108 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bcd5fd9-bb04-42bd-b97f-04cb287b51c2-catalog-content\") pod \"redhat-marketplace-75xz6\" (UID: \"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2\") " pod="openshift-marketplace/redhat-marketplace-75xz6" Dec 02 19:31:41 crc kubenswrapper[4878]: I1202 19:31:41.968489 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bcd5fd9-bb04-42bd-b97f-04cb287b51c2-catalog-content\") pod \"redhat-marketplace-75xz6\" (UID: \"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2\") " pod="openshift-marketplace/redhat-marketplace-75xz6" Dec 02 19:31:42 crc kubenswrapper[4878]: I1202 19:31:42.074375 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6r99\" (UniqueName: \"kubernetes.io/projected/5bcd5fd9-bb04-42bd-b97f-04cb287b51c2-kube-api-access-h6r99\") pod \"redhat-marketplace-75xz6\" (UID: \"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2\") " pod="openshift-marketplace/redhat-marketplace-75xz6" Dec 02 19:31:42 crc kubenswrapper[4878]: I1202 19:31:42.361016 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-75xz6" Dec 02 19:31:42 crc kubenswrapper[4878]: I1202 19:31:42.840317 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-75xz6"] Dec 02 19:31:42 crc kubenswrapper[4878]: W1202 19:31:42.846361 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bcd5fd9_bb04_42bd_b97f_04cb287b51c2.slice/crio-ee61b96bc7b0f67512954010778354bfc2c1064a9826b20567c7721e7aec41e7 WatchSource:0}: Error finding container ee61b96bc7b0f67512954010778354bfc2c1064a9826b20567c7721e7aec41e7: Status 404 returned error can't find the container with id ee61b96bc7b0f67512954010778354bfc2c1064a9826b20567c7721e7aec41e7 Dec 02 19:31:43 crc kubenswrapper[4878]: I1202 19:31:43.500298 4878 generic.go:334] "Generic (PLEG): container finished" podID="5bcd5fd9-bb04-42bd-b97f-04cb287b51c2" containerID="bbd6602f91ed47085c5da8c095e30483e6b15e67d392de5acfaaaf7c9402b9ba" exitCode=0 Dec 02 19:31:43 crc kubenswrapper[4878]: I1202 19:31:43.500367 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75xz6" event={"ID":"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2","Type":"ContainerDied","Data":"bbd6602f91ed47085c5da8c095e30483e6b15e67d392de5acfaaaf7c9402b9ba"} Dec 02 19:31:43 crc kubenswrapper[4878]: I1202 19:31:43.500647 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75xz6" event={"ID":"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2","Type":"ContainerStarted","Data":"ee61b96bc7b0f67512954010778354bfc2c1064a9826b20567c7721e7aec41e7"} Dec 02 19:31:45 crc kubenswrapper[4878]: I1202 19:31:45.571080 4878 generic.go:334] "Generic (PLEG): container finished" podID="5bcd5fd9-bb04-42bd-b97f-04cb287b51c2" containerID="1826cdd81fff73fea96464794a27442a6f441b83f00fcac5e4d5532ac3b970f2" exitCode=0 Dec 02 19:31:45 crc kubenswrapper[4878]: I1202 
19:31:45.571119 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75xz6" event={"ID":"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2","Type":"ContainerDied","Data":"1826cdd81fff73fea96464794a27442a6f441b83f00fcac5e4d5532ac3b970f2"} Dec 02 19:31:46 crc kubenswrapper[4878]: I1202 19:31:46.587660 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75xz6" event={"ID":"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2","Type":"ContainerStarted","Data":"2f1412176f1533d1bf6bc172d481c94a5964392e276bfb4e946a69d553bca2ae"} Dec 02 19:31:46 crc kubenswrapper[4878]: I1202 19:31:46.632114 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-75xz6" podStartSLOduration=3.078027842 podStartE2EDuration="5.632089053s" podCreationTimestamp="2025-12-02 19:31:41 +0000 UTC" firstStartedPulling="2025-12-02 19:31:43.502784701 +0000 UTC m=+4613.192403592" lastFinishedPulling="2025-12-02 19:31:46.056845902 +0000 UTC m=+4615.746464803" observedRunningTime="2025-12-02 19:31:46.605800388 +0000 UTC m=+4616.295419279" watchObservedRunningTime="2025-12-02 19:31:46.632089053 +0000 UTC m=+4616.321707934" Dec 02 19:31:47 crc kubenswrapper[4878]: I1202 19:31:47.938684 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074" Dec 02 19:31:47 crc kubenswrapper[4878]: E1202 19:31:47.939438 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:31:52 crc kubenswrapper[4878]: I1202 19:31:52.361730 4878 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-75xz6" Dec 02 19:31:52 crc kubenswrapper[4878]: I1202 19:31:52.363370 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-75xz6" Dec 02 19:31:52 crc kubenswrapper[4878]: I1202 19:31:52.441709 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-75xz6" Dec 02 19:31:52 crc kubenswrapper[4878]: I1202 19:31:52.729403 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-75xz6" Dec 02 19:31:52 crc kubenswrapper[4878]: I1202 19:31:52.784666 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-75xz6"] Dec 02 19:31:54 crc kubenswrapper[4878]: I1202 19:31:54.711473 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-75xz6" podUID="5bcd5fd9-bb04-42bd-b97f-04cb287b51c2" containerName="registry-server" containerID="cri-o://2f1412176f1533d1bf6bc172d481c94a5964392e276bfb4e946a69d553bca2ae" gracePeriod=2 Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.194769 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-75xz6" Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.324942 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6r99\" (UniqueName: \"kubernetes.io/projected/5bcd5fd9-bb04-42bd-b97f-04cb287b51c2-kube-api-access-h6r99\") pod \"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2\" (UID: \"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2\") " Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.325006 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bcd5fd9-bb04-42bd-b97f-04cb287b51c2-utilities\") pod \"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2\" (UID: \"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2\") " Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.325111 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bcd5fd9-bb04-42bd-b97f-04cb287b51c2-catalog-content\") pod \"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2\" (UID: \"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2\") " Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.326356 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bcd5fd9-bb04-42bd-b97f-04cb287b51c2-utilities" (OuterVolumeSpecName: "utilities") pod "5bcd5fd9-bb04-42bd-b97f-04cb287b51c2" (UID: "5bcd5fd9-bb04-42bd-b97f-04cb287b51c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.333496 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bcd5fd9-bb04-42bd-b97f-04cb287b51c2-kube-api-access-h6r99" (OuterVolumeSpecName: "kube-api-access-h6r99") pod "5bcd5fd9-bb04-42bd-b97f-04cb287b51c2" (UID: "5bcd5fd9-bb04-42bd-b97f-04cb287b51c2"). InnerVolumeSpecName "kube-api-access-h6r99". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.345779 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bcd5fd9-bb04-42bd-b97f-04cb287b51c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5bcd5fd9-bb04-42bd-b97f-04cb287b51c2" (UID: "5bcd5fd9-bb04-42bd-b97f-04cb287b51c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.427273 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6r99\" (UniqueName: \"kubernetes.io/projected/5bcd5fd9-bb04-42bd-b97f-04cb287b51c2-kube-api-access-h6r99\") on node \"crc\" DevicePath \"\"" Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.427305 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bcd5fd9-bb04-42bd-b97f-04cb287b51c2-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.427316 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bcd5fd9-bb04-42bd-b97f-04cb287b51c2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.740838 4878 generic.go:334] "Generic (PLEG): container finished" podID="5bcd5fd9-bb04-42bd-b97f-04cb287b51c2" containerID="2f1412176f1533d1bf6bc172d481c94a5964392e276bfb4e946a69d553bca2ae" exitCode=0 Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.740889 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75xz6" event={"ID":"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2","Type":"ContainerDied","Data":"2f1412176f1533d1bf6bc172d481c94a5964392e276bfb4e946a69d553bca2ae"} Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.740920 4878 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-75xz6" event={"ID":"5bcd5fd9-bb04-42bd-b97f-04cb287b51c2","Type":"ContainerDied","Data":"ee61b96bc7b0f67512954010778354bfc2c1064a9826b20567c7721e7aec41e7"} Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.740942 4878 scope.go:117] "RemoveContainer" containerID="2f1412176f1533d1bf6bc172d481c94a5964392e276bfb4e946a69d553bca2ae" Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.741141 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-75xz6" Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.798019 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-75xz6"] Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.805646 4878 scope.go:117] "RemoveContainer" containerID="1826cdd81fff73fea96464794a27442a6f441b83f00fcac5e4d5532ac3b970f2" Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.810937 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-75xz6"] Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.837258 4878 scope.go:117] "RemoveContainer" containerID="bbd6602f91ed47085c5da8c095e30483e6b15e67d392de5acfaaaf7c9402b9ba" Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.882385 4878 scope.go:117] "RemoveContainer" containerID="2f1412176f1533d1bf6bc172d481c94a5964392e276bfb4e946a69d553bca2ae" Dec 02 19:31:55 crc kubenswrapper[4878]: E1202 19:31:55.883081 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f1412176f1533d1bf6bc172d481c94a5964392e276bfb4e946a69d553bca2ae\": container with ID starting with 2f1412176f1533d1bf6bc172d481c94a5964392e276bfb4e946a69d553bca2ae not found: ID does not exist" containerID="2f1412176f1533d1bf6bc172d481c94a5964392e276bfb4e946a69d553bca2ae" Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.883142 4878 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1412176f1533d1bf6bc172d481c94a5964392e276bfb4e946a69d553bca2ae"} err="failed to get container status \"2f1412176f1533d1bf6bc172d481c94a5964392e276bfb4e946a69d553bca2ae\": rpc error: code = NotFound desc = could not find container \"2f1412176f1533d1bf6bc172d481c94a5964392e276bfb4e946a69d553bca2ae\": container with ID starting with 2f1412176f1533d1bf6bc172d481c94a5964392e276bfb4e946a69d553bca2ae not found: ID does not exist" Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.883181 4878 scope.go:117] "RemoveContainer" containerID="1826cdd81fff73fea96464794a27442a6f441b83f00fcac5e4d5532ac3b970f2" Dec 02 19:31:55 crc kubenswrapper[4878]: E1202 19:31:55.883985 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1826cdd81fff73fea96464794a27442a6f441b83f00fcac5e4d5532ac3b970f2\": container with ID starting with 1826cdd81fff73fea96464794a27442a6f441b83f00fcac5e4d5532ac3b970f2 not found: ID does not exist" containerID="1826cdd81fff73fea96464794a27442a6f441b83f00fcac5e4d5532ac3b970f2" Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.884037 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1826cdd81fff73fea96464794a27442a6f441b83f00fcac5e4d5532ac3b970f2"} err="failed to get container status \"1826cdd81fff73fea96464794a27442a6f441b83f00fcac5e4d5532ac3b970f2\": rpc error: code = NotFound desc = could not find container \"1826cdd81fff73fea96464794a27442a6f441b83f00fcac5e4d5532ac3b970f2\": container with ID starting with 1826cdd81fff73fea96464794a27442a6f441b83f00fcac5e4d5532ac3b970f2 not found: ID does not exist" Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.884072 4878 scope.go:117] "RemoveContainer" containerID="bbd6602f91ed47085c5da8c095e30483e6b15e67d392de5acfaaaf7c9402b9ba" Dec 02 19:31:55 crc kubenswrapper[4878]: E1202 
19:31:55.884532 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbd6602f91ed47085c5da8c095e30483e6b15e67d392de5acfaaaf7c9402b9ba\": container with ID starting with bbd6602f91ed47085c5da8c095e30483e6b15e67d392de5acfaaaf7c9402b9ba not found: ID does not exist" containerID="bbd6602f91ed47085c5da8c095e30483e6b15e67d392de5acfaaaf7c9402b9ba" Dec 02 19:31:55 crc kubenswrapper[4878]: I1202 19:31:55.884564 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbd6602f91ed47085c5da8c095e30483e6b15e67d392de5acfaaaf7c9402b9ba"} err="failed to get container status \"bbd6602f91ed47085c5da8c095e30483e6b15e67d392de5acfaaaf7c9402b9ba\": rpc error: code = NotFound desc = could not find container \"bbd6602f91ed47085c5da8c095e30483e6b15e67d392de5acfaaaf7c9402b9ba\": container with ID starting with bbd6602f91ed47085c5da8c095e30483e6b15e67d392de5acfaaaf7c9402b9ba not found: ID does not exist" Dec 02 19:31:56 crc kubenswrapper[4878]: I1202 19:31:56.955886 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bcd5fd9-bb04-42bd-b97f-04cb287b51c2" path="/var/lib/kubelet/pods/5bcd5fd9-bb04-42bd-b97f-04cb287b51c2/volumes" Dec 02 19:31:58 crc kubenswrapper[4878]: I1202 19:31:58.939393 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074" Dec 02 19:31:58 crc kubenswrapper[4878]: E1202 19:31:58.940154 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.080489 
4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 19:32:07 crc kubenswrapper[4878]: E1202 19:32:07.081486 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bcd5fd9-bb04-42bd-b97f-04cb287b51c2" containerName="registry-server" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.081502 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bcd5fd9-bb04-42bd-b97f-04cb287b51c2" containerName="registry-server" Dec 02 19:32:07 crc kubenswrapper[4878]: E1202 19:32:07.081521 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bcd5fd9-bb04-42bd-b97f-04cb287b51c2" containerName="extract-content" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.081527 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bcd5fd9-bb04-42bd-b97f-04cb287b51c2" containerName="extract-content" Dec 02 19:32:07 crc kubenswrapper[4878]: E1202 19:32:07.081542 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bcd5fd9-bb04-42bd-b97f-04cb287b51c2" containerName="extract-utilities" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.081548 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bcd5fd9-bb04-42bd-b97f-04cb287b51c2" containerName="extract-utilities" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.081797 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bcd5fd9-bb04-42bd-b97f-04cb287b51c2" containerName="registry-server" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.082720 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.088114 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.088374 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.088577 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.088705 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-vztps" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.096940 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.126053 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e5a4096d-c2be-4987-b0eb-fb47da8a9703-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.126124 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.126172 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e5a4096d-c2be-4987-b0eb-fb47da8a9703-ca-certs\") pod 
\"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.126203 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e5a4096d-c2be-4987-b0eb-fb47da8a9703-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.126223 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e5a4096d-c2be-4987-b0eb-fb47da8a9703-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.126278 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5a4096d-c2be-4987-b0eb-fb47da8a9703-config-data\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.126349 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e5a4096d-c2be-4987-b0eb-fb47da8a9703-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.126377 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbwfv\" (UniqueName: \"kubernetes.io/projected/e5a4096d-c2be-4987-b0eb-fb47da8a9703-kube-api-access-rbwfv\") pod 
\"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.126435 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e5a4096d-c2be-4987-b0eb-fb47da8a9703-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.229075 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e5a4096d-c2be-4987-b0eb-fb47da8a9703-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.229128 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e5a4096d-c2be-4987-b0eb-fb47da8a9703-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.229176 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5a4096d-c2be-4987-b0eb-fb47da8a9703-config-data\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.229277 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e5a4096d-c2be-4987-b0eb-fb47da8a9703-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: 
\"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.229312 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbwfv\" (UniqueName: \"kubernetes.io/projected/e5a4096d-c2be-4987-b0eb-fb47da8a9703-kube-api-access-rbwfv\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.229348 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e5a4096d-c2be-4987-b0eb-fb47da8a9703-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.229515 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e5a4096d-c2be-4987-b0eb-fb47da8a9703-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.229581 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.229636 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e5a4096d-c2be-4987-b0eb-fb47da8a9703-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " 
pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.230164 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e5a4096d-c2be-4987-b0eb-fb47da8a9703-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.230622 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e5a4096d-c2be-4987-b0eb-fb47da8a9703-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.230895 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e5a4096d-c2be-4987-b0eb-fb47da8a9703-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.231461 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5a4096d-c2be-4987-b0eb-fb47da8a9703-config-data\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.231817 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc 
kubenswrapper[4878]: I1202 19:32:07.236262 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e5a4096d-c2be-4987-b0eb-fb47da8a9703-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.237336 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e5a4096d-c2be-4987-b0eb-fb47da8a9703-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.238740 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e5a4096d-c2be-4987-b0eb-fb47da8a9703-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.251603 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbwfv\" (UniqueName: \"kubernetes.io/projected/e5a4096d-c2be-4987-b0eb-fb47da8a9703-kube-api-access-rbwfv\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.268343 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.419562 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 19:32:07 crc kubenswrapper[4878]: I1202 19:32:07.966763 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 19:32:08 crc kubenswrapper[4878]: W1202 19:32:08.891502 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5a4096d_c2be_4987_b0eb_fb47da8a9703.slice/crio-94720cc681eb1f953476fef8ed625a6e7a0ec4d5008748a4ce9fd232177347d1 WatchSource:0}: Error finding container 94720cc681eb1f953476fef8ed625a6e7a0ec4d5008748a4ce9fd232177347d1: Status 404 returned error can't find the container with id 94720cc681eb1f953476fef8ed625a6e7a0ec4d5008748a4ce9fd232177347d1 Dec 02 19:32:08 crc kubenswrapper[4878]: I1202 19:32:08.924120 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e5a4096d-c2be-4987-b0eb-fb47da8a9703","Type":"ContainerStarted","Data":"94720cc681eb1f953476fef8ed625a6e7a0ec4d5008748a4ce9fd232177347d1"} Dec 02 19:32:13 crc kubenswrapper[4878]: I1202 19:32:13.939403 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074" Dec 02 19:32:13 crc kubenswrapper[4878]: E1202 19:32:13.940064 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:32:26 crc kubenswrapper[4878]: I1202 19:32:26.942566 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074" Dec 02 19:32:26 crc kubenswrapper[4878]: E1202 19:32:26.945022 4878 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:32:40 crc kubenswrapper[4878]: I1202 19:32:40.958604 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074" Dec 02 19:32:40 crc kubenswrapper[4878]: E1202 19:32:40.960848 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:32:45 crc kubenswrapper[4878]: E1202 19:32:45.740859 4878 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 02 19:32:45 crc kubenswrapper[4878]: E1202 19:32:45.744614 4878 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rbwfv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(e5a4096d-c2be-4987-b0eb-fb47da8a9703): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 19:32:45 crc kubenswrapper[4878]: E1202 19:32:45.746077 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="e5a4096d-c2be-4987-b0eb-fb47da8a9703" Dec 02 19:32:46 crc kubenswrapper[4878]: E1202 19:32:46.387578 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="e5a4096d-c2be-4987-b0eb-fb47da8a9703" Dec 02 19:32:55 crc 
kubenswrapper[4878]: I1202 19:32:55.939138 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074" Dec 02 19:32:55 crc kubenswrapper[4878]: E1202 19:32:55.940236 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:33:01 crc kubenswrapper[4878]: I1202 19:33:01.393226 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 02 19:33:03 crc kubenswrapper[4878]: I1202 19:33:03.577008 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e5a4096d-c2be-4987-b0eb-fb47da8a9703","Type":"ContainerStarted","Data":"107e695fc6966968c956106bdc95595e99604d836e9d7c2737f2f8933ee5b8dd"} Dec 02 19:33:03 crc kubenswrapper[4878]: I1202 19:33:03.608193 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=5.113744135 podStartE2EDuration="57.608170784s" podCreationTimestamp="2025-12-02 19:32:06 +0000 UTC" firstStartedPulling="2025-12-02 19:32:08.895652663 +0000 UTC m=+4638.585271554" lastFinishedPulling="2025-12-02 19:33:01.390079312 +0000 UTC m=+4691.079698203" observedRunningTime="2025-12-02 19:33:03.599979597 +0000 UTC m=+4693.289598478" watchObservedRunningTime="2025-12-02 19:33:03.608170784 +0000 UTC m=+4693.297789675" Dec 02 19:33:07 crc kubenswrapper[4878]: I1202 19:33:07.939029 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074" Dec 02 19:33:07 crc kubenswrapper[4878]: E1202 
19:33:07.940109 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044"
Dec 02 19:33:21 crc kubenswrapper[4878]: I1202 19:33:21.938225 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074"
Dec 02 19:33:21 crc kubenswrapper[4878]: E1202 19:33:21.939211 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044"
Dec 02 19:33:32 crc kubenswrapper[4878]: I1202 19:33:32.938995 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074"
Dec 02 19:33:32 crc kubenswrapper[4878]: E1202 19:33:32.939793 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044"
Dec 02 19:33:37 crc kubenswrapper[4878]: I1202 19:33:37.980207 4878 patch_prober.go:28] interesting pod/thanos-querier-5d55987f4-pmp97 container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.73:9091/-/ready\": context deadline exceeded" start-of-body=
Dec 02 19:33:37 crc kubenswrapper[4878]: I1202 19:33:37.980718 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-5d55987f4-pmp97" podUID="ea96e843-9776-4b36-84c5-d4b187c87720" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.73:9091/-/ready\": context deadline exceeded"
Dec 02 19:33:38 crc kubenswrapper[4878]: I1202 19:33:38.018104 4878 patch_prober.go:28] interesting pod/console-57568cf57-qq2hl container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.136:8443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 02 19:33:38 crc kubenswrapper[4878]: I1202 19:33:38.018430 4878 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-57568cf57-qq2hl" podUID="409d93c8-2950-4081-b5c8-9c3435e28449" containerName="console" probeResult="failure" output="Get \"https://10.217.0.136:8443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 02 19:33:38 crc kubenswrapper[4878]: E1202 19:33:38.132753 4878 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.196s"
Dec 02 19:33:38 crc kubenswrapper[4878]: I1202 19:33:38.132860 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-26fhr"]
Dec 02 19:33:38 crc kubenswrapper[4878]: I1202 19:33:38.140443 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-26fhr"
Dec 02 19:33:38 crc kubenswrapper[4878]: I1202 19:33:38.225979 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-26fhr"]
Dec 02 19:33:38 crc kubenswrapper[4878]: I1202 19:33:38.249728 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gmqz\" (UniqueName: \"kubernetes.io/projected/17c66444-5e82-4777-82d6-4fb9e52d01a1-kube-api-access-6gmqz\") pod \"certified-operators-26fhr\" (UID: \"17c66444-5e82-4777-82d6-4fb9e52d01a1\") " pod="openshift-marketplace/certified-operators-26fhr"
Dec 02 19:33:38 crc kubenswrapper[4878]: I1202 19:33:38.250762 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17c66444-5e82-4777-82d6-4fb9e52d01a1-utilities\") pod \"certified-operators-26fhr\" (UID: \"17c66444-5e82-4777-82d6-4fb9e52d01a1\") " pod="openshift-marketplace/certified-operators-26fhr"
Dec 02 19:33:38 crc kubenswrapper[4878]: I1202 19:33:38.251188 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17c66444-5e82-4777-82d6-4fb9e52d01a1-catalog-content\") pod \"certified-operators-26fhr\" (UID: \"17c66444-5e82-4777-82d6-4fb9e52d01a1\") " pod="openshift-marketplace/certified-operators-26fhr"
Dec 02 19:33:38 crc kubenswrapper[4878]: I1202 19:33:38.355528 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17c66444-5e82-4777-82d6-4fb9e52d01a1-utilities\") pod \"certified-operators-26fhr\" (UID: \"17c66444-5e82-4777-82d6-4fb9e52d01a1\") " pod="openshift-marketplace/certified-operators-26fhr"
Dec 02 19:33:38 crc kubenswrapper[4878]: I1202 19:33:38.355642 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17c66444-5e82-4777-82d6-4fb9e52d01a1-catalog-content\") pod \"certified-operators-26fhr\" (UID: \"17c66444-5e82-4777-82d6-4fb9e52d01a1\") " pod="openshift-marketplace/certified-operators-26fhr"
Dec 02 19:33:38 crc kubenswrapper[4878]: I1202 19:33:38.355682 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gmqz\" (UniqueName: \"kubernetes.io/projected/17c66444-5e82-4777-82d6-4fb9e52d01a1-kube-api-access-6gmqz\") pod \"certified-operators-26fhr\" (UID: \"17c66444-5e82-4777-82d6-4fb9e52d01a1\") " pod="openshift-marketplace/certified-operators-26fhr"
Dec 02 19:33:38 crc kubenswrapper[4878]: I1202 19:33:38.356713 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17c66444-5e82-4777-82d6-4fb9e52d01a1-utilities\") pod \"certified-operators-26fhr\" (UID: \"17c66444-5e82-4777-82d6-4fb9e52d01a1\") " pod="openshift-marketplace/certified-operators-26fhr"
Dec 02 19:33:38 crc kubenswrapper[4878]: I1202 19:33:38.356974 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17c66444-5e82-4777-82d6-4fb9e52d01a1-catalog-content\") pod \"certified-operators-26fhr\" (UID: \"17c66444-5e82-4777-82d6-4fb9e52d01a1\") " pod="openshift-marketplace/certified-operators-26fhr"
Dec 02 19:33:38 crc kubenswrapper[4878]: I1202 19:33:38.380817 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gmqz\" (UniqueName: \"kubernetes.io/projected/17c66444-5e82-4777-82d6-4fb9e52d01a1-kube-api-access-6gmqz\") pod \"certified-operators-26fhr\" (UID: \"17c66444-5e82-4777-82d6-4fb9e52d01a1\") " pod="openshift-marketplace/certified-operators-26fhr"
Dec 02 19:33:38 crc kubenswrapper[4878]: I1202 19:33:38.482010 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-26fhr"
Dec 02 19:33:39 crc kubenswrapper[4878]: I1202 19:33:39.271731 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-26fhr"]
Dec 02 19:33:40 crc kubenswrapper[4878]: I1202 19:33:40.226937 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-26fhr" event={"ID":"17c66444-5e82-4777-82d6-4fb9e52d01a1","Type":"ContainerDied","Data":"92bef9d548d49113f0abee28aa4bdebb3d82965d7cc93fbb133f51cd98a7617f"}
Dec 02 19:33:40 crc kubenswrapper[4878]: I1202 19:33:40.227145 4878 generic.go:334] "Generic (PLEG): container finished" podID="17c66444-5e82-4777-82d6-4fb9e52d01a1" containerID="92bef9d548d49113f0abee28aa4bdebb3d82965d7cc93fbb133f51cd98a7617f" exitCode=0
Dec 02 19:33:40 crc kubenswrapper[4878]: I1202 19:33:40.227441 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-26fhr" event={"ID":"17c66444-5e82-4777-82d6-4fb9e52d01a1","Type":"ContainerStarted","Data":"7f326a07f94e983f29cdea4bf70e968ae6047ac7ec54f4e07cd216cf6cd5da24"}
Dec 02 19:33:42 crc kubenswrapper[4878]: I1202 19:33:42.256423 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-26fhr" event={"ID":"17c66444-5e82-4777-82d6-4fb9e52d01a1","Type":"ContainerStarted","Data":"70a8ed7a0e43c904e5af2cb90c6456391732859b1286b0cf8d9c941e3408bb2a"}
Dec 02 19:33:43 crc kubenswrapper[4878]: I1202 19:33:43.271352 4878 generic.go:334] "Generic (PLEG): container finished" podID="17c66444-5e82-4777-82d6-4fb9e52d01a1" containerID="70a8ed7a0e43c904e5af2cb90c6456391732859b1286b0cf8d9c941e3408bb2a" exitCode=0
Dec 02 19:33:43 crc kubenswrapper[4878]: I1202 19:33:43.271412 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-26fhr"
event={"ID":"17c66444-5e82-4777-82d6-4fb9e52d01a1","Type":"ContainerDied","Data":"70a8ed7a0e43c904e5af2cb90c6456391732859b1286b0cf8d9c941e3408bb2a"}
Dec 02 19:33:44 crc kubenswrapper[4878]: I1202 19:33:44.939189 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074"
Dec 02 19:33:44 crc kubenswrapper[4878]: E1202 19:33:44.940515 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044"
Dec 02 19:33:45 crc kubenswrapper[4878]: I1202 19:33:45.297211 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-26fhr" event={"ID":"17c66444-5e82-4777-82d6-4fb9e52d01a1","Type":"ContainerStarted","Data":"56b05cc4d300d2ab21799ab3a027eff878c2fc084b02d72323e4316334cf10ca"}
Dec 02 19:33:45 crc kubenswrapper[4878]: I1202 19:33:45.323679 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-26fhr" podStartSLOduration=4.438828311 podStartE2EDuration="8.323660132s" podCreationTimestamp="2025-12-02 19:33:37 +0000 UTC" firstStartedPulling="2025-12-02 19:33:40.22957959 +0000 UTC m=+4729.919198471" lastFinishedPulling="2025-12-02 19:33:44.114411411 +0000 UTC m=+4733.804030292" observedRunningTime="2025-12-02 19:33:45.314362661 +0000 UTC m=+4735.003981562" watchObservedRunningTime="2025-12-02 19:33:45.323660132 +0000 UTC m=+4735.013279013"
Dec 02 19:33:48 crc kubenswrapper[4878]: I1202 19:33:48.483149 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-26fhr"
Dec 02 19:33:48 crc kubenswrapper[4878]: I1202 19:33:48.483739 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-26fhr"
Dec 02 19:33:49 crc kubenswrapper[4878]: I1202 19:33:49.549819 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-26fhr" podUID="17c66444-5e82-4777-82d6-4fb9e52d01a1" containerName="registry-server" probeResult="failure" output=<
Dec 02 19:33:49 crc kubenswrapper[4878]: timeout: failed to connect service ":50051" within 1s
Dec 02 19:33:49 crc kubenswrapper[4878]: >
Dec 02 19:33:55 crc kubenswrapper[4878]: I1202 19:33:55.938162 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074"
Dec 02 19:33:55 crc kubenswrapper[4878]: E1202 19:33:55.939051 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044"
Dec 02 19:33:58 crc kubenswrapper[4878]: I1202 19:33:58.554931 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-26fhr"
Dec 02 19:33:58 crc kubenswrapper[4878]: I1202 19:33:58.683193 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-26fhr"
Dec 02 19:33:58 crc kubenswrapper[4878]: I1202 19:33:58.868282 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-26fhr"]
Dec 02 19:34:00 crc kubenswrapper[4878]: I1202 19:34:00.473544 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-26fhr" podUID="17c66444-5e82-4777-82d6-4fb9e52d01a1" containerName="registry-server" containerID="cri-o://56b05cc4d300d2ab21799ab3a027eff878c2fc084b02d72323e4316334cf10ca" gracePeriod=2
Dec 02 19:34:01 crc kubenswrapper[4878]: I1202 19:34:01.531765 4878 generic.go:334] "Generic (PLEG): container finished" podID="17c66444-5e82-4777-82d6-4fb9e52d01a1" containerID="56b05cc4d300d2ab21799ab3a027eff878c2fc084b02d72323e4316334cf10ca" exitCode=0
Dec 02 19:34:01 crc kubenswrapper[4878]: I1202 19:34:01.532387 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-26fhr" event={"ID":"17c66444-5e82-4777-82d6-4fb9e52d01a1","Type":"ContainerDied","Data":"56b05cc4d300d2ab21799ab3a027eff878c2fc084b02d72323e4316334cf10ca"}
Dec 02 19:34:01 crc kubenswrapper[4878]: I1202 19:34:01.672538 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-26fhr"
Dec 02 19:34:01 crc kubenswrapper[4878]: I1202 19:34:01.809979 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17c66444-5e82-4777-82d6-4fb9e52d01a1-catalog-content\") pod \"17c66444-5e82-4777-82d6-4fb9e52d01a1\" (UID: \"17c66444-5e82-4777-82d6-4fb9e52d01a1\") "
Dec 02 19:34:01 crc kubenswrapper[4878]: I1202 19:34:01.810188 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17c66444-5e82-4777-82d6-4fb9e52d01a1-utilities\") pod \"17c66444-5e82-4777-82d6-4fb9e52d01a1\" (UID: \"17c66444-5e82-4777-82d6-4fb9e52d01a1\") "
Dec 02 19:34:01 crc kubenswrapper[4878]: I1202 19:34:01.810306 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gmqz\" (UniqueName: \"kubernetes.io/projected/17c66444-5e82-4777-82d6-4fb9e52d01a1-kube-api-access-6gmqz\") pod \"17c66444-5e82-4777-82d6-4fb9e52d01a1\" (UID: \"17c66444-5e82-4777-82d6-4fb9e52d01a1\") "
Dec 02 19:34:01 crc kubenswrapper[4878]: I1202 19:34:01.811414 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17c66444-5e82-4777-82d6-4fb9e52d01a1-utilities" (OuterVolumeSpecName: "utilities") pod "17c66444-5e82-4777-82d6-4fb9e52d01a1" (UID: "17c66444-5e82-4777-82d6-4fb9e52d01a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 19:34:01 crc kubenswrapper[4878]: I1202 19:34:01.825636 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17c66444-5e82-4777-82d6-4fb9e52d01a1-kube-api-access-6gmqz" (OuterVolumeSpecName: "kube-api-access-6gmqz") pod "17c66444-5e82-4777-82d6-4fb9e52d01a1" (UID: "17c66444-5e82-4777-82d6-4fb9e52d01a1"). InnerVolumeSpecName "kube-api-access-6gmqz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:34:01 crc kubenswrapper[4878]: I1202 19:34:01.847862 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17c66444-5e82-4777-82d6-4fb9e52d01a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17c66444-5e82-4777-82d6-4fb9e52d01a1" (UID: "17c66444-5e82-4777-82d6-4fb9e52d01a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 19:34:01 crc kubenswrapper[4878]: I1202 19:34:01.913509 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17c66444-5e82-4777-82d6-4fb9e52d01a1-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 19:34:01 crc kubenswrapper[4878]: I1202 19:34:01.913555 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gmqz\" (UniqueName: \"kubernetes.io/projected/17c66444-5e82-4777-82d6-4fb9e52d01a1-kube-api-access-6gmqz\") on node \"crc\" DevicePath \"\""
Dec 02 19:34:01 crc kubenswrapper[4878]: I1202 19:34:01.913569 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17c66444-5e82-4777-82d6-4fb9e52d01a1-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 19:34:02 crc kubenswrapper[4878]: I1202 19:34:02.542971 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-26fhr" event={"ID":"17c66444-5e82-4777-82d6-4fb9e52d01a1","Type":"ContainerDied","Data":"7f326a07f94e983f29cdea4bf70e968ae6047ac7ec54f4e07cd216cf6cd5da24"}
Dec 02 19:34:02 crc kubenswrapper[4878]: I1202 19:34:02.543610 4878 scope.go:117] "RemoveContainer" containerID="56b05cc4d300d2ab21799ab3a027eff878c2fc084b02d72323e4316334cf10ca"
Dec 02 19:34:02 crc kubenswrapper[4878]: I1202 19:34:02.543815 4878 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-26fhr"
Dec 02 19:34:02 crc kubenswrapper[4878]: I1202 19:34:02.586946 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-26fhr"]
Dec 02 19:34:02 crc kubenswrapper[4878]: I1202 19:34:02.597697 4878 scope.go:117] "RemoveContainer" containerID="70a8ed7a0e43c904e5af2cb90c6456391732859b1286b0cf8d9c941e3408bb2a"
Dec 02 19:34:02 crc kubenswrapper[4878]: I1202 19:34:02.600135 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-26fhr"]
Dec 02 19:34:03 crc kubenswrapper[4878]: I1202 19:34:03.014919 4878 scope.go:117] "RemoveContainer" containerID="92bef9d548d49113f0abee28aa4bdebb3d82965d7cc93fbb133f51cd98a7617f"
Dec 02 19:34:03 crc kubenswrapper[4878]: I1202 19:34:03.028161 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17c66444-5e82-4777-82d6-4fb9e52d01a1" path="/var/lib/kubelet/pods/17c66444-5e82-4777-82d6-4fb9e52d01a1/volumes"
Dec 02 19:34:10 crc kubenswrapper[4878]: I1202 19:34:10.948884 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074"
Dec 02 19:34:10 crc kubenswrapper[4878]: E1202 19:34:10.949728 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044"
Dec 02 19:34:22 crc kubenswrapper[4878]: I1202 19:34:22.939063 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074"
Dec 02 19:34:22 crc kubenswrapper[4878]: E1202 19:34:22.939876 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044"
Dec 02 19:34:34 crc kubenswrapper[4878]: I1202 19:34:34.939114 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074"
Dec 02 19:34:34 crc kubenswrapper[4878]: E1202 19:34:34.940091 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044"
Dec 02 19:34:48 crc kubenswrapper[4878]: I1202 19:34:48.938751 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074"
Dec 02 19:34:48 crc kubenswrapper[4878]: E1202 19:34:48.939737 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044"
Dec 02 19:35:03 crc kubenswrapper[4878]: I1202 19:35:03.938769 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074"
Dec 02 19:35:03 crc kubenswrapper[4878]: E1202 19:35:03.941539 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044"
Dec 02 19:35:15 crc kubenswrapper[4878]: I1202 19:35:15.938191 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074"
Dec 02 19:35:15 crc kubenswrapper[4878]: E1202 19:35:15.939155 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044"
Dec 02 19:35:26 crc kubenswrapper[4878]: I1202 19:35:26.938596 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074"
Dec 02 19:35:26 crc kubenswrapper[4878]: E1202 19:35:26.940841 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044"
Dec 02 19:35:37 crc kubenswrapper[4878]: I1202 19:35:37.938616 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074"
Dec 02 19:35:37 crc kubenswrapper[4878]: E1202 19:35:37.939538 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044"
Dec 02 19:35:50 crc kubenswrapper[4878]: I1202 19:35:50.954893 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074"
Dec 02 19:35:50 crc kubenswrapper[4878]: E1202 19:35:50.955868 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044"
Dec 02 19:36:02 crc kubenswrapper[4878]: I1202 19:36:02.938343 4878 scope.go:117] "RemoveContainer" containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074"
Dec 02 19:36:03 crc kubenswrapper[4878]: I1202 19:36:03.637707 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"10aced38f4d88a43bcff0a819879536205f756760990e6a619e365b4851df7fb"}
Dec 02 19:38:23 crc kubenswrapper[4878]: I1202 19:38:23.742871 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 19:38:23 crc kubenswrapper[4878]: I1202 19:38:23.743915 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 19:38:53 crc kubenswrapper[4878]: I1202 19:38:53.742921 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 19:38:53 crc kubenswrapper[4878]: I1202 19:38:53.743363 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 19:39:23 crc kubenswrapper[4878]: I1202 19:39:23.742482 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 19:39:23 crc kubenswrapper[4878]: I1202 19:39:23.743136 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 19:39:23 crc kubenswrapper[4878]: I1202 19:39:23.743213 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg"
Dec 02 19:39:23 crc kubenswrapper[4878]: I1202 19:39:23.745283 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"10aced38f4d88a43bcff0a819879536205f756760990e6a619e365b4851df7fb"} pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 19:39:23 crc kubenswrapper[4878]: I1202 19:39:23.745674 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://10aced38f4d88a43bcff0a819879536205f756760990e6a619e365b4851df7fb" gracePeriod=600
Dec 02 19:39:24 crc kubenswrapper[4878]: I1202 19:39:24.193925 4878 generic.go:334] "Generic (PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" containerID="10aced38f4d88a43bcff0a819879536205f756760990e6a619e365b4851df7fb" exitCode=0
Dec 02 19:39:24 crc kubenswrapper[4878]: I1202 19:39:24.194123 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"10aced38f4d88a43bcff0a819879536205f756760990e6a619e365b4851df7fb"}
Dec 02 19:39:24 crc kubenswrapper[4878]: I1202 19:39:24.195355 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12"}
Dec 02 19:39:24 crc kubenswrapper[4878]: I1202 19:39:24.195565 4878 scope.go:117] "RemoveContainer"
containerID="1d5f0940e7683a1bd228898af76cb134027334cef642fd47b5e0eefd6fb1c074"
Dec 02 19:39:52 crc kubenswrapper[4878]: I1202 19:39:52.489703 4878 trace.go:236] Trace[1495425792]: "Calculate volume metrics of storage for pod openshift-logging/logging-loki-index-gateway-0" (02-Dec-2025 19:39:51.414) (total time: 1074ms):
Dec 02 19:39:52 crc kubenswrapper[4878]: Trace[1495425792]: [1.074442755s] [1.074442755s] END
Dec 02 19:40:35 crc kubenswrapper[4878]: I1202 19:40:35.751647 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fvflc"]
Dec 02 19:40:35 crc kubenswrapper[4878]: E1202 19:40:35.753159 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c66444-5e82-4777-82d6-4fb9e52d01a1" containerName="extract-utilities"
Dec 02 19:40:35 crc kubenswrapper[4878]: I1202 19:40:35.753182 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c66444-5e82-4777-82d6-4fb9e52d01a1" containerName="extract-utilities"
Dec 02 19:40:35 crc kubenswrapper[4878]: E1202 19:40:35.753485 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c66444-5e82-4777-82d6-4fb9e52d01a1" containerName="registry-server"
Dec 02 19:40:35 crc kubenswrapper[4878]: I1202 19:40:35.753502 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c66444-5e82-4777-82d6-4fb9e52d01a1" containerName="registry-server"
Dec 02 19:40:35 crc kubenswrapper[4878]: E1202 19:40:35.753543 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c66444-5e82-4777-82d6-4fb9e52d01a1" containerName="extract-content"
Dec 02 19:40:35 crc kubenswrapper[4878]: I1202 19:40:35.753554 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c66444-5e82-4777-82d6-4fb9e52d01a1" containerName="extract-content"
Dec 02 19:40:35 crc kubenswrapper[4878]: I1202 19:40:35.754122 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="17c66444-5e82-4777-82d6-4fb9e52d01a1" containerName="registry-server"
Dec 02 19:40:35 crc kubenswrapper[4878]: I1202 19:40:35.760402 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fvflc"
Dec 02 19:40:35 crc kubenswrapper[4878]: I1202 19:40:35.774295 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fvflc"]
Dec 02 19:40:35 crc kubenswrapper[4878]: I1202 19:40:35.875138 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gqlh\" (UniqueName: \"kubernetes.io/projected/6a8db07b-96b4-4408-b521-a17e7e368de9-kube-api-access-8gqlh\") pod \"community-operators-fvflc\" (UID: \"6a8db07b-96b4-4408-b521-a17e7e368de9\") " pod="openshift-marketplace/community-operators-fvflc"
Dec 02 19:40:35 crc kubenswrapper[4878]: I1202 19:40:35.875810 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a8db07b-96b4-4408-b521-a17e7e368de9-utilities\") pod \"community-operators-fvflc\" (UID: \"6a8db07b-96b4-4408-b521-a17e7e368de9\") " pod="openshift-marketplace/community-operators-fvflc"
Dec 02 19:40:35 crc kubenswrapper[4878]: I1202 19:40:35.876070 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a8db07b-96b4-4408-b521-a17e7e368de9-catalog-content\") pod \"community-operators-fvflc\" (UID: \"6a8db07b-96b4-4408-b521-a17e7e368de9\") " pod="openshift-marketplace/community-operators-fvflc"
Dec 02 19:40:35 crc kubenswrapper[4878]: I1202 19:40:35.978895 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a8db07b-96b4-4408-b521-a17e7e368de9-utilities\") pod \"community-operators-fvflc\" (UID: \"6a8db07b-96b4-4408-b521-a17e7e368de9\") " pod="openshift-marketplace/community-operators-fvflc"
Dec 02 19:40:35 crc kubenswrapper[4878]: I1202 19:40:35.978986 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a8db07b-96b4-4408-b521-a17e7e368de9-catalog-content\") pod \"community-operators-fvflc\" (UID: \"6a8db07b-96b4-4408-b521-a17e7e368de9\") " pod="openshift-marketplace/community-operators-fvflc"
Dec 02 19:40:35 crc kubenswrapper[4878]: I1202 19:40:35.979144 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gqlh\" (UniqueName: \"kubernetes.io/projected/6a8db07b-96b4-4408-b521-a17e7e368de9-kube-api-access-8gqlh\") pod \"community-operators-fvflc\" (UID: \"6a8db07b-96b4-4408-b521-a17e7e368de9\") " pod="openshift-marketplace/community-operators-fvflc"
Dec 02 19:40:36 crc kubenswrapper[4878]: I1202 19:40:36.198041 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a8db07b-96b4-4408-b521-a17e7e368de9-utilities\") pod \"community-operators-fvflc\" (UID: \"6a8db07b-96b4-4408-b521-a17e7e368de9\") " pod="openshift-marketplace/community-operators-fvflc"
Dec 02 19:40:36 crc kubenswrapper[4878]: I1202 19:40:36.198353 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a8db07b-96b4-4408-b521-a17e7e368de9-catalog-content\") pod \"community-operators-fvflc\" (UID: \"6a8db07b-96b4-4408-b521-a17e7e368de9\") " pod="openshift-marketplace/community-operators-fvflc"
Dec 02 19:40:36 crc kubenswrapper[4878]: I1202 19:40:36.228304 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gqlh\" (UniqueName: \"kubernetes.io/projected/6a8db07b-96b4-4408-b521-a17e7e368de9-kube-api-access-8gqlh\") pod \"community-operators-fvflc\" (UID: \"6a8db07b-96b4-4408-b521-a17e7e368de9\") " pod="openshift-marketplace/community-operators-fvflc"
Dec 02 19:40:36 crc kubenswrapper[4878]: I1202 19:40:36.393326 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fvflc"
Dec 02 19:40:37 crc kubenswrapper[4878]: I1202 19:40:37.010421 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fvflc"]
Dec 02 19:40:37 crc kubenswrapper[4878]: I1202 19:40:37.221031 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvflc" event={"ID":"6a8db07b-96b4-4408-b521-a17e7e368de9","Type":"ContainerStarted","Data":"8086550cd456ed43c560528a90ff6be3822845f743706b49de70edbd5d3a0cc7"}
Dec 02 19:40:37 crc kubenswrapper[4878]: I1202 19:40:37.517807 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2cqzd"]
Dec 02 19:40:37 crc kubenswrapper[4878]: I1202 19:40:37.522474 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2cqzd"
Dec 02 19:40:37 crc kubenswrapper[4878]: I1202 19:40:37.531152 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b32dbe92-5cd4-4a91-990b-f3fc9d82a487-utilities\") pod \"redhat-operators-2cqzd\" (UID: \"b32dbe92-5cd4-4a91-990b-f3fc9d82a487\") " pod="openshift-marketplace/redhat-operators-2cqzd"
Dec 02 19:40:37 crc kubenswrapper[4878]: I1202 19:40:37.531202 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b32dbe92-5cd4-4a91-990b-f3fc9d82a487-catalog-content\") pod \"redhat-operators-2cqzd\" (UID: \"b32dbe92-5cd4-4a91-990b-f3fc9d82a487\") " pod="openshift-marketplace/redhat-operators-2cqzd"
Dec 02 19:40:37 crc kubenswrapper[4878]: I1202 19:40:37.531277 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-885jt\" (UniqueName: \"kubernetes.io/projected/b32dbe92-5cd4-4a91-990b-f3fc9d82a487-kube-api-access-885jt\") pod \"redhat-operators-2cqzd\" (UID: \"b32dbe92-5cd4-4a91-990b-f3fc9d82a487\") " pod="openshift-marketplace/redhat-operators-2cqzd"
Dec 02 19:40:37 crc kubenswrapper[4878]: I1202 19:40:37.543201 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2cqzd"]
Dec 02 19:40:37 crc kubenswrapper[4878]: I1202 19:40:37.633845 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b32dbe92-5cd4-4a91-990b-f3fc9d82a487-utilities\") pod \"redhat-operators-2cqzd\" (UID: \"b32dbe92-5cd4-4a91-990b-f3fc9d82a487\") " pod="openshift-marketplace/redhat-operators-2cqzd"
Dec 02 19:40:37 crc kubenswrapper[4878]: I1202 19:40:37.633896 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b32dbe92-5cd4-4a91-990b-f3fc9d82a487-catalog-content\") pod \"redhat-operators-2cqzd\" (UID: \"b32dbe92-5cd4-4a91-990b-f3fc9d82a487\") " pod="openshift-marketplace/redhat-operators-2cqzd"
Dec 02 19:40:37 crc kubenswrapper[4878]: I1202 19:40:37.633952 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-885jt\" (UniqueName: \"kubernetes.io/projected/b32dbe92-5cd4-4a91-990b-f3fc9d82a487-kube-api-access-885jt\") pod \"redhat-operators-2cqzd\" (UID: \"b32dbe92-5cd4-4a91-990b-f3fc9d82a487\") " pod="openshift-marketplace/redhat-operators-2cqzd"
Dec 02 19:40:37 crc kubenswrapper[4878]: I1202 19:40:37.634390 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b32dbe92-5cd4-4a91-990b-f3fc9d82a487-utilities\") pod \"redhat-operators-2cqzd\" (UID: \"b32dbe92-5cd4-4a91-990b-f3fc9d82a487\") " pod="openshift-marketplace/redhat-operators-2cqzd"
Dec 02 19:40:37 crc
kubenswrapper[4878]: I1202 19:40:37.634727 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b32dbe92-5cd4-4a91-990b-f3fc9d82a487-catalog-content\") pod \"redhat-operators-2cqzd\" (UID: \"b32dbe92-5cd4-4a91-990b-f3fc9d82a487\") " pod="openshift-marketplace/redhat-operators-2cqzd" Dec 02 19:40:37 crc kubenswrapper[4878]: I1202 19:40:37.654913 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-885jt\" (UniqueName: \"kubernetes.io/projected/b32dbe92-5cd4-4a91-990b-f3fc9d82a487-kube-api-access-885jt\") pod \"redhat-operators-2cqzd\" (UID: \"b32dbe92-5cd4-4a91-990b-f3fc9d82a487\") " pod="openshift-marketplace/redhat-operators-2cqzd" Dec 02 19:40:37 crc kubenswrapper[4878]: I1202 19:40:37.901260 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2cqzd" Dec 02 19:40:38 crc kubenswrapper[4878]: I1202 19:40:38.231798 4878 generic.go:334] "Generic (PLEG): container finished" podID="6a8db07b-96b4-4408-b521-a17e7e368de9" containerID="489382ed8431ea14cd43ff66fed2fea0d6a8043090619468e333ec2db740d20f" exitCode=0 Dec 02 19:40:38 crc kubenswrapper[4878]: I1202 19:40:38.232062 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvflc" event={"ID":"6a8db07b-96b4-4408-b521-a17e7e368de9","Type":"ContainerDied","Data":"489382ed8431ea14cd43ff66fed2fea0d6a8043090619468e333ec2db740d20f"} Dec 02 19:40:38 crc kubenswrapper[4878]: I1202 19:40:38.241416 4878 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 19:40:38 crc kubenswrapper[4878]: W1202 19:40:38.389703 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb32dbe92_5cd4_4a91_990b_f3fc9d82a487.slice/crio-9a8a129ebde7c576b005ccdf8841c8d7a53e6419404e5ba48264f6dad4b6a08b 
WatchSource:0}: Error finding container 9a8a129ebde7c576b005ccdf8841c8d7a53e6419404e5ba48264f6dad4b6a08b: Status 404 returned error can't find the container with id 9a8a129ebde7c576b005ccdf8841c8d7a53e6419404e5ba48264f6dad4b6a08b Dec 02 19:40:38 crc kubenswrapper[4878]: I1202 19:40:38.391770 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2cqzd"] Dec 02 19:40:39 crc kubenswrapper[4878]: I1202 19:40:39.245949 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvflc" event={"ID":"6a8db07b-96b4-4408-b521-a17e7e368de9","Type":"ContainerStarted","Data":"cd0443e58ddaf171830f3f3ef2789b3603953a5fc3391c219044cbf9abdc00ff"} Dec 02 19:40:39 crc kubenswrapper[4878]: I1202 19:40:39.248347 4878 generic.go:334] "Generic (PLEG): container finished" podID="b32dbe92-5cd4-4a91-990b-f3fc9d82a487" containerID="c428433b789c54779a5caaf321401374a1001b385f29f27a45c0e8929957bf82" exitCode=0 Dec 02 19:40:39 crc kubenswrapper[4878]: I1202 19:40:39.248398 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cqzd" event={"ID":"b32dbe92-5cd4-4a91-990b-f3fc9d82a487","Type":"ContainerDied","Data":"c428433b789c54779a5caaf321401374a1001b385f29f27a45c0e8929957bf82"} Dec 02 19:40:39 crc kubenswrapper[4878]: I1202 19:40:39.248422 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cqzd" event={"ID":"b32dbe92-5cd4-4a91-990b-f3fc9d82a487","Type":"ContainerStarted","Data":"9a8a129ebde7c576b005ccdf8841c8d7a53e6419404e5ba48264f6dad4b6a08b"} Dec 02 19:40:41 crc kubenswrapper[4878]: I1202 19:40:41.283377 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cqzd" event={"ID":"b32dbe92-5cd4-4a91-990b-f3fc9d82a487","Type":"ContainerStarted","Data":"5fd9d51753c63b8a6f4d24d375c7b4b75d78781383cf71f1f040457195d69330"} Dec 02 19:40:41 crc kubenswrapper[4878]: I1202 
19:40:41.286584 4878 generic.go:334] "Generic (PLEG): container finished" podID="6a8db07b-96b4-4408-b521-a17e7e368de9" containerID="cd0443e58ddaf171830f3f3ef2789b3603953a5fc3391c219044cbf9abdc00ff" exitCode=0 Dec 02 19:40:41 crc kubenswrapper[4878]: I1202 19:40:41.286835 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvflc" event={"ID":"6a8db07b-96b4-4408-b521-a17e7e368de9","Type":"ContainerDied","Data":"cd0443e58ddaf171830f3f3ef2789b3603953a5fc3391c219044cbf9abdc00ff"} Dec 02 19:40:44 crc kubenswrapper[4878]: I1202 19:40:44.335905 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvflc" event={"ID":"6a8db07b-96b4-4408-b521-a17e7e368de9","Type":"ContainerStarted","Data":"d64d22cca5ada22ad610eea208dd571986a7331a19ef0d18747f7da6bcf2d973"} Dec 02 19:40:44 crc kubenswrapper[4878]: I1202 19:40:44.338215 4878 generic.go:334] "Generic (PLEG): container finished" podID="b32dbe92-5cd4-4a91-990b-f3fc9d82a487" containerID="5fd9d51753c63b8a6f4d24d375c7b4b75d78781383cf71f1f040457195d69330" exitCode=0 Dec 02 19:40:44 crc kubenswrapper[4878]: I1202 19:40:44.338281 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cqzd" event={"ID":"b32dbe92-5cd4-4a91-990b-f3fc9d82a487","Type":"ContainerDied","Data":"5fd9d51753c63b8a6f4d24d375c7b4b75d78781383cf71f1f040457195d69330"} Dec 02 19:40:44 crc kubenswrapper[4878]: I1202 19:40:44.378205 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fvflc" podStartSLOduration=5.679094118 podStartE2EDuration="9.377862527s" podCreationTimestamp="2025-12-02 19:40:35 +0000 UTC" firstStartedPulling="2025-12-02 19:40:38.237686885 +0000 UTC m=+5147.927305756" lastFinishedPulling="2025-12-02 19:40:41.936455284 +0000 UTC m=+5151.626074165" observedRunningTime="2025-12-02 19:40:44.361071173 +0000 UTC m=+5154.050690054" 
watchObservedRunningTime="2025-12-02 19:40:44.377862527 +0000 UTC m=+5154.067481428" Dec 02 19:40:45 crc kubenswrapper[4878]: I1202 19:40:45.360932 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cqzd" event={"ID":"b32dbe92-5cd4-4a91-990b-f3fc9d82a487","Type":"ContainerStarted","Data":"27ae0618ed0bdfeb89799206ec7ba3119efa86e3e78d236930a5ad76edcc2ed1"} Dec 02 19:40:45 crc kubenswrapper[4878]: I1202 19:40:45.389497 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2cqzd" podStartSLOduration=2.869320426 podStartE2EDuration="8.389476081s" podCreationTimestamp="2025-12-02 19:40:37 +0000 UTC" firstStartedPulling="2025-12-02 19:40:39.250171706 +0000 UTC m=+5148.939790587" lastFinishedPulling="2025-12-02 19:40:44.770327341 +0000 UTC m=+5154.459946242" observedRunningTime="2025-12-02 19:40:45.378672924 +0000 UTC m=+5155.068291805" watchObservedRunningTime="2025-12-02 19:40:45.389476081 +0000 UTC m=+5155.079094962" Dec 02 19:40:46 crc kubenswrapper[4878]: I1202 19:40:46.393724 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fvflc" Dec 02 19:40:46 crc kubenswrapper[4878]: I1202 19:40:46.394075 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fvflc" Dec 02 19:40:47 crc kubenswrapper[4878]: I1202 19:40:47.447780 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-fvflc" podUID="6a8db07b-96b4-4408-b521-a17e7e368de9" containerName="registry-server" probeResult="failure" output=< Dec 02 19:40:47 crc kubenswrapper[4878]: timeout: failed to connect service ":50051" within 1s Dec 02 19:40:47 crc kubenswrapper[4878]: > Dec 02 19:40:47 crc kubenswrapper[4878]: I1202 19:40:47.902691 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-2cqzd" Dec 02 19:40:47 crc kubenswrapper[4878]: I1202 19:40:47.902747 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2cqzd" Dec 02 19:40:49 crc kubenswrapper[4878]: I1202 19:40:49.007792 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2cqzd" podUID="b32dbe92-5cd4-4a91-990b-f3fc9d82a487" containerName="registry-server" probeResult="failure" output=< Dec 02 19:40:49 crc kubenswrapper[4878]: timeout: failed to connect service ":50051" within 1s Dec 02 19:40:49 crc kubenswrapper[4878]: > Dec 02 19:40:56 crc kubenswrapper[4878]: I1202 19:40:56.453194 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fvflc" Dec 02 19:40:56 crc kubenswrapper[4878]: I1202 19:40:56.511540 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fvflc" Dec 02 19:40:56 crc kubenswrapper[4878]: I1202 19:40:56.700551 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fvflc"] Dec 02 19:40:57 crc kubenswrapper[4878]: I1202 19:40:57.489532 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fvflc" podUID="6a8db07b-96b4-4408-b521-a17e7e368de9" containerName="registry-server" containerID="cri-o://d64d22cca5ada22ad610eea208dd571986a7331a19ef0d18747f7da6bcf2d973" gracePeriod=2 Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.068880 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fvflc" Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.162110 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a8db07b-96b4-4408-b521-a17e7e368de9-catalog-content\") pod \"6a8db07b-96b4-4408-b521-a17e7e368de9\" (UID: \"6a8db07b-96b4-4408-b521-a17e7e368de9\") " Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.162167 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a8db07b-96b4-4408-b521-a17e7e368de9-utilities\") pod \"6a8db07b-96b4-4408-b521-a17e7e368de9\" (UID: \"6a8db07b-96b4-4408-b521-a17e7e368de9\") " Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.162217 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gqlh\" (UniqueName: \"kubernetes.io/projected/6a8db07b-96b4-4408-b521-a17e7e368de9-kube-api-access-8gqlh\") pod \"6a8db07b-96b4-4408-b521-a17e7e368de9\" (UID: \"6a8db07b-96b4-4408-b521-a17e7e368de9\") " Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.163214 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a8db07b-96b4-4408-b521-a17e7e368de9-utilities" (OuterVolumeSpecName: "utilities") pod "6a8db07b-96b4-4408-b521-a17e7e368de9" (UID: "6a8db07b-96b4-4408-b521-a17e7e368de9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.175148 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a8db07b-96b4-4408-b521-a17e7e368de9-kube-api-access-8gqlh" (OuterVolumeSpecName: "kube-api-access-8gqlh") pod "6a8db07b-96b4-4408-b521-a17e7e368de9" (UID: "6a8db07b-96b4-4408-b521-a17e7e368de9"). InnerVolumeSpecName "kube-api-access-8gqlh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.229430 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a8db07b-96b4-4408-b521-a17e7e368de9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a8db07b-96b4-4408-b521-a17e7e368de9" (UID: "6a8db07b-96b4-4408-b521-a17e7e368de9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.265324 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a8db07b-96b4-4408-b521-a17e7e368de9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.265628 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a8db07b-96b4-4408-b521-a17e7e368de9-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.265704 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gqlh\" (UniqueName: \"kubernetes.io/projected/6a8db07b-96b4-4408-b521-a17e7e368de9-kube-api-access-8gqlh\") on node \"crc\" DevicePath \"\"" Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.501681 4878 generic.go:334] "Generic (PLEG): container finished" podID="6a8db07b-96b4-4408-b521-a17e7e368de9" containerID="d64d22cca5ada22ad610eea208dd571986a7331a19ef0d18747f7da6bcf2d973" exitCode=0 Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.501733 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fvflc" event={"ID":"6a8db07b-96b4-4408-b521-a17e7e368de9","Type":"ContainerDied","Data":"d64d22cca5ada22ad610eea208dd571986a7331a19ef0d18747f7da6bcf2d973"} Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.501793 4878 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-fvflc" event={"ID":"6a8db07b-96b4-4408-b521-a17e7e368de9","Type":"ContainerDied","Data":"8086550cd456ed43c560528a90ff6be3822845f743706b49de70edbd5d3a0cc7"} Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.501816 4878 scope.go:117] "RemoveContainer" containerID="d64d22cca5ada22ad610eea208dd571986a7331a19ef0d18747f7da6bcf2d973" Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.502829 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fvflc" Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.539391 4878 scope.go:117] "RemoveContainer" containerID="cd0443e58ddaf171830f3f3ef2789b3603953a5fc3391c219044cbf9abdc00ff" Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.544852 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fvflc"] Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.556311 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fvflc"] Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.569875 4878 scope.go:117] "RemoveContainer" containerID="489382ed8431ea14cd43ff66fed2fea0d6a8043090619468e333ec2db740d20f" Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.618103 4878 scope.go:117] "RemoveContainer" containerID="d64d22cca5ada22ad610eea208dd571986a7331a19ef0d18747f7da6bcf2d973" Dec 02 19:40:58 crc kubenswrapper[4878]: E1202 19:40:58.619149 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d64d22cca5ada22ad610eea208dd571986a7331a19ef0d18747f7da6bcf2d973\": container with ID starting with d64d22cca5ada22ad610eea208dd571986a7331a19ef0d18747f7da6bcf2d973 not found: ID does not exist" containerID="d64d22cca5ada22ad610eea208dd571986a7331a19ef0d18747f7da6bcf2d973" Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 
19:40:58.619407 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d64d22cca5ada22ad610eea208dd571986a7331a19ef0d18747f7da6bcf2d973"} err="failed to get container status \"d64d22cca5ada22ad610eea208dd571986a7331a19ef0d18747f7da6bcf2d973\": rpc error: code = NotFound desc = could not find container \"d64d22cca5ada22ad610eea208dd571986a7331a19ef0d18747f7da6bcf2d973\": container with ID starting with d64d22cca5ada22ad610eea208dd571986a7331a19ef0d18747f7da6bcf2d973 not found: ID does not exist" Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.619449 4878 scope.go:117] "RemoveContainer" containerID="cd0443e58ddaf171830f3f3ef2789b3603953a5fc3391c219044cbf9abdc00ff" Dec 02 19:40:58 crc kubenswrapper[4878]: E1202 19:40:58.619823 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd0443e58ddaf171830f3f3ef2789b3603953a5fc3391c219044cbf9abdc00ff\": container with ID starting with cd0443e58ddaf171830f3f3ef2789b3603953a5fc3391c219044cbf9abdc00ff not found: ID does not exist" containerID="cd0443e58ddaf171830f3f3ef2789b3603953a5fc3391c219044cbf9abdc00ff" Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.619860 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd0443e58ddaf171830f3f3ef2789b3603953a5fc3391c219044cbf9abdc00ff"} err="failed to get container status \"cd0443e58ddaf171830f3f3ef2789b3603953a5fc3391c219044cbf9abdc00ff\": rpc error: code = NotFound desc = could not find container \"cd0443e58ddaf171830f3f3ef2789b3603953a5fc3391c219044cbf9abdc00ff\": container with ID starting with cd0443e58ddaf171830f3f3ef2789b3603953a5fc3391c219044cbf9abdc00ff not found: ID does not exist" Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.619884 4878 scope.go:117] "RemoveContainer" containerID="489382ed8431ea14cd43ff66fed2fea0d6a8043090619468e333ec2db740d20f" Dec 02 19:40:58 crc 
kubenswrapper[4878]: E1202 19:40:58.620254 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"489382ed8431ea14cd43ff66fed2fea0d6a8043090619468e333ec2db740d20f\": container with ID starting with 489382ed8431ea14cd43ff66fed2fea0d6a8043090619468e333ec2db740d20f not found: ID does not exist" containerID="489382ed8431ea14cd43ff66fed2fea0d6a8043090619468e333ec2db740d20f" Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.620329 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"489382ed8431ea14cd43ff66fed2fea0d6a8043090619468e333ec2db740d20f"} err="failed to get container status \"489382ed8431ea14cd43ff66fed2fea0d6a8043090619468e333ec2db740d20f\": rpc error: code = NotFound desc = could not find container \"489382ed8431ea14cd43ff66fed2fea0d6a8043090619468e333ec2db740d20f\": container with ID starting with 489382ed8431ea14cd43ff66fed2fea0d6a8043090619468e333ec2db740d20f not found: ID does not exist" Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.961590 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a8db07b-96b4-4408-b521-a17e7e368de9" path="/var/lib/kubelet/pods/6a8db07b-96b4-4408-b521-a17e7e368de9/volumes" Dec 02 19:40:58 crc kubenswrapper[4878]: I1202 19:40:58.977310 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2cqzd" podUID="b32dbe92-5cd4-4a91-990b-f3fc9d82a487" containerName="registry-server" probeResult="failure" output=< Dec 02 19:40:58 crc kubenswrapper[4878]: timeout: failed to connect service ":50051" within 1s Dec 02 19:40:58 crc kubenswrapper[4878]: > Dec 02 19:41:07 crc kubenswrapper[4878]: I1202 19:41:07.958921 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2cqzd" Dec 02 19:41:08 crc kubenswrapper[4878]: I1202 19:41:08.019816 4878 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2cqzd" Dec 02 19:41:08 crc kubenswrapper[4878]: I1202 19:41:08.733600 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2cqzd"] Dec 02 19:41:09 crc kubenswrapper[4878]: I1202 19:41:09.666623 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2cqzd" podUID="b32dbe92-5cd4-4a91-990b-f3fc9d82a487" containerName="registry-server" containerID="cri-o://27ae0618ed0bdfeb89799206ec7ba3119efa86e3e78d236930a5ad76edcc2ed1" gracePeriod=2 Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.465451 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2cqzd" Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.633175 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b32dbe92-5cd4-4a91-990b-f3fc9d82a487-utilities\") pod \"b32dbe92-5cd4-4a91-990b-f3fc9d82a487\" (UID: \"b32dbe92-5cd4-4a91-990b-f3fc9d82a487\") " Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.633611 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b32dbe92-5cd4-4a91-990b-f3fc9d82a487-catalog-content\") pod \"b32dbe92-5cd4-4a91-990b-f3fc9d82a487\" (UID: \"b32dbe92-5cd4-4a91-990b-f3fc9d82a487\") " Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.633857 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-885jt\" (UniqueName: \"kubernetes.io/projected/b32dbe92-5cd4-4a91-990b-f3fc9d82a487-kube-api-access-885jt\") pod \"b32dbe92-5cd4-4a91-990b-f3fc9d82a487\" (UID: \"b32dbe92-5cd4-4a91-990b-f3fc9d82a487\") " Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.634360 4878 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b32dbe92-5cd4-4a91-990b-f3fc9d82a487-utilities" (OuterVolumeSpecName: "utilities") pod "b32dbe92-5cd4-4a91-990b-f3fc9d82a487" (UID: "b32dbe92-5cd4-4a91-990b-f3fc9d82a487"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.634831 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b32dbe92-5cd4-4a91-990b-f3fc9d82a487-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.648569 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b32dbe92-5cd4-4a91-990b-f3fc9d82a487-kube-api-access-885jt" (OuterVolumeSpecName: "kube-api-access-885jt") pod "b32dbe92-5cd4-4a91-990b-f3fc9d82a487" (UID: "b32dbe92-5cd4-4a91-990b-f3fc9d82a487"). InnerVolumeSpecName "kube-api-access-885jt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.697795 4878 generic.go:334] "Generic (PLEG): container finished" podID="b32dbe92-5cd4-4a91-990b-f3fc9d82a487" containerID="27ae0618ed0bdfeb89799206ec7ba3119efa86e3e78d236930a5ad76edcc2ed1" exitCode=0 Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.697847 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cqzd" event={"ID":"b32dbe92-5cd4-4a91-990b-f3fc9d82a487","Type":"ContainerDied","Data":"27ae0618ed0bdfeb89799206ec7ba3119efa86e3e78d236930a5ad76edcc2ed1"} Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.697880 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cqzd" event={"ID":"b32dbe92-5cd4-4a91-990b-f3fc9d82a487","Type":"ContainerDied","Data":"9a8a129ebde7c576b005ccdf8841c8d7a53e6419404e5ba48264f6dad4b6a08b"} Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 
19:41:10.697906 4878 scope.go:117] "RemoveContainer" containerID="27ae0618ed0bdfeb89799206ec7ba3119efa86e3e78d236930a5ad76edcc2ed1" Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.698087 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2cqzd" Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.724280 4878 scope.go:117] "RemoveContainer" containerID="5fd9d51753c63b8a6f4d24d375c7b4b75d78781383cf71f1f040457195d69330" Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.739193 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-885jt\" (UniqueName: \"kubernetes.io/projected/b32dbe92-5cd4-4a91-990b-f3fc9d82a487-kube-api-access-885jt\") on node \"crc\" DevicePath \"\"" Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.769395 4878 scope.go:117] "RemoveContainer" containerID="c428433b789c54779a5caaf321401374a1001b385f29f27a45c0e8929957bf82" Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.806707 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b32dbe92-5cd4-4a91-990b-f3fc9d82a487-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b32dbe92-5cd4-4a91-990b-f3fc9d82a487" (UID: "b32dbe92-5cd4-4a91-990b-f3fc9d82a487"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.808103 4878 scope.go:117] "RemoveContainer" containerID="27ae0618ed0bdfeb89799206ec7ba3119efa86e3e78d236930a5ad76edcc2ed1" Dec 02 19:41:10 crc kubenswrapper[4878]: E1202 19:41:10.808643 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27ae0618ed0bdfeb89799206ec7ba3119efa86e3e78d236930a5ad76edcc2ed1\": container with ID starting with 27ae0618ed0bdfeb89799206ec7ba3119efa86e3e78d236930a5ad76edcc2ed1 not found: ID does not exist" containerID="27ae0618ed0bdfeb89799206ec7ba3119efa86e3e78d236930a5ad76edcc2ed1" Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.808714 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27ae0618ed0bdfeb89799206ec7ba3119efa86e3e78d236930a5ad76edcc2ed1"} err="failed to get container status \"27ae0618ed0bdfeb89799206ec7ba3119efa86e3e78d236930a5ad76edcc2ed1\": rpc error: code = NotFound desc = could not find container \"27ae0618ed0bdfeb89799206ec7ba3119efa86e3e78d236930a5ad76edcc2ed1\": container with ID starting with 27ae0618ed0bdfeb89799206ec7ba3119efa86e3e78d236930a5ad76edcc2ed1 not found: ID does not exist" Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.808744 4878 scope.go:117] "RemoveContainer" containerID="5fd9d51753c63b8a6f4d24d375c7b4b75d78781383cf71f1f040457195d69330" Dec 02 19:41:10 crc kubenswrapper[4878]: E1202 19:41:10.809069 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fd9d51753c63b8a6f4d24d375c7b4b75d78781383cf71f1f040457195d69330\": container with ID starting with 5fd9d51753c63b8a6f4d24d375c7b4b75d78781383cf71f1f040457195d69330 not found: ID does not exist" containerID="5fd9d51753c63b8a6f4d24d375c7b4b75d78781383cf71f1f040457195d69330" Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.809092 
4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fd9d51753c63b8a6f4d24d375c7b4b75d78781383cf71f1f040457195d69330"} err="failed to get container status \"5fd9d51753c63b8a6f4d24d375c7b4b75d78781383cf71f1f040457195d69330\": rpc error: code = NotFound desc = could not find container \"5fd9d51753c63b8a6f4d24d375c7b4b75d78781383cf71f1f040457195d69330\": container with ID starting with 5fd9d51753c63b8a6f4d24d375c7b4b75d78781383cf71f1f040457195d69330 not found: ID does not exist" Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.809130 4878 scope.go:117] "RemoveContainer" containerID="c428433b789c54779a5caaf321401374a1001b385f29f27a45c0e8929957bf82" Dec 02 19:41:10 crc kubenswrapper[4878]: E1202 19:41:10.809341 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c428433b789c54779a5caaf321401374a1001b385f29f27a45c0e8929957bf82\": container with ID starting with c428433b789c54779a5caaf321401374a1001b385f29f27a45c0e8929957bf82 not found: ID does not exist" containerID="c428433b789c54779a5caaf321401374a1001b385f29f27a45c0e8929957bf82" Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.809359 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c428433b789c54779a5caaf321401374a1001b385f29f27a45c0e8929957bf82"} err="failed to get container status \"c428433b789c54779a5caaf321401374a1001b385f29f27a45c0e8929957bf82\": rpc error: code = NotFound desc = could not find container \"c428433b789c54779a5caaf321401374a1001b385f29f27a45c0e8929957bf82\": container with ID starting with c428433b789c54779a5caaf321401374a1001b385f29f27a45c0e8929957bf82 not found: ID does not exist" Dec 02 19:41:10 crc kubenswrapper[4878]: I1202 19:41:10.841214 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b32dbe92-5cd4-4a91-990b-f3fc9d82a487-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:41:11 crc kubenswrapper[4878]: I1202 19:41:11.033273 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2cqzd"] Dec 02 19:41:11 crc kubenswrapper[4878]: I1202 19:41:11.046604 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2cqzd"] Dec 02 19:41:12 crc kubenswrapper[4878]: I1202 19:41:12.957392 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b32dbe92-5cd4-4a91-990b-f3fc9d82a487" path="/var/lib/kubelet/pods/b32dbe92-5cd4-4a91-990b-f3fc9d82a487/volumes" Dec 02 19:41:53 crc kubenswrapper[4878]: I1202 19:41:53.742384 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:41:53 crc kubenswrapper[4878]: I1202 19:41:53.742892 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:41:54 crc kubenswrapper[4878]: I1202 19:41:54.898708 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b885s"] Dec 02 19:41:54 crc kubenswrapper[4878]: E1202 19:41:54.900113 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a8db07b-96b4-4408-b521-a17e7e368de9" containerName="extract-content" Dec 02 19:41:54 crc kubenswrapper[4878]: I1202 19:41:54.900137 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8db07b-96b4-4408-b521-a17e7e368de9" 
containerName="extract-content" Dec 02 19:41:54 crc kubenswrapper[4878]: E1202 19:41:54.900160 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a8db07b-96b4-4408-b521-a17e7e368de9" containerName="registry-server" Dec 02 19:41:54 crc kubenswrapper[4878]: I1202 19:41:54.900173 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8db07b-96b4-4408-b521-a17e7e368de9" containerName="registry-server" Dec 02 19:41:54 crc kubenswrapper[4878]: E1202 19:41:54.900200 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32dbe92-5cd4-4a91-990b-f3fc9d82a487" containerName="extract-content" Dec 02 19:41:54 crc kubenswrapper[4878]: I1202 19:41:54.900213 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32dbe92-5cd4-4a91-990b-f3fc9d82a487" containerName="extract-content" Dec 02 19:41:54 crc kubenswrapper[4878]: E1202 19:41:54.900299 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32dbe92-5cd4-4a91-990b-f3fc9d82a487" containerName="extract-utilities" Dec 02 19:41:54 crc kubenswrapper[4878]: I1202 19:41:54.900313 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32dbe92-5cd4-4a91-990b-f3fc9d82a487" containerName="extract-utilities" Dec 02 19:41:54 crc kubenswrapper[4878]: E1202 19:41:54.900341 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a8db07b-96b4-4408-b521-a17e7e368de9" containerName="extract-utilities" Dec 02 19:41:54 crc kubenswrapper[4878]: I1202 19:41:54.900353 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8db07b-96b4-4408-b521-a17e7e368de9" containerName="extract-utilities" Dec 02 19:41:54 crc kubenswrapper[4878]: E1202 19:41:54.900399 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32dbe92-5cd4-4a91-990b-f3fc9d82a487" containerName="registry-server" Dec 02 19:41:54 crc kubenswrapper[4878]: I1202 19:41:54.900417 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32dbe92-5cd4-4a91-990b-f3fc9d82a487" 
containerName="registry-server" Dec 02 19:41:54 crc kubenswrapper[4878]: I1202 19:41:54.900871 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a8db07b-96b4-4408-b521-a17e7e368de9" containerName="registry-server" Dec 02 19:41:54 crc kubenswrapper[4878]: I1202 19:41:54.900921 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="b32dbe92-5cd4-4a91-990b-f3fc9d82a487" containerName="registry-server" Dec 02 19:41:54 crc kubenswrapper[4878]: I1202 19:41:54.904232 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b885s" Dec 02 19:41:54 crc kubenswrapper[4878]: I1202 19:41:54.922352 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b885s"] Dec 02 19:41:54 crc kubenswrapper[4878]: I1202 19:41:54.993914 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5-catalog-content\") pod \"redhat-marketplace-b885s\" (UID: \"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5\") " pod="openshift-marketplace/redhat-marketplace-b885s" Dec 02 19:41:54 crc kubenswrapper[4878]: I1202 19:41:54.994093 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvvhm\" (UniqueName: \"kubernetes.io/projected/3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5-kube-api-access-zvvhm\") pod \"redhat-marketplace-b885s\" (UID: \"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5\") " pod="openshift-marketplace/redhat-marketplace-b885s" Dec 02 19:41:54 crc kubenswrapper[4878]: I1202 19:41:54.994277 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5-utilities\") pod \"redhat-marketplace-b885s\" (UID: \"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5\") " 
pod="openshift-marketplace/redhat-marketplace-b885s" Dec 02 19:41:55 crc kubenswrapper[4878]: I1202 19:41:55.095854 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvvhm\" (UniqueName: \"kubernetes.io/projected/3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5-kube-api-access-zvvhm\") pod \"redhat-marketplace-b885s\" (UID: \"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5\") " pod="openshift-marketplace/redhat-marketplace-b885s" Dec 02 19:41:55 crc kubenswrapper[4878]: I1202 19:41:55.096009 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5-utilities\") pod \"redhat-marketplace-b885s\" (UID: \"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5\") " pod="openshift-marketplace/redhat-marketplace-b885s" Dec 02 19:41:55 crc kubenswrapper[4878]: I1202 19:41:55.096333 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5-catalog-content\") pod \"redhat-marketplace-b885s\" (UID: \"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5\") " pod="openshift-marketplace/redhat-marketplace-b885s" Dec 02 19:41:55 crc kubenswrapper[4878]: I1202 19:41:55.096637 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5-utilities\") pod \"redhat-marketplace-b885s\" (UID: \"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5\") " pod="openshift-marketplace/redhat-marketplace-b885s" Dec 02 19:41:55 crc kubenswrapper[4878]: I1202 19:41:55.096830 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5-catalog-content\") pod \"redhat-marketplace-b885s\" (UID: \"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5\") " pod="openshift-marketplace/redhat-marketplace-b885s" 
Dec 02 19:41:55 crc kubenswrapper[4878]: I1202 19:41:55.116428 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvvhm\" (UniqueName: \"kubernetes.io/projected/3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5-kube-api-access-zvvhm\") pod \"redhat-marketplace-b885s\" (UID: \"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5\") " pod="openshift-marketplace/redhat-marketplace-b885s" Dec 02 19:41:55 crc kubenswrapper[4878]: I1202 19:41:55.246835 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b885s" Dec 02 19:41:55 crc kubenswrapper[4878]: I1202 19:41:55.809914 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b885s"] Dec 02 19:41:56 crc kubenswrapper[4878]: I1202 19:41:56.348730 4878 generic.go:334] "Generic (PLEG): container finished" podID="3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5" containerID="2ecbb58d3016cfee6a2e79f4c7da636bd20996f349f5cf165dff38cf0311f727" exitCode=0 Dec 02 19:41:56 crc kubenswrapper[4878]: I1202 19:41:56.348811 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b885s" event={"ID":"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5","Type":"ContainerDied","Data":"2ecbb58d3016cfee6a2e79f4c7da636bd20996f349f5cf165dff38cf0311f727"} Dec 02 19:41:56 crc kubenswrapper[4878]: I1202 19:41:56.349045 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b885s" event={"ID":"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5","Type":"ContainerStarted","Data":"2c51cfb99ff739f19a408192f039252e6acdfb50dad3b697a4ef4f553abee0a5"} Dec 02 19:41:57 crc kubenswrapper[4878]: I1202 19:41:57.360752 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b885s" event={"ID":"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5","Type":"ContainerStarted","Data":"17560ffa88eaead57f0a4f3bf2304b5140cefaff3db9f9911f36ccded220fe55"} Dec 02 
19:41:58 crc kubenswrapper[4878]: I1202 19:41:58.374654 4878 generic.go:334] "Generic (PLEG): container finished" podID="3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5" containerID="17560ffa88eaead57f0a4f3bf2304b5140cefaff3db9f9911f36ccded220fe55" exitCode=0 Dec 02 19:41:58 crc kubenswrapper[4878]: I1202 19:41:58.374747 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b885s" event={"ID":"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5","Type":"ContainerDied","Data":"17560ffa88eaead57f0a4f3bf2304b5140cefaff3db9f9911f36ccded220fe55"} Dec 02 19:41:59 crc kubenswrapper[4878]: I1202 19:41:59.404193 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b885s" event={"ID":"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5","Type":"ContainerStarted","Data":"715fc18ea80ac6e83fc31ceeda8d9e095795350b5c7d5caae5ca81c203be0be5"} Dec 02 19:41:59 crc kubenswrapper[4878]: I1202 19:41:59.433435 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b885s" podStartSLOduration=3.052405179 podStartE2EDuration="5.433417586s" podCreationTimestamp="2025-12-02 19:41:54 +0000 UTC" firstStartedPulling="2025-12-02 19:41:56.353416106 +0000 UTC m=+5226.043035007" lastFinishedPulling="2025-12-02 19:41:58.734428533 +0000 UTC m=+5228.424047414" observedRunningTime="2025-12-02 19:41:59.424381805 +0000 UTC m=+5229.114000696" watchObservedRunningTime="2025-12-02 19:41:59.433417586 +0000 UTC m=+5229.123036467" Dec 02 19:42:05 crc kubenswrapper[4878]: I1202 19:42:05.247198 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b885s" Dec 02 19:42:05 crc kubenswrapper[4878]: I1202 19:42:05.247937 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b885s" Dec 02 19:42:05 crc kubenswrapper[4878]: I1202 19:42:05.335365 4878 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b885s" Dec 02 19:42:05 crc kubenswrapper[4878]: I1202 19:42:05.553687 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b885s" Dec 02 19:42:05 crc kubenswrapper[4878]: I1202 19:42:05.637309 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b885s"] Dec 02 19:42:07 crc kubenswrapper[4878]: I1202 19:42:07.529680 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b885s" podUID="3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5" containerName="registry-server" containerID="cri-o://715fc18ea80ac6e83fc31ceeda8d9e095795350b5c7d5caae5ca81c203be0be5" gracePeriod=2 Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.215735 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b885s" Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.296872 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvvhm\" (UniqueName: \"kubernetes.io/projected/3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5-kube-api-access-zvvhm\") pod \"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5\" (UID: \"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5\") " Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.297086 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5-utilities\") pod \"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5\" (UID: \"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5\") " Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.297222 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5-catalog-content\") 
pod \"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5\" (UID: \"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5\") " Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.298145 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5-utilities" (OuterVolumeSpecName: "utilities") pod "3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5" (UID: "3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.298798 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.304850 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5-kube-api-access-zvvhm" (OuterVolumeSpecName: "kube-api-access-zvvhm") pod "3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5" (UID: "3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5"). InnerVolumeSpecName "kube-api-access-zvvhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.320142 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5" (UID: "3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.405056 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvvhm\" (UniqueName: \"kubernetes.io/projected/3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5-kube-api-access-zvvhm\") on node \"crc\" DevicePath \"\"" Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.405098 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.544115 4878 generic.go:334] "Generic (PLEG): container finished" podID="3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5" containerID="715fc18ea80ac6e83fc31ceeda8d9e095795350b5c7d5caae5ca81c203be0be5" exitCode=0 Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.544166 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b885s" event={"ID":"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5","Type":"ContainerDied","Data":"715fc18ea80ac6e83fc31ceeda8d9e095795350b5c7d5caae5ca81c203be0be5"} Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.544202 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b885s" event={"ID":"3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5","Type":"ContainerDied","Data":"2c51cfb99ff739f19a408192f039252e6acdfb50dad3b697a4ef4f553abee0a5"} Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.544224 4878 scope.go:117] "RemoveContainer" containerID="715fc18ea80ac6e83fc31ceeda8d9e095795350b5c7d5caae5ca81c203be0be5" Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.544250 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b885s" Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.580659 4878 scope.go:117] "RemoveContainer" containerID="17560ffa88eaead57f0a4f3bf2304b5140cefaff3db9f9911f36ccded220fe55" Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.590083 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b885s"] Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.615118 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b885s"] Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.615540 4878 scope.go:117] "RemoveContainer" containerID="2ecbb58d3016cfee6a2e79f4c7da636bd20996f349f5cf165dff38cf0311f727" Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.672131 4878 scope.go:117] "RemoveContainer" containerID="715fc18ea80ac6e83fc31ceeda8d9e095795350b5c7d5caae5ca81c203be0be5" Dec 02 19:42:08 crc kubenswrapper[4878]: E1202 19:42:08.672748 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"715fc18ea80ac6e83fc31ceeda8d9e095795350b5c7d5caae5ca81c203be0be5\": container with ID starting with 715fc18ea80ac6e83fc31ceeda8d9e095795350b5c7d5caae5ca81c203be0be5 not found: ID does not exist" containerID="715fc18ea80ac6e83fc31ceeda8d9e095795350b5c7d5caae5ca81c203be0be5" Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.672818 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"715fc18ea80ac6e83fc31ceeda8d9e095795350b5c7d5caae5ca81c203be0be5"} err="failed to get container status \"715fc18ea80ac6e83fc31ceeda8d9e095795350b5c7d5caae5ca81c203be0be5\": rpc error: code = NotFound desc = could not find container \"715fc18ea80ac6e83fc31ceeda8d9e095795350b5c7d5caae5ca81c203be0be5\": container with ID starting with 715fc18ea80ac6e83fc31ceeda8d9e095795350b5c7d5caae5ca81c203be0be5 not found: 
ID does not exist" Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.672865 4878 scope.go:117] "RemoveContainer" containerID="17560ffa88eaead57f0a4f3bf2304b5140cefaff3db9f9911f36ccded220fe55" Dec 02 19:42:08 crc kubenswrapper[4878]: E1202 19:42:08.673203 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17560ffa88eaead57f0a4f3bf2304b5140cefaff3db9f9911f36ccded220fe55\": container with ID starting with 17560ffa88eaead57f0a4f3bf2304b5140cefaff3db9f9911f36ccded220fe55 not found: ID does not exist" containerID="17560ffa88eaead57f0a4f3bf2304b5140cefaff3db9f9911f36ccded220fe55" Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.673282 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17560ffa88eaead57f0a4f3bf2304b5140cefaff3db9f9911f36ccded220fe55"} err="failed to get container status \"17560ffa88eaead57f0a4f3bf2304b5140cefaff3db9f9911f36ccded220fe55\": rpc error: code = NotFound desc = could not find container \"17560ffa88eaead57f0a4f3bf2304b5140cefaff3db9f9911f36ccded220fe55\": container with ID starting with 17560ffa88eaead57f0a4f3bf2304b5140cefaff3db9f9911f36ccded220fe55 not found: ID does not exist" Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.673317 4878 scope.go:117] "RemoveContainer" containerID="2ecbb58d3016cfee6a2e79f4c7da636bd20996f349f5cf165dff38cf0311f727" Dec 02 19:42:08 crc kubenswrapper[4878]: E1202 19:42:08.673814 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ecbb58d3016cfee6a2e79f4c7da636bd20996f349f5cf165dff38cf0311f727\": container with ID starting with 2ecbb58d3016cfee6a2e79f4c7da636bd20996f349f5cf165dff38cf0311f727 not found: ID does not exist" containerID="2ecbb58d3016cfee6a2e79f4c7da636bd20996f349f5cf165dff38cf0311f727" Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.673845 4878 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ecbb58d3016cfee6a2e79f4c7da636bd20996f349f5cf165dff38cf0311f727"} err="failed to get container status \"2ecbb58d3016cfee6a2e79f4c7da636bd20996f349f5cf165dff38cf0311f727\": rpc error: code = NotFound desc = could not find container \"2ecbb58d3016cfee6a2e79f4c7da636bd20996f349f5cf165dff38cf0311f727\": container with ID starting with 2ecbb58d3016cfee6a2e79f4c7da636bd20996f349f5cf165dff38cf0311f727 not found: ID does not exist" Dec 02 19:42:08 crc kubenswrapper[4878]: I1202 19:42:08.951396 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5" path="/var/lib/kubelet/pods/3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5/volumes" Dec 02 19:42:23 crc kubenswrapper[4878]: I1202 19:42:23.742597 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:42:23 crc kubenswrapper[4878]: I1202 19:42:23.743384 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:42:53 crc kubenswrapper[4878]: I1202 19:42:53.742461 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:42:53 crc kubenswrapper[4878]: I1202 19:42:53.743211 4878 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:42:53 crc kubenswrapper[4878]: I1202 19:42:53.743335 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 19:42:53 crc kubenswrapper[4878]: I1202 19:42:53.744974 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12"} pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 19:42:53 crc kubenswrapper[4878]: I1202 19:42:53.745097 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" gracePeriod=600 Dec 02 19:42:53 crc kubenswrapper[4878]: E1202 19:42:53.888710 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:42:54 crc kubenswrapper[4878]: I1202 19:42:54.204984 4878 generic.go:334] "Generic (PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" 
containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" exitCode=0 Dec 02 19:42:54 crc kubenswrapper[4878]: I1202 19:42:54.205038 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12"} Dec 02 19:42:54 crc kubenswrapper[4878]: I1202 19:42:54.205087 4878 scope.go:117] "RemoveContainer" containerID="10aced38f4d88a43bcff0a819879536205f756760990e6a619e365b4851df7fb" Dec 02 19:42:54 crc kubenswrapper[4878]: I1202 19:42:54.206192 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:42:54 crc kubenswrapper[4878]: E1202 19:42:54.206733 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:43:06 crc kubenswrapper[4878]: I1202 19:43:06.938987 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:43:06 crc kubenswrapper[4878]: E1202 19:43:06.939886 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:43:19 crc kubenswrapper[4878]: I1202 
19:43:19.938970 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:43:19 crc kubenswrapper[4878]: E1202 19:43:19.940153 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:43:31 crc kubenswrapper[4878]: I1202 19:43:31.939799 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:43:31 crc kubenswrapper[4878]: E1202 19:43:31.940761 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:43:42 crc kubenswrapper[4878]: I1202 19:43:42.906996 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lq95x"] Dec 02 19:43:42 crc kubenswrapper[4878]: E1202 19:43:42.907914 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5" containerName="extract-utilities" Dec 02 19:43:42 crc kubenswrapper[4878]: I1202 19:43:42.907928 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5" containerName="extract-utilities" Dec 02 19:43:42 crc kubenswrapper[4878]: E1202 19:43:42.907944 4878 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5" containerName="extract-content" Dec 02 19:43:42 crc kubenswrapper[4878]: I1202 19:43:42.907951 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5" containerName="extract-content" Dec 02 19:43:42 crc kubenswrapper[4878]: E1202 19:43:42.907968 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5" containerName="registry-server" Dec 02 19:43:42 crc kubenswrapper[4878]: I1202 19:43:42.907975 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5" containerName="registry-server" Dec 02 19:43:42 crc kubenswrapper[4878]: I1202 19:43:42.908257 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c2bb8bb-ea81-4ef0-9ba7-5da6021d8ac5" containerName="registry-server" Dec 02 19:43:42 crc kubenswrapper[4878]: I1202 19:43:42.909854 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lq95x" Dec 02 19:43:42 crc kubenswrapper[4878]: I1202 19:43:42.957644 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lq95x"] Dec 02 19:43:43 crc kubenswrapper[4878]: I1202 19:43:43.027470 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0bc76d4-e2ae-4f06-8673-08bd5380b944-catalog-content\") pod \"certified-operators-lq95x\" (UID: \"f0bc76d4-e2ae-4f06-8673-08bd5380b944\") " pod="openshift-marketplace/certified-operators-lq95x" Dec 02 19:43:43 crc kubenswrapper[4878]: I1202 19:43:43.027685 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j49ts\" (UniqueName: \"kubernetes.io/projected/f0bc76d4-e2ae-4f06-8673-08bd5380b944-kube-api-access-j49ts\") pod \"certified-operators-lq95x\" (UID: 
\"f0bc76d4-e2ae-4f06-8673-08bd5380b944\") " pod="openshift-marketplace/certified-operators-lq95x" Dec 02 19:43:43 crc kubenswrapper[4878]: I1202 19:43:43.028439 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0bc76d4-e2ae-4f06-8673-08bd5380b944-utilities\") pod \"certified-operators-lq95x\" (UID: \"f0bc76d4-e2ae-4f06-8673-08bd5380b944\") " pod="openshift-marketplace/certified-operators-lq95x" Dec 02 19:43:43 crc kubenswrapper[4878]: I1202 19:43:43.131073 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0bc76d4-e2ae-4f06-8673-08bd5380b944-utilities\") pod \"certified-operators-lq95x\" (UID: \"f0bc76d4-e2ae-4f06-8673-08bd5380b944\") " pod="openshift-marketplace/certified-operators-lq95x" Dec 02 19:43:43 crc kubenswrapper[4878]: I1202 19:43:43.131334 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0bc76d4-e2ae-4f06-8673-08bd5380b944-catalog-content\") pod \"certified-operators-lq95x\" (UID: \"f0bc76d4-e2ae-4f06-8673-08bd5380b944\") " pod="openshift-marketplace/certified-operators-lq95x" Dec 02 19:43:43 crc kubenswrapper[4878]: I1202 19:43:43.131406 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j49ts\" (UniqueName: \"kubernetes.io/projected/f0bc76d4-e2ae-4f06-8673-08bd5380b944-kube-api-access-j49ts\") pod \"certified-operators-lq95x\" (UID: \"f0bc76d4-e2ae-4f06-8673-08bd5380b944\") " pod="openshift-marketplace/certified-operators-lq95x" Dec 02 19:43:43 crc kubenswrapper[4878]: I1202 19:43:43.132663 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0bc76d4-e2ae-4f06-8673-08bd5380b944-catalog-content\") pod \"certified-operators-lq95x\" (UID: 
\"f0bc76d4-e2ae-4f06-8673-08bd5380b944\") " pod="openshift-marketplace/certified-operators-lq95x" Dec 02 19:43:43 crc kubenswrapper[4878]: I1202 19:43:43.132786 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0bc76d4-e2ae-4f06-8673-08bd5380b944-utilities\") pod \"certified-operators-lq95x\" (UID: \"f0bc76d4-e2ae-4f06-8673-08bd5380b944\") " pod="openshift-marketplace/certified-operators-lq95x" Dec 02 19:43:43 crc kubenswrapper[4878]: I1202 19:43:43.165018 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j49ts\" (UniqueName: \"kubernetes.io/projected/f0bc76d4-e2ae-4f06-8673-08bd5380b944-kube-api-access-j49ts\") pod \"certified-operators-lq95x\" (UID: \"f0bc76d4-e2ae-4f06-8673-08bd5380b944\") " pod="openshift-marketplace/certified-operators-lq95x" Dec 02 19:43:43 crc kubenswrapper[4878]: I1202 19:43:43.247718 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lq95x" Dec 02 19:43:43 crc kubenswrapper[4878]: I1202 19:43:43.796274 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lq95x"] Dec 02 19:43:44 crc kubenswrapper[4878]: I1202 19:43:44.153088 4878 generic.go:334] "Generic (PLEG): container finished" podID="f0bc76d4-e2ae-4f06-8673-08bd5380b944" containerID="41fad9e35204fce9456525fe0df6305b93d8ea29c4aefa65205325b79be384ed" exitCode=0 Dec 02 19:43:44 crc kubenswrapper[4878]: I1202 19:43:44.153145 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lq95x" event={"ID":"f0bc76d4-e2ae-4f06-8673-08bd5380b944","Type":"ContainerDied","Data":"41fad9e35204fce9456525fe0df6305b93d8ea29c4aefa65205325b79be384ed"} Dec 02 19:43:44 crc kubenswrapper[4878]: I1202 19:43:44.153405 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lq95x" 
event={"ID":"f0bc76d4-e2ae-4f06-8673-08bd5380b944","Type":"ContainerStarted","Data":"c8f5d9bcf6873752edd27ce70980412dae6a52e187e72ed498efefcc8789f5a9"} Dec 02 19:43:45 crc kubenswrapper[4878]: I1202 19:43:45.937787 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:43:45 crc kubenswrapper[4878]: E1202 19:43:45.938737 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:43:46 crc kubenswrapper[4878]: I1202 19:43:46.178338 4878 generic.go:334] "Generic (PLEG): container finished" podID="f0bc76d4-e2ae-4f06-8673-08bd5380b944" containerID="03bd33dc927fd95453f9145567849d1d9d6794d68c14ec0e68208e871358db4a" exitCode=0 Dec 02 19:43:46 crc kubenswrapper[4878]: I1202 19:43:46.178390 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lq95x" event={"ID":"f0bc76d4-e2ae-4f06-8673-08bd5380b944","Type":"ContainerDied","Data":"03bd33dc927fd95453f9145567849d1d9d6794d68c14ec0e68208e871358db4a"} Dec 02 19:43:46 crc kubenswrapper[4878]: E1202 19:43:46.414406 4878 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0bc76d4_e2ae_4f06_8673_08bd5380b944.slice/crio-conmon-03bd33dc927fd95453f9145567849d1d9d6794d68c14ec0e68208e871358db4a.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0bc76d4_e2ae_4f06_8673_08bd5380b944.slice/crio-03bd33dc927fd95453f9145567849d1d9d6794d68c14ec0e68208e871358db4a.scope\": RecentStats: unable to find data in memory cache]" Dec 02 19:43:47 crc kubenswrapper[4878]: I1202 19:43:47.191563 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lq95x" event={"ID":"f0bc76d4-e2ae-4f06-8673-08bd5380b944","Type":"ContainerStarted","Data":"b72d0b98f13bd0f305be7b727267cb43a72443cc93d25f4abf78bbe0400b3f6f"} Dec 02 19:43:47 crc kubenswrapper[4878]: I1202 19:43:47.217217 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lq95x" podStartSLOduration=2.6589796100000003 podStartE2EDuration="5.217198462s" podCreationTimestamp="2025-12-02 19:43:42 +0000 UTC" firstStartedPulling="2025-12-02 19:43:44.155439166 +0000 UTC m=+5333.845058047" lastFinishedPulling="2025-12-02 19:43:46.713658018 +0000 UTC m=+5336.403276899" observedRunningTime="2025-12-02 19:43:47.207893472 +0000 UTC m=+5336.897512413" watchObservedRunningTime="2025-12-02 19:43:47.217198462 +0000 UTC m=+5336.906817343" Dec 02 19:43:53 crc kubenswrapper[4878]: I1202 19:43:53.248636 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lq95x" Dec 02 19:43:53 crc kubenswrapper[4878]: I1202 19:43:53.249152 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lq95x" Dec 02 19:43:53 crc kubenswrapper[4878]: I1202 19:43:53.325134 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lq95x" Dec 02 19:43:54 crc kubenswrapper[4878]: I1202 19:43:54.343965 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lq95x" Dec 02 19:43:55 crc 
kubenswrapper[4878]: I1202 19:43:55.106743 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lq95x"] Dec 02 19:43:56 crc kubenswrapper[4878]: I1202 19:43:56.281146 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lq95x" podUID="f0bc76d4-e2ae-4f06-8673-08bd5380b944" containerName="registry-server" containerID="cri-o://b72d0b98f13bd0f305be7b727267cb43a72443cc93d25f4abf78bbe0400b3f6f" gracePeriod=2 Dec 02 19:43:56 crc kubenswrapper[4878]: I1202 19:43:56.878397 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lq95x" Dec 02 19:43:56 crc kubenswrapper[4878]: I1202 19:43:56.917846 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0bc76d4-e2ae-4f06-8673-08bd5380b944-utilities\") pod \"f0bc76d4-e2ae-4f06-8673-08bd5380b944\" (UID: \"f0bc76d4-e2ae-4f06-8673-08bd5380b944\") " Dec 02 19:43:56 crc kubenswrapper[4878]: I1202 19:43:56.918169 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j49ts\" (UniqueName: \"kubernetes.io/projected/f0bc76d4-e2ae-4f06-8673-08bd5380b944-kube-api-access-j49ts\") pod \"f0bc76d4-e2ae-4f06-8673-08bd5380b944\" (UID: \"f0bc76d4-e2ae-4f06-8673-08bd5380b944\") " Dec 02 19:43:56 crc kubenswrapper[4878]: I1202 19:43:56.918273 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0bc76d4-e2ae-4f06-8673-08bd5380b944-catalog-content\") pod \"f0bc76d4-e2ae-4f06-8673-08bd5380b944\" (UID: \"f0bc76d4-e2ae-4f06-8673-08bd5380b944\") " Dec 02 19:43:56 crc kubenswrapper[4878]: I1202 19:43:56.919943 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f0bc76d4-e2ae-4f06-8673-08bd5380b944-utilities" (OuterVolumeSpecName: "utilities") pod "f0bc76d4-e2ae-4f06-8673-08bd5380b944" (UID: "f0bc76d4-e2ae-4f06-8673-08bd5380b944"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:43:56 crc kubenswrapper[4878]: I1202 19:43:56.928455 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0bc76d4-e2ae-4f06-8673-08bd5380b944-kube-api-access-j49ts" (OuterVolumeSpecName: "kube-api-access-j49ts") pod "f0bc76d4-e2ae-4f06-8673-08bd5380b944" (UID: "f0bc76d4-e2ae-4f06-8673-08bd5380b944"). InnerVolumeSpecName "kube-api-access-j49ts". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:43:56 crc kubenswrapper[4878]: I1202 19:43:56.944589 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:43:56 crc kubenswrapper[4878]: E1202 19:43:56.944921 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:43:56 crc kubenswrapper[4878]: I1202 19:43:56.988055 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0bc76d4-e2ae-4f06-8673-08bd5380b944-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0bc76d4-e2ae-4f06-8673-08bd5380b944" (UID: "f0bc76d4-e2ae-4f06-8673-08bd5380b944"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:43:57 crc kubenswrapper[4878]: I1202 19:43:57.022042 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0bc76d4-e2ae-4f06-8673-08bd5380b944-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:43:57 crc kubenswrapper[4878]: I1202 19:43:57.022086 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j49ts\" (UniqueName: \"kubernetes.io/projected/f0bc76d4-e2ae-4f06-8673-08bd5380b944-kube-api-access-j49ts\") on node \"crc\" DevicePath \"\"" Dec 02 19:43:57 crc kubenswrapper[4878]: I1202 19:43:57.022103 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0bc76d4-e2ae-4f06-8673-08bd5380b944-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:43:57 crc kubenswrapper[4878]: I1202 19:43:57.322653 4878 generic.go:334] "Generic (PLEG): container finished" podID="f0bc76d4-e2ae-4f06-8673-08bd5380b944" containerID="b72d0b98f13bd0f305be7b727267cb43a72443cc93d25f4abf78bbe0400b3f6f" exitCode=0 Dec 02 19:43:57 crc kubenswrapper[4878]: I1202 19:43:57.322715 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lq95x" event={"ID":"f0bc76d4-e2ae-4f06-8673-08bd5380b944","Type":"ContainerDied","Data":"b72d0b98f13bd0f305be7b727267cb43a72443cc93d25f4abf78bbe0400b3f6f"} Dec 02 19:43:57 crc kubenswrapper[4878]: I1202 19:43:57.322952 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lq95x" event={"ID":"f0bc76d4-e2ae-4f06-8673-08bd5380b944","Type":"ContainerDied","Data":"c8f5d9bcf6873752edd27ce70980412dae6a52e187e72ed498efefcc8789f5a9"} Dec 02 19:43:57 crc kubenswrapper[4878]: I1202 19:43:57.322981 4878 scope.go:117] "RemoveContainer" containerID="b72d0b98f13bd0f305be7b727267cb43a72443cc93d25f4abf78bbe0400b3f6f" Dec 02 19:43:57 crc kubenswrapper[4878]: I1202 
19:43:57.322778 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lq95x" Dec 02 19:43:57 crc kubenswrapper[4878]: I1202 19:43:57.365653 4878 scope.go:117] "RemoveContainer" containerID="03bd33dc927fd95453f9145567849d1d9d6794d68c14ec0e68208e871358db4a" Dec 02 19:43:57 crc kubenswrapper[4878]: I1202 19:43:57.371510 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lq95x"] Dec 02 19:43:57 crc kubenswrapper[4878]: I1202 19:43:57.387153 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lq95x"] Dec 02 19:43:57 crc kubenswrapper[4878]: I1202 19:43:57.394571 4878 scope.go:117] "RemoveContainer" containerID="41fad9e35204fce9456525fe0df6305b93d8ea29c4aefa65205325b79be384ed" Dec 02 19:43:57 crc kubenswrapper[4878]: I1202 19:43:57.477699 4878 scope.go:117] "RemoveContainer" containerID="b72d0b98f13bd0f305be7b727267cb43a72443cc93d25f4abf78bbe0400b3f6f" Dec 02 19:43:57 crc kubenswrapper[4878]: E1202 19:43:57.482479 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b72d0b98f13bd0f305be7b727267cb43a72443cc93d25f4abf78bbe0400b3f6f\": container with ID starting with b72d0b98f13bd0f305be7b727267cb43a72443cc93d25f4abf78bbe0400b3f6f not found: ID does not exist" containerID="b72d0b98f13bd0f305be7b727267cb43a72443cc93d25f4abf78bbe0400b3f6f" Dec 02 19:43:57 crc kubenswrapper[4878]: I1202 19:43:57.482529 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b72d0b98f13bd0f305be7b727267cb43a72443cc93d25f4abf78bbe0400b3f6f"} err="failed to get container status \"b72d0b98f13bd0f305be7b727267cb43a72443cc93d25f4abf78bbe0400b3f6f\": rpc error: code = NotFound desc = could not find container \"b72d0b98f13bd0f305be7b727267cb43a72443cc93d25f4abf78bbe0400b3f6f\": container with ID starting with 
b72d0b98f13bd0f305be7b727267cb43a72443cc93d25f4abf78bbe0400b3f6f not found: ID does not exist" Dec 02 19:43:57 crc kubenswrapper[4878]: I1202 19:43:57.482554 4878 scope.go:117] "RemoveContainer" containerID="03bd33dc927fd95453f9145567849d1d9d6794d68c14ec0e68208e871358db4a" Dec 02 19:43:57 crc kubenswrapper[4878]: E1202 19:43:57.483160 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03bd33dc927fd95453f9145567849d1d9d6794d68c14ec0e68208e871358db4a\": container with ID starting with 03bd33dc927fd95453f9145567849d1d9d6794d68c14ec0e68208e871358db4a not found: ID does not exist" containerID="03bd33dc927fd95453f9145567849d1d9d6794d68c14ec0e68208e871358db4a" Dec 02 19:43:57 crc kubenswrapper[4878]: I1202 19:43:57.483182 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03bd33dc927fd95453f9145567849d1d9d6794d68c14ec0e68208e871358db4a"} err="failed to get container status \"03bd33dc927fd95453f9145567849d1d9d6794d68c14ec0e68208e871358db4a\": rpc error: code = NotFound desc = could not find container \"03bd33dc927fd95453f9145567849d1d9d6794d68c14ec0e68208e871358db4a\": container with ID starting with 03bd33dc927fd95453f9145567849d1d9d6794d68c14ec0e68208e871358db4a not found: ID does not exist" Dec 02 19:43:57 crc kubenswrapper[4878]: I1202 19:43:57.483196 4878 scope.go:117] "RemoveContainer" containerID="41fad9e35204fce9456525fe0df6305b93d8ea29c4aefa65205325b79be384ed" Dec 02 19:43:57 crc kubenswrapper[4878]: E1202 19:43:57.484481 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41fad9e35204fce9456525fe0df6305b93d8ea29c4aefa65205325b79be384ed\": container with ID starting with 41fad9e35204fce9456525fe0df6305b93d8ea29c4aefa65205325b79be384ed not found: ID does not exist" containerID="41fad9e35204fce9456525fe0df6305b93d8ea29c4aefa65205325b79be384ed" Dec 02 19:43:57 crc 
kubenswrapper[4878]: I1202 19:43:57.484512 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41fad9e35204fce9456525fe0df6305b93d8ea29c4aefa65205325b79be384ed"} err="failed to get container status \"41fad9e35204fce9456525fe0df6305b93d8ea29c4aefa65205325b79be384ed\": rpc error: code = NotFound desc = could not find container \"41fad9e35204fce9456525fe0df6305b93d8ea29c4aefa65205325b79be384ed\": container with ID starting with 41fad9e35204fce9456525fe0df6305b93d8ea29c4aefa65205325b79be384ed not found: ID does not exist" Dec 02 19:43:58 crc kubenswrapper[4878]: I1202 19:43:58.961191 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0bc76d4-e2ae-4f06-8673-08bd5380b944" path="/var/lib/kubelet/pods/f0bc76d4-e2ae-4f06-8673-08bd5380b944/volumes" Dec 02 19:44:09 crc kubenswrapper[4878]: I1202 19:44:09.938564 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:44:09 crc kubenswrapper[4878]: E1202 19:44:09.939924 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:44:23 crc kubenswrapper[4878]: I1202 19:44:23.938759 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:44:23 crc kubenswrapper[4878]: E1202 19:44:23.940204 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:44:35 crc kubenswrapper[4878]: I1202 19:44:35.941284 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:44:35 crc kubenswrapper[4878]: E1202 19:44:35.943050 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:44:46 crc kubenswrapper[4878]: I1202 19:44:46.938916 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:44:46 crc kubenswrapper[4878]: E1202 19:44:46.940120 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:45:00 crc kubenswrapper[4878]: I1202 19:45:00.208410 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411745-tzmmd"] Dec 02 19:45:00 crc kubenswrapper[4878]: E1202 19:45:00.209973 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0bc76d4-e2ae-4f06-8673-08bd5380b944" containerName="extract-utilities" Dec 02 19:45:00 crc 
kubenswrapper[4878]: I1202 19:45:00.209998 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0bc76d4-e2ae-4f06-8673-08bd5380b944" containerName="extract-utilities" Dec 02 19:45:00 crc kubenswrapper[4878]: E1202 19:45:00.210014 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0bc76d4-e2ae-4f06-8673-08bd5380b944" containerName="extract-content" Dec 02 19:45:00 crc kubenswrapper[4878]: I1202 19:45:00.210026 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0bc76d4-e2ae-4f06-8673-08bd5380b944" containerName="extract-content" Dec 02 19:45:00 crc kubenswrapper[4878]: E1202 19:45:00.210106 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0bc76d4-e2ae-4f06-8673-08bd5380b944" containerName="registry-server" Dec 02 19:45:00 crc kubenswrapper[4878]: I1202 19:45:00.210115 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0bc76d4-e2ae-4f06-8673-08bd5380b944" containerName="registry-server" Dec 02 19:45:00 crc kubenswrapper[4878]: I1202 19:45:00.210466 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0bc76d4-e2ae-4f06-8673-08bd5380b944" containerName="registry-server" Dec 02 19:45:00 crc kubenswrapper[4878]: I1202 19:45:00.211610 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411745-tzmmd" Dec 02 19:45:00 crc kubenswrapper[4878]: I1202 19:45:00.219013 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 19:45:00 crc kubenswrapper[4878]: I1202 19:45:00.219648 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 19:45:00 crc kubenswrapper[4878]: I1202 19:45:00.227779 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411745-tzmmd"] Dec 02 19:45:00 crc kubenswrapper[4878]: I1202 19:45:00.317048 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js4vt\" (UniqueName: \"kubernetes.io/projected/7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b-kube-api-access-js4vt\") pod \"collect-profiles-29411745-tzmmd\" (UID: \"7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411745-tzmmd" Dec 02 19:45:00 crc kubenswrapper[4878]: I1202 19:45:00.317307 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b-config-volume\") pod \"collect-profiles-29411745-tzmmd\" (UID: \"7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411745-tzmmd" Dec 02 19:45:00 crc kubenswrapper[4878]: I1202 19:45:00.317864 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b-secret-volume\") pod \"collect-profiles-29411745-tzmmd\" (UID: \"7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411745-tzmmd" Dec 02 19:45:00 crc kubenswrapper[4878]: I1202 19:45:00.419795 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b-config-volume\") pod \"collect-profiles-29411745-tzmmd\" (UID: \"7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411745-tzmmd" Dec 02 19:45:00 crc kubenswrapper[4878]: I1202 19:45:00.419989 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b-secret-volume\") pod \"collect-profiles-29411745-tzmmd\" (UID: \"7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411745-tzmmd" Dec 02 19:45:00 crc kubenswrapper[4878]: I1202 19:45:00.420034 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js4vt\" (UniqueName: \"kubernetes.io/projected/7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b-kube-api-access-js4vt\") pod \"collect-profiles-29411745-tzmmd\" (UID: \"7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411745-tzmmd" Dec 02 19:45:00 crc kubenswrapper[4878]: I1202 19:45:00.421739 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b-config-volume\") pod \"collect-profiles-29411745-tzmmd\" (UID: \"7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411745-tzmmd" Dec 02 19:45:00 crc kubenswrapper[4878]: I1202 19:45:00.425925 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b-secret-volume\") pod \"collect-profiles-29411745-tzmmd\" (UID: \"7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411745-tzmmd" Dec 02 19:45:00 crc kubenswrapper[4878]: I1202 19:45:00.441634 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js4vt\" (UniqueName: \"kubernetes.io/projected/7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b-kube-api-access-js4vt\") pod \"collect-profiles-29411745-tzmmd\" (UID: \"7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411745-tzmmd" Dec 02 19:45:00 crc kubenswrapper[4878]: I1202 19:45:00.548412 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411745-tzmmd" Dec 02 19:45:01 crc kubenswrapper[4878]: I1202 19:45:01.008839 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411745-tzmmd"] Dec 02 19:45:01 crc kubenswrapper[4878]: W1202 19:45:01.014867 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b721c98_02ff_4ce6_8b0b_c3d1b0463c9b.slice/crio-77dfbd950fa547cf744e980ec1af55971fcd19cd41136b04446959c1be804185 WatchSource:0}: Error finding container 77dfbd950fa547cf744e980ec1af55971fcd19cd41136b04446959c1be804185: Status 404 returned error can't find the container with id 77dfbd950fa547cf744e980ec1af55971fcd19cd41136b04446959c1be804185 Dec 02 19:45:01 crc kubenswrapper[4878]: I1202 19:45:01.179608 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411745-tzmmd" event={"ID":"7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b","Type":"ContainerStarted","Data":"77dfbd950fa547cf744e980ec1af55971fcd19cd41136b04446959c1be804185"} Dec 02 19:45:01 crc 
kubenswrapper[4878]: I1202 19:45:01.938950 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:45:01 crc kubenswrapper[4878]: E1202 19:45:01.939717 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:45:02 crc kubenswrapper[4878]: I1202 19:45:02.192381 4878 generic.go:334] "Generic (PLEG): container finished" podID="7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b" containerID="0c2fbb191fc545fc60a5ce5906d129aaf98eaaeac8e7c1971c4f5785ff8bbade" exitCode=0 Dec 02 19:45:02 crc kubenswrapper[4878]: I1202 19:45:02.192431 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411745-tzmmd" event={"ID":"7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b","Type":"ContainerDied","Data":"0c2fbb191fc545fc60a5ce5906d129aaf98eaaeac8e7c1971c4f5785ff8bbade"} Dec 02 19:45:03 crc kubenswrapper[4878]: I1202 19:45:03.678616 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411745-tzmmd" Dec 02 19:45:03 crc kubenswrapper[4878]: I1202 19:45:03.805882 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js4vt\" (UniqueName: \"kubernetes.io/projected/7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b-kube-api-access-js4vt\") pod \"7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b\" (UID: \"7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b\") " Dec 02 19:45:03 crc kubenswrapper[4878]: I1202 19:45:03.806138 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b-secret-volume\") pod \"7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b\" (UID: \"7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b\") " Dec 02 19:45:03 crc kubenswrapper[4878]: I1202 19:45:03.807060 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b-config-volume\") pod \"7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b\" (UID: \"7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b\") " Dec 02 19:45:03 crc kubenswrapper[4878]: I1202 19:45:03.807645 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b-config-volume" (OuterVolumeSpecName: "config-volume") pod "7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b" (UID: "7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:45:03 crc kubenswrapper[4878]: I1202 19:45:03.808090 4878 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 19:45:03 crc kubenswrapper[4878]: I1202 19:45:03.812472 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b-kube-api-access-js4vt" (OuterVolumeSpecName: "kube-api-access-js4vt") pod "7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b" (UID: "7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b"). InnerVolumeSpecName "kube-api-access-js4vt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:45:03 crc kubenswrapper[4878]: I1202 19:45:03.812558 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b" (UID: "7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:45:03 crc kubenswrapper[4878]: I1202 19:45:03.910911 4878 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 19:45:03 crc kubenswrapper[4878]: I1202 19:45:03.910952 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js4vt\" (UniqueName: \"kubernetes.io/projected/7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b-kube-api-access-js4vt\") on node \"crc\" DevicePath \"\"" Dec 02 19:45:04 crc kubenswrapper[4878]: I1202 19:45:04.218026 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411745-tzmmd" event={"ID":"7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b","Type":"ContainerDied","Data":"77dfbd950fa547cf744e980ec1af55971fcd19cd41136b04446959c1be804185"} Dec 02 19:45:04 crc kubenswrapper[4878]: I1202 19:45:04.218079 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77dfbd950fa547cf744e980ec1af55971fcd19cd41136b04446959c1be804185" Dec 02 19:45:04 crc kubenswrapper[4878]: I1202 19:45:04.218100 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411745-tzmmd" Dec 02 19:45:04 crc kubenswrapper[4878]: I1202 19:45:04.767601 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd"] Dec 02 19:45:04 crc kubenswrapper[4878]: I1202 19:45:04.786677 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411700-swkbd"] Dec 02 19:45:04 crc kubenswrapper[4878]: I1202 19:45:04.965744 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97d1d37d-a8b0-470c-b89a-4873b5253f38" path="/var/lib/kubelet/pods/97d1d37d-a8b0-470c-b89a-4873b5253f38/volumes" Dec 02 19:45:06 crc kubenswrapper[4878]: I1202 19:45:06.170677 4878 scope.go:117] "RemoveContainer" containerID="d5b5533a2c783737e8b175228613fd7c6b4bfca6360f7ee5b9e1dbed487762eb" Dec 02 19:45:14 crc kubenswrapper[4878]: I1202 19:45:14.939795 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:45:14 crc kubenswrapper[4878]: E1202 19:45:14.940846 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:45:25 crc kubenswrapper[4878]: I1202 19:45:25.938213 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:45:25 crc kubenswrapper[4878]: E1202 19:45:25.939508 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:45:38 crc kubenswrapper[4878]: I1202 19:45:38.938479 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:45:38 crc kubenswrapper[4878]: E1202 19:45:38.939143 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:45:50 crc kubenswrapper[4878]: I1202 19:45:50.948118 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:45:50 crc kubenswrapper[4878]: E1202 19:45:50.949679 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:46:04 crc kubenswrapper[4878]: I1202 19:46:04.939582 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:46:04 crc kubenswrapper[4878]: E1202 19:46:04.940522 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:46:15 crc kubenswrapper[4878]: I1202 19:46:15.938847 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:46:15 crc kubenswrapper[4878]: E1202 19:46:15.939911 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:46:29 crc kubenswrapper[4878]: I1202 19:46:29.937765 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:46:29 crc kubenswrapper[4878]: E1202 19:46:29.938734 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:46:41 crc kubenswrapper[4878]: I1202 19:46:41.938039 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:46:41 crc kubenswrapper[4878]: E1202 19:46:41.939001 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:46:56 crc kubenswrapper[4878]: I1202 19:46:56.939095 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:46:56 crc kubenswrapper[4878]: E1202 19:46:56.940007 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:47:08 crc kubenswrapper[4878]: I1202 19:47:08.938520 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:47:08 crc kubenswrapper[4878]: E1202 19:47:08.939546 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:47:19 crc kubenswrapper[4878]: I1202 19:47:19.938806 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:47:19 crc kubenswrapper[4878]: E1202 19:47:19.940362 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:47:34 crc kubenswrapper[4878]: I1202 19:47:34.938298 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:47:34 crc kubenswrapper[4878]: E1202 19:47:34.939115 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:47:46 crc kubenswrapper[4878]: I1202 19:47:46.943685 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:47:46 crc kubenswrapper[4878]: E1202 19:47:46.945146 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:48:00 crc kubenswrapper[4878]: I1202 19:48:00.956493 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:48:01 crc kubenswrapper[4878]: I1202 19:48:01.527819 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"fb932a3ddd875b5f5a1d387638664611ad7b68fb4301c89bc7ca42c4dbff3eda"} Dec 02 19:50:23 crc kubenswrapper[4878]: I1202 19:50:23.742583 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:50:23 crc kubenswrapper[4878]: I1202 19:50:23.743753 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:50:53 crc kubenswrapper[4878]: I1202 19:50:53.742210 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:50:53 crc kubenswrapper[4878]: I1202 19:50:53.744492 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:51:17 crc kubenswrapper[4878]: I1202 19:51:17.359146 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-drgmn"] Dec 02 19:51:17 crc kubenswrapper[4878]: E1202 19:51:17.360355 4878 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b" containerName="collect-profiles" Dec 02 19:51:17 crc kubenswrapper[4878]: I1202 19:51:17.360377 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b" containerName="collect-profiles" Dec 02 19:51:17 crc kubenswrapper[4878]: I1202 19:51:17.360659 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b721c98-02ff-4ce6-8b0b-c3d1b0463c9b" containerName="collect-profiles" Dec 02 19:51:17 crc kubenswrapper[4878]: I1202 19:51:17.363737 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-drgmn" Dec 02 19:51:17 crc kubenswrapper[4878]: I1202 19:51:17.374764 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-drgmn"] Dec 02 19:51:17 crc kubenswrapper[4878]: I1202 19:51:17.496566 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21dd01e0-2208-4146-9ae1-8270c28cb783-utilities\") pod \"redhat-operators-drgmn\" (UID: \"21dd01e0-2208-4146-9ae1-8270c28cb783\") " pod="openshift-marketplace/redhat-operators-drgmn" Dec 02 19:51:17 crc kubenswrapper[4878]: I1202 19:51:17.496873 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxx76\" (UniqueName: \"kubernetes.io/projected/21dd01e0-2208-4146-9ae1-8270c28cb783-kube-api-access-lxx76\") pod \"redhat-operators-drgmn\" (UID: \"21dd01e0-2208-4146-9ae1-8270c28cb783\") " pod="openshift-marketplace/redhat-operators-drgmn" Dec 02 19:51:17 crc kubenswrapper[4878]: I1202 19:51:17.497211 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21dd01e0-2208-4146-9ae1-8270c28cb783-catalog-content\") pod \"redhat-operators-drgmn\" (UID: 
\"21dd01e0-2208-4146-9ae1-8270c28cb783\") " pod="openshift-marketplace/redhat-operators-drgmn" Dec 02 19:51:17 crc kubenswrapper[4878]: I1202 19:51:17.600251 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxx76\" (UniqueName: \"kubernetes.io/projected/21dd01e0-2208-4146-9ae1-8270c28cb783-kube-api-access-lxx76\") pod \"redhat-operators-drgmn\" (UID: \"21dd01e0-2208-4146-9ae1-8270c28cb783\") " pod="openshift-marketplace/redhat-operators-drgmn" Dec 02 19:51:17 crc kubenswrapper[4878]: I1202 19:51:17.600899 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21dd01e0-2208-4146-9ae1-8270c28cb783-catalog-content\") pod \"redhat-operators-drgmn\" (UID: \"21dd01e0-2208-4146-9ae1-8270c28cb783\") " pod="openshift-marketplace/redhat-operators-drgmn" Dec 02 19:51:17 crc kubenswrapper[4878]: I1202 19:51:17.601107 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21dd01e0-2208-4146-9ae1-8270c28cb783-utilities\") pod \"redhat-operators-drgmn\" (UID: \"21dd01e0-2208-4146-9ae1-8270c28cb783\") " pod="openshift-marketplace/redhat-operators-drgmn" Dec 02 19:51:17 crc kubenswrapper[4878]: I1202 19:51:17.601753 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21dd01e0-2208-4146-9ae1-8270c28cb783-utilities\") pod \"redhat-operators-drgmn\" (UID: \"21dd01e0-2208-4146-9ae1-8270c28cb783\") " pod="openshift-marketplace/redhat-operators-drgmn" Dec 02 19:51:17 crc kubenswrapper[4878]: I1202 19:51:17.602078 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21dd01e0-2208-4146-9ae1-8270c28cb783-catalog-content\") pod \"redhat-operators-drgmn\" (UID: \"21dd01e0-2208-4146-9ae1-8270c28cb783\") " 
pod="openshift-marketplace/redhat-operators-drgmn" Dec 02 19:51:17 crc kubenswrapper[4878]: I1202 19:51:17.619078 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxx76\" (UniqueName: \"kubernetes.io/projected/21dd01e0-2208-4146-9ae1-8270c28cb783-kube-api-access-lxx76\") pod \"redhat-operators-drgmn\" (UID: \"21dd01e0-2208-4146-9ae1-8270c28cb783\") " pod="openshift-marketplace/redhat-operators-drgmn" Dec 02 19:51:17 crc kubenswrapper[4878]: I1202 19:51:17.700790 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-drgmn" Dec 02 19:51:18 crc kubenswrapper[4878]: I1202 19:51:18.231493 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-drgmn"] Dec 02 19:51:18 crc kubenswrapper[4878]: I1202 19:51:18.390651 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drgmn" event={"ID":"21dd01e0-2208-4146-9ae1-8270c28cb783","Type":"ContainerStarted","Data":"31b0abdb81fefc5eddfbfca343e0dd9b691c1c229fd669907b310858b6c942cf"} Dec 02 19:51:19 crc kubenswrapper[4878]: I1202 19:51:19.407602 4878 generic.go:334] "Generic (PLEG): container finished" podID="21dd01e0-2208-4146-9ae1-8270c28cb783" containerID="e1f3f19a8882b3a7e9445a9682593fec8238a7572a22f87f8a7b232c12bc8379" exitCode=0 Dec 02 19:51:19 crc kubenswrapper[4878]: I1202 19:51:19.407681 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drgmn" event={"ID":"21dd01e0-2208-4146-9ae1-8270c28cb783","Type":"ContainerDied","Data":"e1f3f19a8882b3a7e9445a9682593fec8238a7572a22f87f8a7b232c12bc8379"} Dec 02 19:51:19 crc kubenswrapper[4878]: I1202 19:51:19.411126 4878 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 19:51:21 crc kubenswrapper[4878]: I1202 19:51:21.509793 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-drgmn" event={"ID":"21dd01e0-2208-4146-9ae1-8270c28cb783","Type":"ContainerStarted","Data":"1fcc5c49bcc65e348aaa5626a97982d953a54361d6d4db0280a3a893ab155fa7"} Dec 02 19:51:23 crc kubenswrapper[4878]: I1202 19:51:23.742732 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:51:23 crc kubenswrapper[4878]: I1202 19:51:23.743179 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:51:23 crc kubenswrapper[4878]: I1202 19:51:23.743219 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 19:51:23 crc kubenswrapper[4878]: I1202 19:51:23.744135 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fb932a3ddd875b5f5a1d387638664611ad7b68fb4301c89bc7ca42c4dbff3eda"} pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 19:51:23 crc kubenswrapper[4878]: I1202 19:51:23.744186 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://fb932a3ddd875b5f5a1d387638664611ad7b68fb4301c89bc7ca42c4dbff3eda" gracePeriod=600 Dec 02 19:51:24 
crc kubenswrapper[4878]: I1202 19:51:24.546819 4878 generic.go:334] "Generic (PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" containerID="fb932a3ddd875b5f5a1d387638664611ad7b68fb4301c89bc7ca42c4dbff3eda" exitCode=0 Dec 02 19:51:24 crc kubenswrapper[4878]: I1202 19:51:24.546899 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"fb932a3ddd875b5f5a1d387638664611ad7b68fb4301c89bc7ca42c4dbff3eda"} Dec 02 19:51:24 crc kubenswrapper[4878]: I1202 19:51:24.547208 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25"} Dec 02 19:51:24 crc kubenswrapper[4878]: I1202 19:51:24.547260 4878 scope.go:117] "RemoveContainer" containerID="22c519ce667408c904edb1355b6a89191f1c3bcb30cc3411b93937dbf6636e12" Dec 02 19:51:24 crc kubenswrapper[4878]: I1202 19:51:24.550348 4878 generic.go:334] "Generic (PLEG): container finished" podID="21dd01e0-2208-4146-9ae1-8270c28cb783" containerID="1fcc5c49bcc65e348aaa5626a97982d953a54361d6d4db0280a3a893ab155fa7" exitCode=0 Dec 02 19:51:24 crc kubenswrapper[4878]: I1202 19:51:24.550392 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drgmn" event={"ID":"21dd01e0-2208-4146-9ae1-8270c28cb783","Type":"ContainerDied","Data":"1fcc5c49bcc65e348aaa5626a97982d953a54361d6d4db0280a3a893ab155fa7"} Dec 02 19:51:25 crc kubenswrapper[4878]: I1202 19:51:25.563419 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drgmn" event={"ID":"21dd01e0-2208-4146-9ae1-8270c28cb783","Type":"ContainerStarted","Data":"de557d4eb9e23989e1b947dffe8d8d677fdf3de3f9dd2015d169593d9893678e"} Dec 02 
19:51:25 crc kubenswrapper[4878]: I1202 19:51:25.598716 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-drgmn" podStartSLOduration=3.06111824 podStartE2EDuration="8.598697798s" podCreationTimestamp="2025-12-02 19:51:17 +0000 UTC" firstStartedPulling="2025-12-02 19:51:19.410862849 +0000 UTC m=+5789.100481740" lastFinishedPulling="2025-12-02 19:51:24.948442417 +0000 UTC m=+5794.638061298" observedRunningTime="2025-12-02 19:51:25.587589901 +0000 UTC m=+5795.277208792" watchObservedRunningTime="2025-12-02 19:51:25.598697798 +0000 UTC m=+5795.288316679" Dec 02 19:51:27 crc kubenswrapper[4878]: I1202 19:51:27.701356 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-drgmn" Dec 02 19:51:27 crc kubenswrapper[4878]: I1202 19:51:27.701994 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-drgmn" Dec 02 19:51:28 crc kubenswrapper[4878]: I1202 19:51:28.763261 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-drgmn" podUID="21dd01e0-2208-4146-9ae1-8270c28cb783" containerName="registry-server" probeResult="failure" output=< Dec 02 19:51:28 crc kubenswrapper[4878]: timeout: failed to connect service ":50051" within 1s Dec 02 19:51:28 crc kubenswrapper[4878]: > Dec 02 19:51:37 crc kubenswrapper[4878]: I1202 19:51:37.751499 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-drgmn" Dec 02 19:51:37 crc kubenswrapper[4878]: I1202 19:51:37.791701 4878 generic.go:334] "Generic (PLEG): container finished" podID="e5a4096d-c2be-4987-b0eb-fb47da8a9703" containerID="107e695fc6966968c956106bdc95595e99604d836e9d7c2737f2f8933ee5b8dd" exitCode=0 Dec 02 19:51:37 crc kubenswrapper[4878]: I1202 19:51:37.791796 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tempest-tests-tempest" event={"ID":"e5a4096d-c2be-4987-b0eb-fb47da8a9703","Type":"ContainerDied","Data":"107e695fc6966968c956106bdc95595e99604d836e9d7c2737f2f8933ee5b8dd"} Dec 02 19:51:37 crc kubenswrapper[4878]: I1202 19:51:37.808979 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-drgmn" Dec 02 19:51:37 crc kubenswrapper[4878]: I1202 19:51:37.993899 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-drgmn"] Dec 02 19:51:38 crc kubenswrapper[4878]: I1202 19:51:38.808457 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-drgmn" podUID="21dd01e0-2208-4146-9ae1-8270c28cb783" containerName="registry-server" containerID="cri-o://de557d4eb9e23989e1b947dffe8d8d677fdf3de3f9dd2015d169593d9893678e" gracePeriod=2 Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.380062 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.392030 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-drgmn" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.486442 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e5a4096d-c2be-4987-b0eb-fb47da8a9703-ssh-key\") pod \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.486573 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e5a4096d-c2be-4987-b0eb-fb47da8a9703-test-operator-ephemeral-workdir\") pod \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.486675 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e5a4096d-c2be-4987-b0eb-fb47da8a9703-openstack-config\") pod \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.487212 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e5a4096d-c2be-4987-b0eb-fb47da8a9703-ca-certs\") pod \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.487429 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.488161 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e5a4096d-c2be-4987-b0eb-fb47da8a9703-test-operator-ephemeral-temporary\") pod \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.488219 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxx76\" (UniqueName: \"kubernetes.io/projected/21dd01e0-2208-4146-9ae1-8270c28cb783-kube-api-access-lxx76\") pod \"21dd01e0-2208-4146-9ae1-8270c28cb783\" (UID: \"21dd01e0-2208-4146-9ae1-8270c28cb783\") " Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.488266 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e5a4096d-c2be-4987-b0eb-fb47da8a9703-openstack-config-secret\") pod \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.488297 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5a4096d-c2be-4987-b0eb-fb47da8a9703-config-data\") pod \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.488327 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbwfv\" (UniqueName: \"kubernetes.io/projected/e5a4096d-c2be-4987-b0eb-fb47da8a9703-kube-api-access-rbwfv\") pod \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\" (UID: \"e5a4096d-c2be-4987-b0eb-fb47da8a9703\") " Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.489032 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5a4096d-c2be-4987-b0eb-fb47da8a9703-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod 
"e5a4096d-c2be-4987-b0eb-fb47da8a9703" (UID: "e5a4096d-c2be-4987-b0eb-fb47da8a9703"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.490019 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5a4096d-c2be-4987-b0eb-fb47da8a9703-config-data" (OuterVolumeSpecName: "config-data") pod "e5a4096d-c2be-4987-b0eb-fb47da8a9703" (UID: "e5a4096d-c2be-4987-b0eb-fb47da8a9703"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.494529 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "test-operator-logs") pod "e5a4096d-c2be-4987-b0eb-fb47da8a9703" (UID: "e5a4096d-c2be-4987-b0eb-fb47da8a9703"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.495877 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21dd01e0-2208-4146-9ae1-8270c28cb783-kube-api-access-lxx76" (OuterVolumeSpecName: "kube-api-access-lxx76") pod "21dd01e0-2208-4146-9ae1-8270c28cb783" (UID: "21dd01e0-2208-4146-9ae1-8270c28cb783"). InnerVolumeSpecName "kube-api-access-lxx76". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.499157 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5a4096d-c2be-4987-b0eb-fb47da8a9703-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "e5a4096d-c2be-4987-b0eb-fb47da8a9703" (UID: "e5a4096d-c2be-4987-b0eb-fb47da8a9703"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.500280 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a4096d-c2be-4987-b0eb-fb47da8a9703-kube-api-access-rbwfv" (OuterVolumeSpecName: "kube-api-access-rbwfv") pod "e5a4096d-c2be-4987-b0eb-fb47da8a9703" (UID: "e5a4096d-c2be-4987-b0eb-fb47da8a9703"). InnerVolumeSpecName "kube-api-access-rbwfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.524754 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a4096d-c2be-4987-b0eb-fb47da8a9703-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e5a4096d-c2be-4987-b0eb-fb47da8a9703" (UID: "e5a4096d-c2be-4987-b0eb-fb47da8a9703"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.531544 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a4096d-c2be-4987-b0eb-fb47da8a9703-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "e5a4096d-c2be-4987-b0eb-fb47da8a9703" (UID: "e5a4096d-c2be-4987-b0eb-fb47da8a9703"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.547522 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a4096d-c2be-4987-b0eb-fb47da8a9703-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e5a4096d-c2be-4987-b0eb-fb47da8a9703" (UID: "e5a4096d-c2be-4987-b0eb-fb47da8a9703"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.575341 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5a4096d-c2be-4987-b0eb-fb47da8a9703-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e5a4096d-c2be-4987-b0eb-fb47da8a9703" (UID: "e5a4096d-c2be-4987-b0eb-fb47da8a9703"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.590359 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21dd01e0-2208-4146-9ae1-8270c28cb783-utilities\") pod \"21dd01e0-2208-4146-9ae1-8270c28cb783\" (UID: \"21dd01e0-2208-4146-9ae1-8270c28cb783\") " Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.590454 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21dd01e0-2208-4146-9ae1-8270c28cb783-catalog-content\") pod \"21dd01e0-2208-4146-9ae1-8270c28cb783\" (UID: \"21dd01e0-2208-4146-9ae1-8270c28cb783\") " Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.591033 4878 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e5a4096d-c2be-4987-b0eb-fb47da8a9703-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.591051 4878 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e5a4096d-c2be-4987-b0eb-fb47da8a9703-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.591317 4878 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 02 19:51:39 crc 
kubenswrapper[4878]: I1202 19:51:39.591338 4878 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e5a4096d-c2be-4987-b0eb-fb47da8a9703-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.591351 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxx76\" (UniqueName: \"kubernetes.io/projected/21dd01e0-2208-4146-9ae1-8270c28cb783-kube-api-access-lxx76\") on node \"crc\" DevicePath \"\"" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.591360 4878 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e5a4096d-c2be-4987-b0eb-fb47da8a9703-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.591368 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5a4096d-c2be-4987-b0eb-fb47da8a9703-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.591376 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbwfv\" (UniqueName: \"kubernetes.io/projected/e5a4096d-c2be-4987-b0eb-fb47da8a9703-kube-api-access-rbwfv\") on node \"crc\" DevicePath \"\"" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.591383 4878 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e5a4096d-c2be-4987-b0eb-fb47da8a9703-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.591395 4878 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e5a4096d-c2be-4987-b0eb-fb47da8a9703-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 
19:51:39.591769 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21dd01e0-2208-4146-9ae1-8270c28cb783-utilities" (OuterVolumeSpecName: "utilities") pod "21dd01e0-2208-4146-9ae1-8270c28cb783" (UID: "21dd01e0-2208-4146-9ae1-8270c28cb783"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.617867 4878 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.686547 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21dd01e0-2208-4146-9ae1-8270c28cb783-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21dd01e0-2208-4146-9ae1-8270c28cb783" (UID: "21dd01e0-2208-4146-9ae1-8270c28cb783"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.694103 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21dd01e0-2208-4146-9ae1-8270c28cb783-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.694158 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21dd01e0-2208-4146-9ae1-8270c28cb783-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.694173 4878 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.828807 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.828829 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e5a4096d-c2be-4987-b0eb-fb47da8a9703","Type":"ContainerDied","Data":"94720cc681eb1f953476fef8ed625a6e7a0ec4d5008748a4ce9fd232177347d1"} Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.828904 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94720cc681eb1f953476fef8ed625a6e7a0ec4d5008748a4ce9fd232177347d1" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.833159 4878 generic.go:334] "Generic (PLEG): container finished" podID="21dd01e0-2208-4146-9ae1-8270c28cb783" containerID="de557d4eb9e23989e1b947dffe8d8d677fdf3de3f9dd2015d169593d9893678e" exitCode=0 Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.833200 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-drgmn" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.833249 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drgmn" event={"ID":"21dd01e0-2208-4146-9ae1-8270c28cb783","Type":"ContainerDied","Data":"de557d4eb9e23989e1b947dffe8d8d677fdf3de3f9dd2015d169593d9893678e"} Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.833291 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drgmn" event={"ID":"21dd01e0-2208-4146-9ae1-8270c28cb783","Type":"ContainerDied","Data":"31b0abdb81fefc5eddfbfca343e0dd9b691c1c229fd669907b310858b6c942cf"} Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.833312 4878 scope.go:117] "RemoveContainer" containerID="de557d4eb9e23989e1b947dffe8d8d677fdf3de3f9dd2015d169593d9893678e" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.866603 4878 scope.go:117] "RemoveContainer" 
containerID="1fcc5c49bcc65e348aaa5626a97982d953a54361d6d4db0280a3a893ab155fa7" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.894450 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-drgmn"] Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.904853 4878 scope.go:117] "RemoveContainer" containerID="e1f3f19a8882b3a7e9445a9682593fec8238a7572a22f87f8a7b232c12bc8379" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.904857 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-drgmn"] Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.923274 4878 scope.go:117] "RemoveContainer" containerID="de557d4eb9e23989e1b947dffe8d8d677fdf3de3f9dd2015d169593d9893678e" Dec 02 19:51:39 crc kubenswrapper[4878]: E1202 19:51:39.923640 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de557d4eb9e23989e1b947dffe8d8d677fdf3de3f9dd2015d169593d9893678e\": container with ID starting with de557d4eb9e23989e1b947dffe8d8d677fdf3de3f9dd2015d169593d9893678e not found: ID does not exist" containerID="de557d4eb9e23989e1b947dffe8d8d677fdf3de3f9dd2015d169593d9893678e" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.923674 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de557d4eb9e23989e1b947dffe8d8d677fdf3de3f9dd2015d169593d9893678e"} err="failed to get container status \"de557d4eb9e23989e1b947dffe8d8d677fdf3de3f9dd2015d169593d9893678e\": rpc error: code = NotFound desc = could not find container \"de557d4eb9e23989e1b947dffe8d8d677fdf3de3f9dd2015d169593d9893678e\": container with ID starting with de557d4eb9e23989e1b947dffe8d8d677fdf3de3f9dd2015d169593d9893678e not found: ID does not exist" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.923696 4878 scope.go:117] "RemoveContainer" 
containerID="1fcc5c49bcc65e348aaa5626a97982d953a54361d6d4db0280a3a893ab155fa7" Dec 02 19:51:39 crc kubenswrapper[4878]: E1202 19:51:39.923912 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fcc5c49bcc65e348aaa5626a97982d953a54361d6d4db0280a3a893ab155fa7\": container with ID starting with 1fcc5c49bcc65e348aaa5626a97982d953a54361d6d4db0280a3a893ab155fa7 not found: ID does not exist" containerID="1fcc5c49bcc65e348aaa5626a97982d953a54361d6d4db0280a3a893ab155fa7" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.923933 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fcc5c49bcc65e348aaa5626a97982d953a54361d6d4db0280a3a893ab155fa7"} err="failed to get container status \"1fcc5c49bcc65e348aaa5626a97982d953a54361d6d4db0280a3a893ab155fa7\": rpc error: code = NotFound desc = could not find container \"1fcc5c49bcc65e348aaa5626a97982d953a54361d6d4db0280a3a893ab155fa7\": container with ID starting with 1fcc5c49bcc65e348aaa5626a97982d953a54361d6d4db0280a3a893ab155fa7 not found: ID does not exist" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.923947 4878 scope.go:117] "RemoveContainer" containerID="e1f3f19a8882b3a7e9445a9682593fec8238a7572a22f87f8a7b232c12bc8379" Dec 02 19:51:39 crc kubenswrapper[4878]: E1202 19:51:39.924153 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1f3f19a8882b3a7e9445a9682593fec8238a7572a22f87f8a7b232c12bc8379\": container with ID starting with e1f3f19a8882b3a7e9445a9682593fec8238a7572a22f87f8a7b232c12bc8379 not found: ID does not exist" containerID="e1f3f19a8882b3a7e9445a9682593fec8238a7572a22f87f8a7b232c12bc8379" Dec 02 19:51:39 crc kubenswrapper[4878]: I1202 19:51:39.924177 4878 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e1f3f19a8882b3a7e9445a9682593fec8238a7572a22f87f8a7b232c12bc8379"} err="failed to get container status \"e1f3f19a8882b3a7e9445a9682593fec8238a7572a22f87f8a7b232c12bc8379\": rpc error: code = NotFound desc = could not find container \"e1f3f19a8882b3a7e9445a9682593fec8238a7572a22f87f8a7b232c12bc8379\": container with ID starting with e1f3f19a8882b3a7e9445a9682593fec8238a7572a22f87f8a7b232c12bc8379 not found: ID does not exist" Dec 02 19:51:40 crc kubenswrapper[4878]: I1202 19:51:40.956873 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21dd01e0-2208-4146-9ae1-8270c28cb783" path="/var/lib/kubelet/pods/21dd01e0-2208-4146-9ae1-8270c28cb783/volumes" Dec 02 19:51:46 crc kubenswrapper[4878]: I1202 19:51:46.917127 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 02 19:51:46 crc kubenswrapper[4878]: E1202 19:51:46.919532 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a4096d-c2be-4987-b0eb-fb47da8a9703" containerName="tempest-tests-tempest-tests-runner" Dec 02 19:51:46 crc kubenswrapper[4878]: I1202 19:51:46.919654 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a4096d-c2be-4987-b0eb-fb47da8a9703" containerName="tempest-tests-tempest-tests-runner" Dec 02 19:51:46 crc kubenswrapper[4878]: E1202 19:51:46.919762 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21dd01e0-2208-4146-9ae1-8270c28cb783" containerName="registry-server" Dec 02 19:51:46 crc kubenswrapper[4878]: I1202 19:51:46.919846 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="21dd01e0-2208-4146-9ae1-8270c28cb783" containerName="registry-server" Dec 02 19:51:46 crc kubenswrapper[4878]: E1202 19:51:46.919925 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21dd01e0-2208-4146-9ae1-8270c28cb783" containerName="extract-content" Dec 02 19:51:46 crc kubenswrapper[4878]: I1202 19:51:46.920001 
4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="21dd01e0-2208-4146-9ae1-8270c28cb783" containerName="extract-content" Dec 02 19:51:46 crc kubenswrapper[4878]: E1202 19:51:46.920088 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21dd01e0-2208-4146-9ae1-8270c28cb783" containerName="extract-utilities" Dec 02 19:51:46 crc kubenswrapper[4878]: I1202 19:51:46.920210 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="21dd01e0-2208-4146-9ae1-8270c28cb783" containerName="extract-utilities" Dec 02 19:51:46 crc kubenswrapper[4878]: I1202 19:51:46.920697 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a4096d-c2be-4987-b0eb-fb47da8a9703" containerName="tempest-tests-tempest-tests-runner" Dec 02 19:51:46 crc kubenswrapper[4878]: I1202 19:51:46.920822 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="21dd01e0-2208-4146-9ae1-8270c28cb783" containerName="registry-server" Dec 02 19:51:46 crc kubenswrapper[4878]: I1202 19:51:46.922490 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 19:51:46 crc kubenswrapper[4878]: I1202 19:51:46.926791 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-vztps" Dec 02 19:51:46 crc kubenswrapper[4878]: I1202 19:51:46.934315 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 02 19:51:47 crc kubenswrapper[4878]: I1202 19:51:47.034931 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtz7q\" (UniqueName: \"kubernetes.io/projected/18ccf713-5eef-4b4b-b1ea-9f3b34639edb-kube-api-access-mtz7q\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"18ccf713-5eef-4b4b-b1ea-9f3b34639edb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 19:51:47 crc kubenswrapper[4878]: I1202 19:51:47.035224 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"18ccf713-5eef-4b4b-b1ea-9f3b34639edb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 19:51:47 crc kubenswrapper[4878]: I1202 19:51:47.136971 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtz7q\" (UniqueName: \"kubernetes.io/projected/18ccf713-5eef-4b4b-b1ea-9f3b34639edb-kube-api-access-mtz7q\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"18ccf713-5eef-4b4b-b1ea-9f3b34639edb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 19:51:47 crc kubenswrapper[4878]: I1202 19:51:47.137043 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"18ccf713-5eef-4b4b-b1ea-9f3b34639edb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 19:51:47 crc kubenswrapper[4878]: I1202 19:51:47.138861 4878 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"18ccf713-5eef-4b4b-b1ea-9f3b34639edb\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 19:51:47 crc kubenswrapper[4878]: I1202 19:51:47.159141 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtz7q\" (UniqueName: \"kubernetes.io/projected/18ccf713-5eef-4b4b-b1ea-9f3b34639edb-kube-api-access-mtz7q\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"18ccf713-5eef-4b4b-b1ea-9f3b34639edb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 19:51:47 crc kubenswrapper[4878]: I1202 19:51:47.186998 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"18ccf713-5eef-4b4b-b1ea-9f3b34639edb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 19:51:47 crc kubenswrapper[4878]: I1202 19:51:47.259939 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 19:51:47 crc kubenswrapper[4878]: I1202 19:51:47.739987 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 02 19:51:48 crc kubenswrapper[4878]: I1202 19:51:48.966375 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"18ccf713-5eef-4b4b-b1ea-9f3b34639edb","Type":"ContainerStarted","Data":"c86222af52028fc4922a9178bdcd05d515f0840df2bd82fdd84fb68d9f341f0b"} Dec 02 19:51:49 crc kubenswrapper[4878]: I1202 19:51:49.980761 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"18ccf713-5eef-4b4b-b1ea-9f3b34639edb","Type":"ContainerStarted","Data":"7ace6e7f8d55b3da8db6498a2e023a0c990155997b1e26b18db0f888aa1b350b"} Dec 02 19:51:50 crc kubenswrapper[4878]: I1202 19:51:50.000954 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=3.063504774 podStartE2EDuration="4.00093679s" podCreationTimestamp="2025-12-02 19:51:46 +0000 UTC" firstStartedPulling="2025-12-02 19:51:48.291340854 +0000 UTC m=+5817.980959735" lastFinishedPulling="2025-12-02 19:51:49.22877287 +0000 UTC m=+5818.918391751" observedRunningTime="2025-12-02 19:51:50.000525147 +0000 UTC m=+5819.690144038" watchObservedRunningTime="2025-12-02 19:51:50.00093679 +0000 UTC m=+5819.690555671" Dec 02 19:52:13 crc kubenswrapper[4878]: I1202 19:52:13.315346 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rbgfj"] Dec 02 19:52:13 crc kubenswrapper[4878]: I1202 19:52:13.333892 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbgfj" Dec 02 19:52:13 crc kubenswrapper[4878]: I1202 19:52:13.342465 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbgfj"] Dec 02 19:52:13 crc kubenswrapper[4878]: I1202 19:52:13.446543 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122dd60a-3b2c-4475-81b2-62db21a607c7-catalog-content\") pod \"redhat-marketplace-rbgfj\" (UID: \"122dd60a-3b2c-4475-81b2-62db21a607c7\") " pod="openshift-marketplace/redhat-marketplace-rbgfj" Dec 02 19:52:13 crc kubenswrapper[4878]: I1202 19:52:13.446660 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nk44\" (UniqueName: \"kubernetes.io/projected/122dd60a-3b2c-4475-81b2-62db21a607c7-kube-api-access-8nk44\") pod \"redhat-marketplace-rbgfj\" (UID: \"122dd60a-3b2c-4475-81b2-62db21a607c7\") " pod="openshift-marketplace/redhat-marketplace-rbgfj" Dec 02 19:52:13 crc kubenswrapper[4878]: I1202 19:52:13.446721 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122dd60a-3b2c-4475-81b2-62db21a607c7-utilities\") pod \"redhat-marketplace-rbgfj\" (UID: \"122dd60a-3b2c-4475-81b2-62db21a607c7\") " pod="openshift-marketplace/redhat-marketplace-rbgfj" Dec 02 19:52:13 crc kubenswrapper[4878]: I1202 19:52:13.549683 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122dd60a-3b2c-4475-81b2-62db21a607c7-catalog-content\") pod \"redhat-marketplace-rbgfj\" (UID: \"122dd60a-3b2c-4475-81b2-62db21a607c7\") " pod="openshift-marketplace/redhat-marketplace-rbgfj" Dec 02 19:52:13 crc kubenswrapper[4878]: I1202 19:52:13.549785 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8nk44\" (UniqueName: \"kubernetes.io/projected/122dd60a-3b2c-4475-81b2-62db21a607c7-kube-api-access-8nk44\") pod \"redhat-marketplace-rbgfj\" (UID: \"122dd60a-3b2c-4475-81b2-62db21a607c7\") " pod="openshift-marketplace/redhat-marketplace-rbgfj" Dec 02 19:52:13 crc kubenswrapper[4878]: I1202 19:52:13.549828 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122dd60a-3b2c-4475-81b2-62db21a607c7-utilities\") pod \"redhat-marketplace-rbgfj\" (UID: \"122dd60a-3b2c-4475-81b2-62db21a607c7\") " pod="openshift-marketplace/redhat-marketplace-rbgfj" Dec 02 19:52:13 crc kubenswrapper[4878]: I1202 19:52:13.550272 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122dd60a-3b2c-4475-81b2-62db21a607c7-catalog-content\") pod \"redhat-marketplace-rbgfj\" (UID: \"122dd60a-3b2c-4475-81b2-62db21a607c7\") " pod="openshift-marketplace/redhat-marketplace-rbgfj" Dec 02 19:52:13 crc kubenswrapper[4878]: I1202 19:52:13.550272 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122dd60a-3b2c-4475-81b2-62db21a607c7-utilities\") pod \"redhat-marketplace-rbgfj\" (UID: \"122dd60a-3b2c-4475-81b2-62db21a607c7\") " pod="openshift-marketplace/redhat-marketplace-rbgfj" Dec 02 19:52:13 crc kubenswrapper[4878]: I1202 19:52:13.968625 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nk44\" (UniqueName: \"kubernetes.io/projected/122dd60a-3b2c-4475-81b2-62db21a607c7-kube-api-access-8nk44\") pod \"redhat-marketplace-rbgfj\" (UID: \"122dd60a-3b2c-4475-81b2-62db21a607c7\") " pod="openshift-marketplace/redhat-marketplace-rbgfj" Dec 02 19:52:13 crc kubenswrapper[4878]: I1202 19:52:13.986890 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbgfj" Dec 02 19:52:14 crc kubenswrapper[4878]: I1202 19:52:14.485756 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbgfj"] Dec 02 19:52:14 crc kubenswrapper[4878]: W1202 19:52:14.488651 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod122dd60a_3b2c_4475_81b2_62db21a607c7.slice/crio-069a06dc3055c163751e5cf9af75eb2c0ef09bc98c232fafc43f2cb327dd5146 WatchSource:0}: Error finding container 069a06dc3055c163751e5cf9af75eb2c0ef09bc98c232fafc43f2cb327dd5146: Status 404 returned error can't find the container with id 069a06dc3055c163751e5cf9af75eb2c0ef09bc98c232fafc43f2cb327dd5146 Dec 02 19:52:15 crc kubenswrapper[4878]: I1202 19:52:15.290002 4878 generic.go:334] "Generic (PLEG): container finished" podID="122dd60a-3b2c-4475-81b2-62db21a607c7" containerID="b8317aa00f9678b224b5c6f8b66b22d224e4f96ef6abba5512d70191ee546306" exitCode=0 Dec 02 19:52:15 crc kubenswrapper[4878]: I1202 19:52:15.290498 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbgfj" event={"ID":"122dd60a-3b2c-4475-81b2-62db21a607c7","Type":"ContainerDied","Data":"b8317aa00f9678b224b5c6f8b66b22d224e4f96ef6abba5512d70191ee546306"} Dec 02 19:52:15 crc kubenswrapper[4878]: I1202 19:52:15.290553 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbgfj" event={"ID":"122dd60a-3b2c-4475-81b2-62db21a607c7","Type":"ContainerStarted","Data":"069a06dc3055c163751e5cf9af75eb2c0ef09bc98c232fafc43f2cb327dd5146"} Dec 02 19:52:16 crc kubenswrapper[4878]: E1202 19:52:16.970665 4878 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod122dd60a_3b2c_4475_81b2_62db21a607c7.slice/crio-conmon-e67c83d8395aa832801653478cd0b8ede578ab399df0e7eac1e2067fad757761.scope\": RecentStats: unable to find data in memory cache]" Dec 02 19:52:17 crc kubenswrapper[4878]: I1202 19:52:17.317196 4878 generic.go:334] "Generic (PLEG): container finished" podID="122dd60a-3b2c-4475-81b2-62db21a607c7" containerID="e67c83d8395aa832801653478cd0b8ede578ab399df0e7eac1e2067fad757761" exitCode=0 Dec 02 19:52:17 crc kubenswrapper[4878]: I1202 19:52:17.317254 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbgfj" event={"ID":"122dd60a-3b2c-4475-81b2-62db21a607c7","Type":"ContainerDied","Data":"e67c83d8395aa832801653478cd0b8ede578ab399df0e7eac1e2067fad757761"} Dec 02 19:52:17 crc kubenswrapper[4878]: I1202 19:52:17.516636 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6qmbf/must-gather-ntpcx"] Dec 02 19:52:17 crc kubenswrapper[4878]: I1202 19:52:17.518769 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6qmbf/must-gather-ntpcx" Dec 02 19:52:17 crc kubenswrapper[4878]: I1202 19:52:17.520867 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6qmbf"/"default-dockercfg-8l5qw" Dec 02 19:52:17 crc kubenswrapper[4878]: I1202 19:52:17.527570 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6qmbf"/"openshift-service-ca.crt" Dec 02 19:52:17 crc kubenswrapper[4878]: I1202 19:52:17.527763 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6qmbf"/"kube-root-ca.crt" Dec 02 19:52:17 crc kubenswrapper[4878]: I1202 19:52:17.547610 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6qmbf/must-gather-ntpcx"] Dec 02 19:52:17 crc kubenswrapper[4878]: I1202 19:52:17.553832 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w98vj\" (UniqueName: \"kubernetes.io/projected/8810f12f-a3e8-4168-bc6f-75395a2845fb-kube-api-access-w98vj\") pod \"must-gather-ntpcx\" (UID: \"8810f12f-a3e8-4168-bc6f-75395a2845fb\") " pod="openshift-must-gather-6qmbf/must-gather-ntpcx" Dec 02 19:52:17 crc kubenswrapper[4878]: I1202 19:52:17.554060 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8810f12f-a3e8-4168-bc6f-75395a2845fb-must-gather-output\") pod \"must-gather-ntpcx\" (UID: \"8810f12f-a3e8-4168-bc6f-75395a2845fb\") " pod="openshift-must-gather-6qmbf/must-gather-ntpcx" Dec 02 19:52:17 crc kubenswrapper[4878]: I1202 19:52:17.656420 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8810f12f-a3e8-4168-bc6f-75395a2845fb-must-gather-output\") pod \"must-gather-ntpcx\" (UID: \"8810f12f-a3e8-4168-bc6f-75395a2845fb\") " 
pod="openshift-must-gather-6qmbf/must-gather-ntpcx" Dec 02 19:52:17 crc kubenswrapper[4878]: I1202 19:52:17.656531 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w98vj\" (UniqueName: \"kubernetes.io/projected/8810f12f-a3e8-4168-bc6f-75395a2845fb-kube-api-access-w98vj\") pod \"must-gather-ntpcx\" (UID: \"8810f12f-a3e8-4168-bc6f-75395a2845fb\") " pod="openshift-must-gather-6qmbf/must-gather-ntpcx" Dec 02 19:52:17 crc kubenswrapper[4878]: I1202 19:52:17.656945 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8810f12f-a3e8-4168-bc6f-75395a2845fb-must-gather-output\") pod \"must-gather-ntpcx\" (UID: \"8810f12f-a3e8-4168-bc6f-75395a2845fb\") " pod="openshift-must-gather-6qmbf/must-gather-ntpcx" Dec 02 19:52:17 crc kubenswrapper[4878]: I1202 19:52:17.697569 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w98vj\" (UniqueName: \"kubernetes.io/projected/8810f12f-a3e8-4168-bc6f-75395a2845fb-kube-api-access-w98vj\") pod \"must-gather-ntpcx\" (UID: \"8810f12f-a3e8-4168-bc6f-75395a2845fb\") " pod="openshift-must-gather-6qmbf/must-gather-ntpcx" Dec 02 19:52:17 crc kubenswrapper[4878]: I1202 19:52:17.848877 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6qmbf/must-gather-ntpcx" Dec 02 19:52:18 crc kubenswrapper[4878]: I1202 19:52:18.415671 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6qmbf/must-gather-ntpcx"] Dec 02 19:52:19 crc kubenswrapper[4878]: I1202 19:52:19.344941 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbgfj" event={"ID":"122dd60a-3b2c-4475-81b2-62db21a607c7","Type":"ContainerStarted","Data":"3a4a38feabc3ce8e262eb56c5b2b629197bb1abd00c14ef695c47553427075ac"} Dec 02 19:52:19 crc kubenswrapper[4878]: I1202 19:52:19.352859 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qmbf/must-gather-ntpcx" event={"ID":"8810f12f-a3e8-4168-bc6f-75395a2845fb","Type":"ContainerStarted","Data":"bbf7f3350f80dc183771aa979f7a07535b27f052f8a014bfc2bd492abf892ecb"} Dec 02 19:52:19 crc kubenswrapper[4878]: I1202 19:52:19.375468 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rbgfj" podStartSLOduration=3.27919428 podStartE2EDuration="6.375450377s" podCreationTimestamp="2025-12-02 19:52:13 +0000 UTC" firstStartedPulling="2025-12-02 19:52:15.294599002 +0000 UTC m=+5844.984217923" lastFinishedPulling="2025-12-02 19:52:18.390855139 +0000 UTC m=+5848.080474020" observedRunningTime="2025-12-02 19:52:19.370555234 +0000 UTC m=+5849.060174125" watchObservedRunningTime="2025-12-02 19:52:19.375450377 +0000 UTC m=+5849.065069258" Dec 02 19:52:23 crc kubenswrapper[4878]: I1202 19:52:23.987579 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rbgfj" Dec 02 19:52:23 crc kubenswrapper[4878]: I1202 19:52:23.988363 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rbgfj" Dec 02 19:52:24 crc kubenswrapper[4878]: I1202 19:52:24.060714 4878 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rbgfj" Dec 02 19:52:24 crc kubenswrapper[4878]: I1202 19:52:24.409994 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qmbf/must-gather-ntpcx" event={"ID":"8810f12f-a3e8-4168-bc6f-75395a2845fb","Type":"ContainerStarted","Data":"86ff891616fd4e933569d0705663ae73cbdcaa6d3674f7027ff135016ffaf279"} Dec 02 19:52:24 crc kubenswrapper[4878]: I1202 19:52:24.411450 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qmbf/must-gather-ntpcx" event={"ID":"8810f12f-a3e8-4168-bc6f-75395a2845fb","Type":"ContainerStarted","Data":"56c8b049e11d9d998df776c72449d4b7e241e1ffebcdc6279bde6285bbd5ba73"} Dec 02 19:52:24 crc kubenswrapper[4878]: I1202 19:52:24.431826 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6qmbf/must-gather-ntpcx" podStartSLOduration=2.283023743 podStartE2EDuration="7.431804555s" podCreationTimestamp="2025-12-02 19:52:17 +0000 UTC" firstStartedPulling="2025-12-02 19:52:18.419506013 +0000 UTC m=+5848.109124894" lastFinishedPulling="2025-12-02 19:52:23.568286825 +0000 UTC m=+5853.257905706" observedRunningTime="2025-12-02 19:52:24.424534238 +0000 UTC m=+5854.114153129" watchObservedRunningTime="2025-12-02 19:52:24.431804555 +0000 UTC m=+5854.121423446" Dec 02 19:52:24 crc kubenswrapper[4878]: I1202 19:52:24.466502 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rbgfj" Dec 02 19:52:24 crc kubenswrapper[4878]: I1202 19:52:24.522762 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbgfj"] Dec 02 19:52:26 crc kubenswrapper[4878]: I1202 19:52:26.431046 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rbgfj" podUID="122dd60a-3b2c-4475-81b2-62db21a607c7" 
containerName="registry-server" containerID="cri-o://3a4a38feabc3ce8e262eb56c5b2b629197bb1abd00c14ef695c47553427075ac" gracePeriod=2 Dec 02 19:52:26 crc kubenswrapper[4878]: I1202 19:52:26.938410 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbgfj" Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.031646 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122dd60a-3b2c-4475-81b2-62db21a607c7-catalog-content\") pod \"122dd60a-3b2c-4475-81b2-62db21a607c7\" (UID: \"122dd60a-3b2c-4475-81b2-62db21a607c7\") " Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.031782 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nk44\" (UniqueName: \"kubernetes.io/projected/122dd60a-3b2c-4475-81b2-62db21a607c7-kube-api-access-8nk44\") pod \"122dd60a-3b2c-4475-81b2-62db21a607c7\" (UID: \"122dd60a-3b2c-4475-81b2-62db21a607c7\") " Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.031996 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122dd60a-3b2c-4475-81b2-62db21a607c7-utilities\") pod \"122dd60a-3b2c-4475-81b2-62db21a607c7\" (UID: \"122dd60a-3b2c-4475-81b2-62db21a607c7\") " Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.032789 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/122dd60a-3b2c-4475-81b2-62db21a607c7-utilities" (OuterVolumeSpecName: "utilities") pod "122dd60a-3b2c-4475-81b2-62db21a607c7" (UID: "122dd60a-3b2c-4475-81b2-62db21a607c7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.033746 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122dd60a-3b2c-4475-81b2-62db21a607c7-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.064609 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/122dd60a-3b2c-4475-81b2-62db21a607c7-kube-api-access-8nk44" (OuterVolumeSpecName: "kube-api-access-8nk44") pod "122dd60a-3b2c-4475-81b2-62db21a607c7" (UID: "122dd60a-3b2c-4475-81b2-62db21a607c7"). InnerVolumeSpecName "kube-api-access-8nk44". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.092602 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/122dd60a-3b2c-4475-81b2-62db21a607c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "122dd60a-3b2c-4475-81b2-62db21a607c7" (UID: "122dd60a-3b2c-4475-81b2-62db21a607c7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.138030 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122dd60a-3b2c-4475-81b2-62db21a607c7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.138066 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nk44\" (UniqueName: \"kubernetes.io/projected/122dd60a-3b2c-4475-81b2-62db21a607c7-kube-api-access-8nk44\") on node \"crc\" DevicePath \"\"" Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.443611 4878 generic.go:334] "Generic (PLEG): container finished" podID="122dd60a-3b2c-4475-81b2-62db21a607c7" containerID="3a4a38feabc3ce8e262eb56c5b2b629197bb1abd00c14ef695c47553427075ac" exitCode=0 Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.443672 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbgfj" event={"ID":"122dd60a-3b2c-4475-81b2-62db21a607c7","Type":"ContainerDied","Data":"3a4a38feabc3ce8e262eb56c5b2b629197bb1abd00c14ef695c47553427075ac"} Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.443980 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbgfj" event={"ID":"122dd60a-3b2c-4475-81b2-62db21a607c7","Type":"ContainerDied","Data":"069a06dc3055c163751e5cf9af75eb2c0ef09bc98c232fafc43f2cb327dd5146"} Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.444006 4878 scope.go:117] "RemoveContainer" containerID="3a4a38feabc3ce8e262eb56c5b2b629197bb1abd00c14ef695c47553427075ac" Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.443702 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbgfj" Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.469135 4878 scope.go:117] "RemoveContainer" containerID="e67c83d8395aa832801653478cd0b8ede578ab399df0e7eac1e2067fad757761" Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.481461 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbgfj"] Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.494196 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbgfj"] Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.533859 4878 scope.go:117] "RemoveContainer" containerID="b8317aa00f9678b224b5c6f8b66b22d224e4f96ef6abba5512d70191ee546306" Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.566140 4878 scope.go:117] "RemoveContainer" containerID="3a4a38feabc3ce8e262eb56c5b2b629197bb1abd00c14ef695c47553427075ac" Dec 02 19:52:27 crc kubenswrapper[4878]: E1202 19:52:27.566670 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a4a38feabc3ce8e262eb56c5b2b629197bb1abd00c14ef695c47553427075ac\": container with ID starting with 3a4a38feabc3ce8e262eb56c5b2b629197bb1abd00c14ef695c47553427075ac not found: ID does not exist" containerID="3a4a38feabc3ce8e262eb56c5b2b629197bb1abd00c14ef695c47553427075ac" Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.566701 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a4a38feabc3ce8e262eb56c5b2b629197bb1abd00c14ef695c47553427075ac"} err="failed to get container status \"3a4a38feabc3ce8e262eb56c5b2b629197bb1abd00c14ef695c47553427075ac\": rpc error: code = NotFound desc = could not find container \"3a4a38feabc3ce8e262eb56c5b2b629197bb1abd00c14ef695c47553427075ac\": container with ID starting with 3a4a38feabc3ce8e262eb56c5b2b629197bb1abd00c14ef695c47553427075ac not found: 
ID does not exist" Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.566721 4878 scope.go:117] "RemoveContainer" containerID="e67c83d8395aa832801653478cd0b8ede578ab399df0e7eac1e2067fad757761" Dec 02 19:52:27 crc kubenswrapper[4878]: E1202 19:52:27.567018 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e67c83d8395aa832801653478cd0b8ede578ab399df0e7eac1e2067fad757761\": container with ID starting with e67c83d8395aa832801653478cd0b8ede578ab399df0e7eac1e2067fad757761 not found: ID does not exist" containerID="e67c83d8395aa832801653478cd0b8ede578ab399df0e7eac1e2067fad757761" Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.567043 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e67c83d8395aa832801653478cd0b8ede578ab399df0e7eac1e2067fad757761"} err="failed to get container status \"e67c83d8395aa832801653478cd0b8ede578ab399df0e7eac1e2067fad757761\": rpc error: code = NotFound desc = could not find container \"e67c83d8395aa832801653478cd0b8ede578ab399df0e7eac1e2067fad757761\": container with ID starting with e67c83d8395aa832801653478cd0b8ede578ab399df0e7eac1e2067fad757761 not found: ID does not exist" Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.567057 4878 scope.go:117] "RemoveContainer" containerID="b8317aa00f9678b224b5c6f8b66b22d224e4f96ef6abba5512d70191ee546306" Dec 02 19:52:27 crc kubenswrapper[4878]: E1202 19:52:27.570913 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8317aa00f9678b224b5c6f8b66b22d224e4f96ef6abba5512d70191ee546306\": container with ID starting with b8317aa00f9678b224b5c6f8b66b22d224e4f96ef6abba5512d70191ee546306 not found: ID does not exist" containerID="b8317aa00f9678b224b5c6f8b66b22d224e4f96ef6abba5512d70191ee546306" Dec 02 19:52:27 crc kubenswrapper[4878]: I1202 19:52:27.570949 4878 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8317aa00f9678b224b5c6f8b66b22d224e4f96ef6abba5512d70191ee546306"} err="failed to get container status \"b8317aa00f9678b224b5c6f8b66b22d224e4f96ef6abba5512d70191ee546306\": rpc error: code = NotFound desc = could not find container \"b8317aa00f9678b224b5c6f8b66b22d224e4f96ef6abba5512d70191ee546306\": container with ID starting with b8317aa00f9678b224b5c6f8b66b22d224e4f96ef6abba5512d70191ee546306 not found: ID does not exist" Dec 02 19:52:28 crc kubenswrapper[4878]: I1202 19:52:28.726257 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6qmbf/crc-debug-k8tzq"] Dec 02 19:52:28 crc kubenswrapper[4878]: E1202 19:52:28.727321 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122dd60a-3b2c-4475-81b2-62db21a607c7" containerName="extract-content" Dec 02 19:52:28 crc kubenswrapper[4878]: I1202 19:52:28.727336 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="122dd60a-3b2c-4475-81b2-62db21a607c7" containerName="extract-content" Dec 02 19:52:28 crc kubenswrapper[4878]: E1202 19:52:28.727413 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122dd60a-3b2c-4475-81b2-62db21a607c7" containerName="registry-server" Dec 02 19:52:28 crc kubenswrapper[4878]: I1202 19:52:28.727420 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="122dd60a-3b2c-4475-81b2-62db21a607c7" containerName="registry-server" Dec 02 19:52:28 crc kubenswrapper[4878]: E1202 19:52:28.727433 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122dd60a-3b2c-4475-81b2-62db21a607c7" containerName="extract-utilities" Dec 02 19:52:28 crc kubenswrapper[4878]: I1202 19:52:28.727442 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="122dd60a-3b2c-4475-81b2-62db21a607c7" containerName="extract-utilities" Dec 02 19:52:28 crc kubenswrapper[4878]: I1202 19:52:28.727693 4878 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="122dd60a-3b2c-4475-81b2-62db21a607c7" containerName="registry-server" Dec 02 19:52:28 crc kubenswrapper[4878]: I1202 19:52:28.728546 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6qmbf/crc-debug-k8tzq" Dec 02 19:52:28 crc kubenswrapper[4878]: I1202 19:52:28.777164 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x745g\" (UniqueName: \"kubernetes.io/projected/b1d55bf1-b176-4c33-9ad7-7ece51f3a789-kube-api-access-x745g\") pod \"crc-debug-k8tzq\" (UID: \"b1d55bf1-b176-4c33-9ad7-7ece51f3a789\") " pod="openshift-must-gather-6qmbf/crc-debug-k8tzq" Dec 02 19:52:28 crc kubenswrapper[4878]: I1202 19:52:28.777390 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b1d55bf1-b176-4c33-9ad7-7ece51f3a789-host\") pod \"crc-debug-k8tzq\" (UID: \"b1d55bf1-b176-4c33-9ad7-7ece51f3a789\") " pod="openshift-must-gather-6qmbf/crc-debug-k8tzq" Dec 02 19:52:28 crc kubenswrapper[4878]: I1202 19:52:28.880082 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b1d55bf1-b176-4c33-9ad7-7ece51f3a789-host\") pod \"crc-debug-k8tzq\" (UID: \"b1d55bf1-b176-4c33-9ad7-7ece51f3a789\") " pod="openshift-must-gather-6qmbf/crc-debug-k8tzq" Dec 02 19:52:28 crc kubenswrapper[4878]: I1202 19:52:28.880360 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x745g\" (UniqueName: \"kubernetes.io/projected/b1d55bf1-b176-4c33-9ad7-7ece51f3a789-kube-api-access-x745g\") pod \"crc-debug-k8tzq\" (UID: \"b1d55bf1-b176-4c33-9ad7-7ece51f3a789\") " pod="openshift-must-gather-6qmbf/crc-debug-k8tzq" Dec 02 19:52:28 crc kubenswrapper[4878]: I1202 19:52:28.880533 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/b1d55bf1-b176-4c33-9ad7-7ece51f3a789-host\") pod \"crc-debug-k8tzq\" (UID: \"b1d55bf1-b176-4c33-9ad7-7ece51f3a789\") " pod="openshift-must-gather-6qmbf/crc-debug-k8tzq" Dec 02 19:52:28 crc kubenswrapper[4878]: I1202 19:52:28.915215 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x745g\" (UniqueName: \"kubernetes.io/projected/b1d55bf1-b176-4c33-9ad7-7ece51f3a789-kube-api-access-x745g\") pod \"crc-debug-k8tzq\" (UID: \"b1d55bf1-b176-4c33-9ad7-7ece51f3a789\") " pod="openshift-must-gather-6qmbf/crc-debug-k8tzq" Dec 02 19:52:28 crc kubenswrapper[4878]: I1202 19:52:28.951028 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="122dd60a-3b2c-4475-81b2-62db21a607c7" path="/var/lib/kubelet/pods/122dd60a-3b2c-4475-81b2-62db21a607c7/volumes" Dec 02 19:52:29 crc kubenswrapper[4878]: I1202 19:52:29.048743 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6qmbf/crc-debug-k8tzq" Dec 02 19:52:29 crc kubenswrapper[4878]: W1202 19:52:29.091895 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1d55bf1_b176_4c33_9ad7_7ece51f3a789.slice/crio-57884d49e1d10e7c8ff2cf1e5584f120d4172668a4944067d019ff013ab5d8dc WatchSource:0}: Error finding container 57884d49e1d10e7c8ff2cf1e5584f120d4172668a4944067d019ff013ab5d8dc: Status 404 returned error can't find the container with id 57884d49e1d10e7c8ff2cf1e5584f120d4172668a4944067d019ff013ab5d8dc Dec 02 19:52:29 crc kubenswrapper[4878]: I1202 19:52:29.467055 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qmbf/crc-debug-k8tzq" event={"ID":"b1d55bf1-b176-4c33-9ad7-7ece51f3a789","Type":"ContainerStarted","Data":"57884d49e1d10e7c8ff2cf1e5584f120d4172668a4944067d019ff013ab5d8dc"} Dec 02 19:52:42 crc kubenswrapper[4878]: I1202 19:52:42.613406 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-6qmbf/crc-debug-k8tzq" event={"ID":"b1d55bf1-b176-4c33-9ad7-7ece51f3a789","Type":"ContainerStarted","Data":"a69741bea84b499ffbb9d201067cb0fffbb85e936d4d3bb5b995d140309f523a"} Dec 02 19:52:42 crc kubenswrapper[4878]: I1202 19:52:42.641575 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6qmbf/crc-debug-k8tzq" podStartSLOduration=1.716690133 podStartE2EDuration="14.641553764s" podCreationTimestamp="2025-12-02 19:52:28 +0000 UTC" firstStartedPulling="2025-12-02 19:52:29.096929508 +0000 UTC m=+5858.786548389" lastFinishedPulling="2025-12-02 19:52:42.021793149 +0000 UTC m=+5871.711412020" observedRunningTime="2025-12-02 19:52:42.632869203 +0000 UTC m=+5872.322488084" watchObservedRunningTime="2025-12-02 19:52:42.641553764 +0000 UTC m=+5872.331172645" Dec 02 19:53:33 crc kubenswrapper[4878]: I1202 19:53:33.175754 4878 generic.go:334] "Generic (PLEG): container finished" podID="b1d55bf1-b176-4c33-9ad7-7ece51f3a789" containerID="a69741bea84b499ffbb9d201067cb0fffbb85e936d4d3bb5b995d140309f523a" exitCode=0 Dec 02 19:53:33 crc kubenswrapper[4878]: I1202 19:53:33.177427 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qmbf/crc-debug-k8tzq" event={"ID":"b1d55bf1-b176-4c33-9ad7-7ece51f3a789","Type":"ContainerDied","Data":"a69741bea84b499ffbb9d201067cb0fffbb85e936d4d3bb5b995d140309f523a"} Dec 02 19:53:34 crc kubenswrapper[4878]: I1202 19:53:34.337005 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6qmbf/crc-debug-k8tzq" Dec 02 19:53:34 crc kubenswrapper[4878]: I1202 19:53:34.380877 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6qmbf/crc-debug-k8tzq"] Dec 02 19:53:34 crc kubenswrapper[4878]: I1202 19:53:34.394529 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6qmbf/crc-debug-k8tzq"] Dec 02 19:53:34 crc kubenswrapper[4878]: I1202 19:53:34.459524 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x745g\" (UniqueName: \"kubernetes.io/projected/b1d55bf1-b176-4c33-9ad7-7ece51f3a789-kube-api-access-x745g\") pod \"b1d55bf1-b176-4c33-9ad7-7ece51f3a789\" (UID: \"b1d55bf1-b176-4c33-9ad7-7ece51f3a789\") " Dec 02 19:53:34 crc kubenswrapper[4878]: I1202 19:53:34.459927 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b1d55bf1-b176-4c33-9ad7-7ece51f3a789-host\") pod \"b1d55bf1-b176-4c33-9ad7-7ece51f3a789\" (UID: \"b1d55bf1-b176-4c33-9ad7-7ece51f3a789\") " Dec 02 19:53:34 crc kubenswrapper[4878]: I1202 19:53:34.460145 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1d55bf1-b176-4c33-9ad7-7ece51f3a789-host" (OuterVolumeSpecName: "host") pod "b1d55bf1-b176-4c33-9ad7-7ece51f3a789" (UID: "b1d55bf1-b176-4c33-9ad7-7ece51f3a789"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 19:53:34 crc kubenswrapper[4878]: I1202 19:53:34.460999 4878 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b1d55bf1-b176-4c33-9ad7-7ece51f3a789-host\") on node \"crc\" DevicePath \"\"" Dec 02 19:53:34 crc kubenswrapper[4878]: I1202 19:53:34.464775 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d55bf1-b176-4c33-9ad7-7ece51f3a789-kube-api-access-x745g" (OuterVolumeSpecName: "kube-api-access-x745g") pod "b1d55bf1-b176-4c33-9ad7-7ece51f3a789" (UID: "b1d55bf1-b176-4c33-9ad7-7ece51f3a789"). InnerVolumeSpecName "kube-api-access-x745g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:53:34 crc kubenswrapper[4878]: I1202 19:53:34.563648 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x745g\" (UniqueName: \"kubernetes.io/projected/b1d55bf1-b176-4c33-9ad7-7ece51f3a789-kube-api-access-x745g\") on node \"crc\" DevicePath \"\"" Dec 02 19:53:34 crc kubenswrapper[4878]: I1202 19:53:34.952548 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1d55bf1-b176-4c33-9ad7-7ece51f3a789" path="/var/lib/kubelet/pods/b1d55bf1-b176-4c33-9ad7-7ece51f3a789/volumes" Dec 02 19:53:35 crc kubenswrapper[4878]: I1202 19:53:35.205873 4878 scope.go:117] "RemoveContainer" containerID="a69741bea84b499ffbb9d201067cb0fffbb85e936d4d3bb5b995d140309f523a" Dec 02 19:53:35 crc kubenswrapper[4878]: I1202 19:53:35.205909 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6qmbf/crc-debug-k8tzq" Dec 02 19:53:35 crc kubenswrapper[4878]: I1202 19:53:35.568902 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6qmbf/crc-debug-kr92f"] Dec 02 19:53:35 crc kubenswrapper[4878]: E1202 19:53:35.569417 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d55bf1-b176-4c33-9ad7-7ece51f3a789" containerName="container-00" Dec 02 19:53:35 crc kubenswrapper[4878]: I1202 19:53:35.569431 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d55bf1-b176-4c33-9ad7-7ece51f3a789" containerName="container-00" Dec 02 19:53:35 crc kubenswrapper[4878]: I1202 19:53:35.569641 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1d55bf1-b176-4c33-9ad7-7ece51f3a789" containerName="container-00" Dec 02 19:53:35 crc kubenswrapper[4878]: I1202 19:53:35.570369 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6qmbf/crc-debug-kr92f" Dec 02 19:53:35 crc kubenswrapper[4878]: I1202 19:53:35.688594 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e298f16-5d89-4064-9ef5-40fcc67d977f-host\") pod \"crc-debug-kr92f\" (UID: \"8e298f16-5d89-4064-9ef5-40fcc67d977f\") " pod="openshift-must-gather-6qmbf/crc-debug-kr92f" Dec 02 19:53:35 crc kubenswrapper[4878]: I1202 19:53:35.689027 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-764gk\" (UniqueName: \"kubernetes.io/projected/8e298f16-5d89-4064-9ef5-40fcc67d977f-kube-api-access-764gk\") pod \"crc-debug-kr92f\" (UID: \"8e298f16-5d89-4064-9ef5-40fcc67d977f\") " pod="openshift-must-gather-6qmbf/crc-debug-kr92f" Dec 02 19:53:35 crc kubenswrapper[4878]: I1202 19:53:35.791166 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-764gk\" (UniqueName: 
\"kubernetes.io/projected/8e298f16-5d89-4064-9ef5-40fcc67d977f-kube-api-access-764gk\") pod \"crc-debug-kr92f\" (UID: \"8e298f16-5d89-4064-9ef5-40fcc67d977f\") " pod="openshift-must-gather-6qmbf/crc-debug-kr92f" Dec 02 19:53:35 crc kubenswrapper[4878]: I1202 19:53:35.791401 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e298f16-5d89-4064-9ef5-40fcc67d977f-host\") pod \"crc-debug-kr92f\" (UID: \"8e298f16-5d89-4064-9ef5-40fcc67d977f\") " pod="openshift-must-gather-6qmbf/crc-debug-kr92f" Dec 02 19:53:35 crc kubenswrapper[4878]: I1202 19:53:35.791599 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e298f16-5d89-4064-9ef5-40fcc67d977f-host\") pod \"crc-debug-kr92f\" (UID: \"8e298f16-5d89-4064-9ef5-40fcc67d977f\") " pod="openshift-must-gather-6qmbf/crc-debug-kr92f" Dec 02 19:53:35 crc kubenswrapper[4878]: I1202 19:53:35.973340 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-764gk\" (UniqueName: \"kubernetes.io/projected/8e298f16-5d89-4064-9ef5-40fcc67d977f-kube-api-access-764gk\") pod \"crc-debug-kr92f\" (UID: \"8e298f16-5d89-4064-9ef5-40fcc67d977f\") " pod="openshift-must-gather-6qmbf/crc-debug-kr92f" Dec 02 19:53:36 crc kubenswrapper[4878]: I1202 19:53:36.188777 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6qmbf/crc-debug-kr92f" Dec 02 19:53:37 crc kubenswrapper[4878]: I1202 19:53:37.227874 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qmbf/crc-debug-kr92f" event={"ID":"8e298f16-5d89-4064-9ef5-40fcc67d977f","Type":"ContainerStarted","Data":"16094d3f4e16ed4963d06956778a16ad0c661231744cfb58a8d89157d8e4ddb0"} Dec 02 19:53:38 crc kubenswrapper[4878]: I1202 19:53:38.246221 4878 generic.go:334] "Generic (PLEG): container finished" podID="8e298f16-5d89-4064-9ef5-40fcc67d977f" containerID="1cf202fb7efe6833ae1025b7d6a76d4839ad26a6e95a5f1c54502c511bcde11a" exitCode=0 Dec 02 19:53:38 crc kubenswrapper[4878]: I1202 19:53:38.246375 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qmbf/crc-debug-kr92f" event={"ID":"8e298f16-5d89-4064-9ef5-40fcc67d977f","Type":"ContainerDied","Data":"1cf202fb7efe6833ae1025b7d6a76d4839ad26a6e95a5f1c54502c511bcde11a"} Dec 02 19:53:39 crc kubenswrapper[4878]: I1202 19:53:39.387430 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6qmbf/crc-debug-kr92f" Dec 02 19:53:39 crc kubenswrapper[4878]: I1202 19:53:39.483488 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e298f16-5d89-4064-9ef5-40fcc67d977f-host\") pod \"8e298f16-5d89-4064-9ef5-40fcc67d977f\" (UID: \"8e298f16-5d89-4064-9ef5-40fcc67d977f\") " Dec 02 19:53:39 crc kubenswrapper[4878]: I1202 19:53:39.483785 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-764gk\" (UniqueName: \"kubernetes.io/projected/8e298f16-5d89-4064-9ef5-40fcc67d977f-kube-api-access-764gk\") pod \"8e298f16-5d89-4064-9ef5-40fcc67d977f\" (UID: \"8e298f16-5d89-4064-9ef5-40fcc67d977f\") " Dec 02 19:53:39 crc kubenswrapper[4878]: I1202 19:53:39.483787 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e298f16-5d89-4064-9ef5-40fcc67d977f-host" (OuterVolumeSpecName: "host") pod "8e298f16-5d89-4064-9ef5-40fcc67d977f" (UID: "8e298f16-5d89-4064-9ef5-40fcc67d977f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 19:53:39 crc kubenswrapper[4878]: I1202 19:53:39.484542 4878 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e298f16-5d89-4064-9ef5-40fcc67d977f-host\") on node \"crc\" DevicePath \"\"" Dec 02 19:53:39 crc kubenswrapper[4878]: I1202 19:53:39.492033 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e298f16-5d89-4064-9ef5-40fcc67d977f-kube-api-access-764gk" (OuterVolumeSpecName: "kube-api-access-764gk") pod "8e298f16-5d89-4064-9ef5-40fcc67d977f" (UID: "8e298f16-5d89-4064-9ef5-40fcc67d977f"). InnerVolumeSpecName "kube-api-access-764gk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:53:39 crc kubenswrapper[4878]: I1202 19:53:39.586309 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-764gk\" (UniqueName: \"kubernetes.io/projected/8e298f16-5d89-4064-9ef5-40fcc67d977f-kube-api-access-764gk\") on node \"crc\" DevicePath \"\"" Dec 02 19:53:40 crc kubenswrapper[4878]: I1202 19:53:40.275153 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qmbf/crc-debug-kr92f" event={"ID":"8e298f16-5d89-4064-9ef5-40fcc67d977f","Type":"ContainerDied","Data":"16094d3f4e16ed4963d06956778a16ad0c661231744cfb58a8d89157d8e4ddb0"} Dec 02 19:53:40 crc kubenswrapper[4878]: I1202 19:53:40.275479 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16094d3f4e16ed4963d06956778a16ad0c661231744cfb58a8d89157d8e4ddb0" Dec 02 19:53:40 crc kubenswrapper[4878]: I1202 19:53:40.275537 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6qmbf/crc-debug-kr92f" Dec 02 19:53:40 crc kubenswrapper[4878]: E1202 19:53:40.461528 4878 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e298f16_5d89_4064_9ef5_40fcc67d977f.slice/crio-16094d3f4e16ed4963d06956778a16ad0c661231744cfb58a8d89157d8e4ddb0\": RecentStats: unable to find data in memory cache]" Dec 02 19:53:40 crc kubenswrapper[4878]: I1202 19:53:40.709679 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6qmbf/crc-debug-kr92f"] Dec 02 19:53:40 crc kubenswrapper[4878]: I1202 19:53:40.721643 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6qmbf/crc-debug-kr92f"] Dec 02 19:53:40 crc kubenswrapper[4878]: I1202 19:53:40.952097 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e298f16-5d89-4064-9ef5-40fcc67d977f" 
path="/var/lib/kubelet/pods/8e298f16-5d89-4064-9ef5-40fcc67d977f/volumes" Dec 02 19:53:41 crc kubenswrapper[4878]: I1202 19:53:41.968566 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6qmbf/crc-debug-754rf"] Dec 02 19:53:41 crc kubenswrapper[4878]: E1202 19:53:41.970349 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e298f16-5d89-4064-9ef5-40fcc67d977f" containerName="container-00" Dec 02 19:53:41 crc kubenswrapper[4878]: I1202 19:53:41.970444 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e298f16-5d89-4064-9ef5-40fcc67d977f" containerName="container-00" Dec 02 19:53:41 crc kubenswrapper[4878]: I1202 19:53:41.970812 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e298f16-5d89-4064-9ef5-40fcc67d977f" containerName="container-00" Dec 02 19:53:41 crc kubenswrapper[4878]: I1202 19:53:41.971741 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6qmbf/crc-debug-754rf" Dec 02 19:53:42 crc kubenswrapper[4878]: I1202 19:53:42.044475 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/173fca40-e2b4-41cc-a4b9-3e0b2afa1a78-host\") pod \"crc-debug-754rf\" (UID: \"173fca40-e2b4-41cc-a4b9-3e0b2afa1a78\") " pod="openshift-must-gather-6qmbf/crc-debug-754rf" Dec 02 19:53:42 crc kubenswrapper[4878]: I1202 19:53:42.044751 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkp8s\" (UniqueName: \"kubernetes.io/projected/173fca40-e2b4-41cc-a4b9-3e0b2afa1a78-kube-api-access-wkp8s\") pod \"crc-debug-754rf\" (UID: \"173fca40-e2b4-41cc-a4b9-3e0b2afa1a78\") " pod="openshift-must-gather-6qmbf/crc-debug-754rf" Dec 02 19:53:42 crc kubenswrapper[4878]: I1202 19:53:42.147103 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkp8s\" (UniqueName: 
\"kubernetes.io/projected/173fca40-e2b4-41cc-a4b9-3e0b2afa1a78-kube-api-access-wkp8s\") pod \"crc-debug-754rf\" (UID: \"173fca40-e2b4-41cc-a4b9-3e0b2afa1a78\") " pod="openshift-must-gather-6qmbf/crc-debug-754rf" Dec 02 19:53:42 crc kubenswrapper[4878]: I1202 19:53:42.147289 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/173fca40-e2b4-41cc-a4b9-3e0b2afa1a78-host\") pod \"crc-debug-754rf\" (UID: \"173fca40-e2b4-41cc-a4b9-3e0b2afa1a78\") " pod="openshift-must-gather-6qmbf/crc-debug-754rf" Dec 02 19:53:42 crc kubenswrapper[4878]: I1202 19:53:42.147487 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/173fca40-e2b4-41cc-a4b9-3e0b2afa1a78-host\") pod \"crc-debug-754rf\" (UID: \"173fca40-e2b4-41cc-a4b9-3e0b2afa1a78\") " pod="openshift-must-gather-6qmbf/crc-debug-754rf" Dec 02 19:53:42 crc kubenswrapper[4878]: I1202 19:53:42.165707 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkp8s\" (UniqueName: \"kubernetes.io/projected/173fca40-e2b4-41cc-a4b9-3e0b2afa1a78-kube-api-access-wkp8s\") pod \"crc-debug-754rf\" (UID: \"173fca40-e2b4-41cc-a4b9-3e0b2afa1a78\") " pod="openshift-must-gather-6qmbf/crc-debug-754rf" Dec 02 19:53:42 crc kubenswrapper[4878]: I1202 19:53:42.293406 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6qmbf/crc-debug-754rf" Dec 02 19:53:43 crc kubenswrapper[4878]: I1202 19:53:43.314799 4878 generic.go:334] "Generic (PLEG): container finished" podID="173fca40-e2b4-41cc-a4b9-3e0b2afa1a78" containerID="2ac33d94a86c13f3aea4a5b67dc91291c253f8328a9947d9786d60ea964458bc" exitCode=0 Dec 02 19:53:43 crc kubenswrapper[4878]: I1202 19:53:43.314889 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qmbf/crc-debug-754rf" event={"ID":"173fca40-e2b4-41cc-a4b9-3e0b2afa1a78","Type":"ContainerDied","Data":"2ac33d94a86c13f3aea4a5b67dc91291c253f8328a9947d9786d60ea964458bc"} Dec 02 19:53:43 crc kubenswrapper[4878]: I1202 19:53:43.315297 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qmbf/crc-debug-754rf" event={"ID":"173fca40-e2b4-41cc-a4b9-3e0b2afa1a78","Type":"ContainerStarted","Data":"a9f69a8622380f438013bfed84393748d7a9374befde93824060fee7e6eb70e6"} Dec 02 19:53:43 crc kubenswrapper[4878]: I1202 19:53:43.374303 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6qmbf/crc-debug-754rf"] Dec 02 19:53:43 crc kubenswrapper[4878]: I1202 19:53:43.384921 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6qmbf/crc-debug-754rf"] Dec 02 19:53:44 crc kubenswrapper[4878]: I1202 19:53:44.466414 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6qmbf/crc-debug-754rf" Dec 02 19:53:44 crc kubenswrapper[4878]: I1202 19:53:44.610009 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/173fca40-e2b4-41cc-a4b9-3e0b2afa1a78-host\") pod \"173fca40-e2b4-41cc-a4b9-3e0b2afa1a78\" (UID: \"173fca40-e2b4-41cc-a4b9-3e0b2afa1a78\") " Dec 02 19:53:44 crc kubenswrapper[4878]: I1202 19:53:44.610068 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkp8s\" (UniqueName: \"kubernetes.io/projected/173fca40-e2b4-41cc-a4b9-3e0b2afa1a78-kube-api-access-wkp8s\") pod \"173fca40-e2b4-41cc-a4b9-3e0b2afa1a78\" (UID: \"173fca40-e2b4-41cc-a4b9-3e0b2afa1a78\") " Dec 02 19:53:44 crc kubenswrapper[4878]: I1202 19:53:44.610135 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/173fca40-e2b4-41cc-a4b9-3e0b2afa1a78-host" (OuterVolumeSpecName: "host") pod "173fca40-e2b4-41cc-a4b9-3e0b2afa1a78" (UID: "173fca40-e2b4-41cc-a4b9-3e0b2afa1a78"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 19:53:44 crc kubenswrapper[4878]: I1202 19:53:44.610689 4878 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/173fca40-e2b4-41cc-a4b9-3e0b2afa1a78-host\") on node \"crc\" DevicePath \"\"" Dec 02 19:53:44 crc kubenswrapper[4878]: I1202 19:53:44.616453 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/173fca40-e2b4-41cc-a4b9-3e0b2afa1a78-kube-api-access-wkp8s" (OuterVolumeSpecName: "kube-api-access-wkp8s") pod "173fca40-e2b4-41cc-a4b9-3e0b2afa1a78" (UID: "173fca40-e2b4-41cc-a4b9-3e0b2afa1a78"). InnerVolumeSpecName "kube-api-access-wkp8s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:53:44 crc kubenswrapper[4878]: I1202 19:53:44.712400 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkp8s\" (UniqueName: \"kubernetes.io/projected/173fca40-e2b4-41cc-a4b9-3e0b2afa1a78-kube-api-access-wkp8s\") on node \"crc\" DevicePath \"\"" Dec 02 19:53:44 crc kubenswrapper[4878]: I1202 19:53:44.950227 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="173fca40-e2b4-41cc-a4b9-3e0b2afa1a78" path="/var/lib/kubelet/pods/173fca40-e2b4-41cc-a4b9-3e0b2afa1a78/volumes" Dec 02 19:53:45 crc kubenswrapper[4878]: I1202 19:53:45.346102 4878 scope.go:117] "RemoveContainer" containerID="2ac33d94a86c13f3aea4a5b67dc91291c253f8328a9947d9786d60ea964458bc" Dec 02 19:53:45 crc kubenswrapper[4878]: I1202 19:53:45.346135 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6qmbf/crc-debug-754rf" Dec 02 19:53:53 crc kubenswrapper[4878]: I1202 19:53:53.742470 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:53:53 crc kubenswrapper[4878]: I1202 19:53:53.743204 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:54:10 crc kubenswrapper[4878]: I1202 19:54:10.515076 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_09c21be8-b654-42b3-b5da-59d1afb0054b/aodh-api/0.log" Dec 02 19:54:10 crc kubenswrapper[4878]: I1202 19:54:10.570957 4878 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_aodh-0_09c21be8-b654-42b3-b5da-59d1afb0054b/aodh-evaluator/0.log" Dec 02 19:54:10 crc kubenswrapper[4878]: I1202 19:54:10.754480 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_09c21be8-b654-42b3-b5da-59d1afb0054b/aodh-listener/0.log" Dec 02 19:54:10 crc kubenswrapper[4878]: I1202 19:54:10.756881 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_09c21be8-b654-42b3-b5da-59d1afb0054b/aodh-notifier/0.log" Dec 02 19:54:10 crc kubenswrapper[4878]: I1202 19:54:10.818855 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-58bf7d9584-2nldp_e21a97ae-d28a-4c3c-b669-bd186e06a311/barbican-api/0.log" Dec 02 19:54:10 crc kubenswrapper[4878]: I1202 19:54:10.939426 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-58bf7d9584-2nldp_e21a97ae-d28a-4c3c-b669-bd186e06a311/barbican-api-log/0.log" Dec 02 19:54:10 crc kubenswrapper[4878]: I1202 19:54:10.999920 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-69bfdb774b-284fr_e4828d24-fa12-4cf6-9e5b-8864d62c8536/barbican-keystone-listener/0.log" Dec 02 19:54:11 crc kubenswrapper[4878]: I1202 19:54:11.126263 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-69bfdb774b-284fr_e4828d24-fa12-4cf6-9e5b-8864d62c8536/barbican-keystone-listener-log/0.log" Dec 02 19:54:11 crc kubenswrapper[4878]: I1202 19:54:11.247634 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-54c9cb88c9-bzlhb_c992f2a3-c18e-470d-b4dc-168a9dcd8528/barbican-worker/0.log" Dec 02 19:54:11 crc kubenswrapper[4878]: I1202 19:54:11.255658 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-54c9cb88c9-bzlhb_c992f2a3-c18e-470d-b4dc-168a9dcd8528/barbican-worker-log/0.log" Dec 02 19:54:11 crc kubenswrapper[4878]: I1202 
19:54:11.464439 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_28b4921f-5e67-4490-83fc-eef206c05083/ceilometer-central-agent/0.log" Dec 02 19:54:11 crc kubenswrapper[4878]: I1202 19:54:11.485604 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj_52939763-97c2-42f6-9aa4-56e99153e87f/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 19:54:11 crc kubenswrapper[4878]: I1202 19:54:11.771095 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_28b4921f-5e67-4490-83fc-eef206c05083/ceilometer-notification-agent/0.log" Dec 02 19:54:11 crc kubenswrapper[4878]: I1202 19:54:11.914893 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_28b4921f-5e67-4490-83fc-eef206c05083/proxy-httpd/0.log" Dec 02 19:54:11 crc kubenswrapper[4878]: I1202 19:54:11.943696 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_28b4921f-5e67-4490-83fc-eef206c05083/sg-core/0.log" Dec 02 19:54:12 crc kubenswrapper[4878]: I1202 19:54:12.075198 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_60d7f991-50e6-47fa-8b4b-137022c03671/cinder-api/0.log" Dec 02 19:54:12 crc kubenswrapper[4878]: I1202 19:54:12.132556 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_60d7f991-50e6-47fa-8b4b-137022c03671/cinder-api-log/0.log" Dec 02 19:54:12 crc kubenswrapper[4878]: I1202 19:54:12.242916 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5cea7d1e-f0d6-4a27-9840-1ce77743b26d/cinder-scheduler/0.log" Dec 02 19:54:12 crc kubenswrapper[4878]: I1202 19:54:12.316742 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5cea7d1e-f0d6-4a27-9840-1ce77743b26d/probe/0.log" Dec 02 19:54:12 crc kubenswrapper[4878]: I1202 19:54:12.439557 4878 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-cf87z_2f8b1f89-0eef-426b-8bb0-8700c42ede2e/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 19:54:12 crc kubenswrapper[4878]: I1202 19:54:12.531085 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-t7g94_8714a5c4-b7c2-4e8a-a112-2530648da63b/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 19:54:12 crc kubenswrapper[4878]: I1202 19:54:12.762580 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6d594564dc-vbhmw_66af733a-2f8b-4127-9c9c-00d137d8eb4e/init/0.log" Dec 02 19:54:12 crc kubenswrapper[4878]: I1202 19:54:12.978915 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6d594564dc-vbhmw_66af733a-2f8b-4127-9c9c-00d137d8eb4e/init/0.log" Dec 02 19:54:13 crc kubenswrapper[4878]: I1202 19:54:13.029326 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6d594564dc-vbhmw_66af733a-2f8b-4127-9c9c-00d137d8eb4e/dnsmasq-dns/0.log" Dec 02 19:54:13 crc kubenswrapper[4878]: I1202 19:54:13.080804 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-l59tj_7b2e71d7-3e0b-4bfa-835c-374dcf03dd86/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 19:54:13 crc kubenswrapper[4878]: I1202 19:54:13.244548 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_dc42b137-3b4d-4673-8f42-e1fd55534c16/glance-httpd/0.log" Dec 02 19:54:13 crc kubenswrapper[4878]: I1202 19:54:13.350177 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_dc42b137-3b4d-4673-8f42-e1fd55534c16/glance-log/0.log" Dec 02 19:54:13 crc kubenswrapper[4878]: I1202 19:54:13.526601 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_59f6212e-501f-4a58-8d24-8f79f95dc992/glance-httpd/0.log" Dec 02 19:54:13 crc kubenswrapper[4878]: I1202 19:54:13.539735 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_59f6212e-501f-4a58-8d24-8f79f95dc992/glance-log/0.log" Dec 02 19:54:14 crc kubenswrapper[4878]: I1202 19:54:14.136720 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-757798f686-n6x4p_a845b7d1-1f82-4048-8bc9-56611020bcec/heat-engine/0.log" Dec 02 19:54:14 crc kubenswrapper[4878]: I1202 19:54:14.442713 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj_7ca1da86-eacb-4ac4-a155-62de0292cbdf/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 19:54:14 crc kubenswrapper[4878]: I1202 19:54:14.484150 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7546bcc475-2zz2q_a37933a9-8ddf-406e-9f40-b79fba21d5b5/heat-cfnapi/0.log" Dec 02 19:54:14 crc kubenswrapper[4878]: I1202 19:54:14.502128 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-f8bcd7564-vh2kf_88be3860-e9da-4f2b-baff-142f994127c4/heat-api/0.log" Dec 02 19:54:14 crc kubenswrapper[4878]: I1202 19:54:14.688261 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-jh9ts_0112601b-e2a2-4547-bea0-5afad959f726/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 19:54:15 crc kubenswrapper[4878]: I1202 19:54:15.113995 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-67bf9d8f54-s7vnk_917539f3-4a78-4c46-a2e3-0b95342fe994/keystone-api/0.log" Dec 02 19:54:15 crc kubenswrapper[4878]: I1202 19:54:15.168504 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29411701-sf57f_08153e97-46ad-4405-b0ea-7f4606a82c6f/keystone-cron/0.log" Dec 02 19:54:15 crc kubenswrapper[4878]: I1202 19:54:15.182018 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_763df430-6f7f-4642-9452-1fcc5d47d283/kube-state-metrics/0.log" Dec 02 19:54:15 crc kubenswrapper[4878]: I1202 19:54:15.380795 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5_0b199858-7108-4b94-b3f9-692a11430c94/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 19:54:15 crc kubenswrapper[4878]: I1202 19:54:15.418275 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-vqdv9_a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6/logging-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 19:54:15 crc kubenswrapper[4878]: I1202 19:54:15.700568 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_1bbb9528-81ba-487f-bf86-44276e8ac969/mysqld-exporter/0.log" Dec 02 19:54:16 crc kubenswrapper[4878]: I1202 19:54:16.079346 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v_14ba7638-79c8-4806-9bc5-8b8c7ab029c3/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 19:54:16 crc kubenswrapper[4878]: I1202 19:54:16.158891 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-9d785686f-gqxnz_70e70d85-cdcd-43e7-b2c2-dbc6386665e3/neutron-api/0.log" Dec 02 19:54:16 crc kubenswrapper[4878]: I1202 19:54:16.173914 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-9d785686f-gqxnz_70e70d85-cdcd-43e7-b2c2-dbc6386665e3/neutron-httpd/0.log" Dec 02 19:54:16 crc kubenswrapper[4878]: I1202 19:54:16.832712 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_b5108475-8a24-4bce-a285-f4a26785d6f9/nova-cell0-conductor-conductor/0.log" Dec 02 19:54:17 crc kubenswrapper[4878]: I1202 19:54:17.061193 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_49118006-cedb-4f41-a752-c635108e2bf7/nova-cell1-conductor-conductor/0.log" Dec 02 19:54:17 crc kubenswrapper[4878]: I1202 19:54:17.180429 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4cfd0dfa-637d-432b-946d-753c5afa72dd/nova-api-log/0.log" Dec 02 19:54:17 crc kubenswrapper[4878]: I1202 19:54:17.482672 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-csjn8_53bb65b6-2ee5-42dd-8a1d-df8a04008975/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 19:54:17 crc kubenswrapper[4878]: I1202 19:54:17.494187 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_855f3d94-a64e-4661-8a2d-30b33a682633/nova-cell1-novncproxy-novncproxy/0.log" Dec 02 19:54:17 crc kubenswrapper[4878]: I1202 19:54:17.526988 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4cfd0dfa-637d-432b-946d-753c5afa72dd/nova-api-api/0.log" Dec 02 19:54:17 crc kubenswrapper[4878]: I1202 19:54:17.829741 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5217962f-8411-4be7-bbd0-93858938b746/nova-metadata-log/0.log" Dec 02 19:54:18 crc kubenswrapper[4878]: I1202 19:54:18.097328 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_5dc0fa21-804e-42bf-a190-4e108c84df48/nova-scheduler-scheduler/0.log" Dec 02 19:54:18 crc kubenswrapper[4878]: I1202 19:54:18.115601 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9cae31f5-acb4-423b-8a14-4136afb73062/mysql-bootstrap/0.log" Dec 02 19:54:18 crc kubenswrapper[4878]: I1202 
19:54:18.310702 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9cae31f5-acb4-423b-8a14-4136afb73062/mysql-bootstrap/0.log" Dec 02 19:54:18 crc kubenswrapper[4878]: I1202 19:54:18.318832 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9cae31f5-acb4-423b-8a14-4136afb73062/galera/0.log" Dec 02 19:54:18 crc kubenswrapper[4878]: I1202 19:54:18.543121 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c436c198-1049-416f-9ab7-33261ff55ab4/mysql-bootstrap/0.log" Dec 02 19:54:18 crc kubenswrapper[4878]: I1202 19:54:18.999761 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c436c198-1049-416f-9ab7-33261ff55ab4/mysql-bootstrap/0.log" Dec 02 19:54:19 crc kubenswrapper[4878]: I1202 19:54:19.171480 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c436c198-1049-416f-9ab7-33261ff55ab4/galera/0.log" Dec 02 19:54:19 crc kubenswrapper[4878]: I1202 19:54:19.350309 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_5caa14c8-5110-4246-a7d4-75ef3c6d5d00/openstackclient/0.log" Dec 02 19:54:19 crc kubenswrapper[4878]: I1202 19:54:19.560466 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-2pqpm_f833875c-c0f5-4654-b592-14d4a6161df6/openstack-network-exporter/0.log" Dec 02 19:54:19 crc kubenswrapper[4878]: I1202 19:54:19.803147 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsnc6_f1f92026-f0b1-470f-885e-914fece7f4e3/ovsdb-server-init/0.log" Dec 02 19:54:20 crc kubenswrapper[4878]: I1202 19:54:20.067610 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsnc6_f1f92026-f0b1-470f-885e-914fece7f4e3/ovsdb-server/0.log" Dec 02 19:54:20 crc kubenswrapper[4878]: I1202 19:54:20.081123 4878 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsnc6_f1f92026-f0b1-470f-885e-914fece7f4e3/ovs-vswitchd/0.log" Dec 02 19:54:20 crc kubenswrapper[4878]: I1202 19:54:20.102468 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsnc6_f1f92026-f0b1-470f-885e-914fece7f4e3/ovsdb-server-init/0.log" Dec 02 19:54:20 crc kubenswrapper[4878]: I1202 19:54:20.143643 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5217962f-8411-4be7-bbd0-93858938b746/nova-metadata-metadata/0.log" Dec 02 19:54:20 crc kubenswrapper[4878]: I1202 19:54:20.286564 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-qgmlw_1f049ebe-547b-40a2-8468-932cfc5051ea/ovn-controller/0.log" Dec 02 19:54:20 crc kubenswrapper[4878]: I1202 19:54:20.399801 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-sm6fm_49eb68a8-72ac-4fb7-ab11-7e89f85e7f22/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 19:54:20 crc kubenswrapper[4878]: I1202 19:54:20.613604 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_af53983f-772c-431c-95a1-af6b3d3c0edf/openstack-network-exporter/0.log" Dec 02 19:54:20 crc kubenswrapper[4878]: I1202 19:54:20.660385 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_af53983f-772c-431c-95a1-af6b3d3c0edf/ovn-northd/0.log" Dec 02 19:54:20 crc kubenswrapper[4878]: I1202 19:54:20.820795 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a6aad750-71cc-4815-906a-5f2a130875e8/ovsdbserver-nb/0.log" Dec 02 19:54:20 crc kubenswrapper[4878]: I1202 19:54:20.842275 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a6aad750-71cc-4815-906a-5f2a130875e8/openstack-network-exporter/0.log" Dec 02 19:54:20 crc kubenswrapper[4878]: I1202 19:54:20.910209 4878 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4/openstack-network-exporter/0.log" Dec 02 19:54:21 crc kubenswrapper[4878]: I1202 19:54:21.113666 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4/ovsdbserver-sb/0.log" Dec 02 19:54:21 crc kubenswrapper[4878]: I1202 19:54:21.256274 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5bbd45c784-zz4hz_5d63745b-034f-4f6f-b2f7-abeca299930b/placement-api/0.log" Dec 02 19:54:21 crc kubenswrapper[4878]: I1202 19:54:21.312690 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5bbd45c784-zz4hz_5d63745b-034f-4f6f-b2f7-abeca299930b/placement-log/0.log" Dec 02 19:54:21 crc kubenswrapper[4878]: I1202 19:54:21.405546 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_80be1dfe-9477-4d5c-9b11-2664d4300eca/init-config-reloader/0.log" Dec 02 19:54:21 crc kubenswrapper[4878]: I1202 19:54:21.512662 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_80be1dfe-9477-4d5c-9b11-2664d4300eca/config-reloader/0.log" Dec 02 19:54:21 crc kubenswrapper[4878]: I1202 19:54:21.518282 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_80be1dfe-9477-4d5c-9b11-2664d4300eca/init-config-reloader/0.log" Dec 02 19:54:21 crc kubenswrapper[4878]: I1202 19:54:21.658017 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_80be1dfe-9477-4d5c-9b11-2664d4300eca/prometheus/0.log" Dec 02 19:54:21 crc kubenswrapper[4878]: I1202 19:54:21.695402 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_80be1dfe-9477-4d5c-9b11-2664d4300eca/thanos-sidecar/0.log" Dec 02 19:54:21 crc kubenswrapper[4878]: I1202 
19:54:21.760634 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_eedb789b-6bed-4a82-82c1-977a633ed304/setup-container/0.log" Dec 02 19:54:22 crc kubenswrapper[4878]: I1202 19:54:22.180058 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_eedb789b-6bed-4a82-82c1-977a633ed304/rabbitmq/0.log" Dec 02 19:54:22 crc kubenswrapper[4878]: I1202 19:54:22.255998 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_eedb789b-6bed-4a82-82c1-977a633ed304/setup-container/0.log" Dec 02 19:54:22 crc kubenswrapper[4878]: I1202 19:54:22.350068 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da/setup-container/0.log" Dec 02 19:54:22 crc kubenswrapper[4878]: I1202 19:54:22.528333 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da/setup-container/0.log" Dec 02 19:54:22 crc kubenswrapper[4878]: I1202 19:54:22.658290 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n_f1a288af-a20b-4e48-a331-561e16e01989/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 19:54:22 crc kubenswrapper[4878]: I1202 19:54:22.669971 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da/rabbitmq/0.log" Dec 02 19:54:22 crc kubenswrapper[4878]: I1202 19:54:22.847651 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-zgm9n_4c58063d-9ff4-43dd-9dec-17d14a541013/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 19:54:22 crc kubenswrapper[4878]: I1202 19:54:22.941713 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8_8097a01b-4fab-4bac-839d-a1f937120beb/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 19:54:23 crc kubenswrapper[4878]: I1202 19:54:23.127333 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-zstf2_e32e5051-b0ff-4fee-9268-266c4cc38c68/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 19:54:23 crc kubenswrapper[4878]: I1202 19:54:23.224289 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-t4tn6_8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b/ssh-known-hosts-edpm-deployment/0.log" Dec 02 19:54:23 crc kubenswrapper[4878]: I1202 19:54:23.452531 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-54fd7cfcc9-x4n56_c033144d-0cad-47bd-87b6-3715278cf5c1/proxy-server/0.log" Dec 02 19:54:23 crc kubenswrapper[4878]: I1202 19:54:23.611894 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-rfcql_0e84c0f0-1e32-4c9b-b21d-f49bb06863fc/swift-ring-rebalance/0.log" Dec 02 19:54:23 crc kubenswrapper[4878]: I1202 19:54:23.683407 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-54fd7cfcc9-x4n56_c033144d-0cad-47bd-87b6-3715278cf5c1/proxy-httpd/0.log" Dec 02 19:54:23 crc kubenswrapper[4878]: I1202 19:54:23.730392 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/account-auditor/0.log" Dec 02 19:54:23 crc kubenswrapper[4878]: I1202 19:54:23.742510 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:54:23 crc kubenswrapper[4878]: I1202 
19:54:23.742569 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:54:23 crc kubenswrapper[4878]: I1202 19:54:23.864087 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/account-reaper/0.log" Dec 02 19:54:24 crc kubenswrapper[4878]: I1202 19:54:24.000007 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/container-auditor/0.log" Dec 02 19:54:24 crc kubenswrapper[4878]: I1202 19:54:24.053850 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/account-replicator/0.log" Dec 02 19:54:24 crc kubenswrapper[4878]: I1202 19:54:24.055411 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/account-server/0.log" Dec 02 19:54:24 crc kubenswrapper[4878]: I1202 19:54:24.154747 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/container-replicator/0.log" Dec 02 19:54:24 crc kubenswrapper[4878]: I1202 19:54:24.291473 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/container-server/0.log" Dec 02 19:54:24 crc kubenswrapper[4878]: I1202 19:54:24.335015 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/container-updater/0.log" Dec 02 19:54:24 crc kubenswrapper[4878]: I1202 19:54:24.338434 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/object-auditor/0.log" Dec 02 19:54:24 crc kubenswrapper[4878]: I1202 19:54:24.434866 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/object-expirer/0.log" Dec 02 19:54:24 crc kubenswrapper[4878]: I1202 19:54:24.567339 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/object-replicator/0.log" Dec 02 19:54:24 crc kubenswrapper[4878]: I1202 19:54:24.599045 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/object-updater/0.log" Dec 02 19:54:24 crc kubenswrapper[4878]: I1202 19:54:24.599388 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/object-server/0.log" Dec 02 19:54:24 crc kubenswrapper[4878]: I1202 19:54:24.676947 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/rsync/0.log" Dec 02 19:54:24 crc kubenswrapper[4878]: I1202 19:54:24.815887 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/swift-recon-cron/0.log" Dec 02 19:54:24 crc kubenswrapper[4878]: I1202 19:54:24.864677 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq_1d2ce001-5523-44c3-b911-47c3f44ffb77/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 19:54:25 crc kubenswrapper[4878]: I1202 19:54:25.066049 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt_58ff5ba7-481c-48d8-bf39-0eb6665f23d7/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 19:54:25 crc kubenswrapper[4878]: 
I1202 19:54:25.387950 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_18ccf713-5eef-4b4b-b1ea-9f3b34639edb/test-operator-logs-container/0.log" Dec 02 19:54:25 crc kubenswrapper[4878]: I1202 19:54:25.534565 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-46pqj_ad5d3ca4-7255-4bd3-9976-0834ea7b94ee/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 19:54:26 crc kubenswrapper[4878]: I1202 19:54:26.009677 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_427bac6f-5bf8-4f40-a0f6-fea0cede315f/memcached/0.log" Dec 02 19:54:26 crc kubenswrapper[4878]: I1202 19:54:26.225908 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_e5a4096d-c2be-4987-b0eb-fb47da8a9703/tempest-tests-tempest-tests-runner/0.log" Dec 02 19:54:52 crc kubenswrapper[4878]: I1202 19:54:52.518787 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf_36734ca4-54d4-4d24-a3ba-bbd876acbe2e/util/0.log" Dec 02 19:54:52 crc kubenswrapper[4878]: I1202 19:54:52.662948 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf_36734ca4-54d4-4d24-a3ba-bbd876acbe2e/util/0.log" Dec 02 19:54:52 crc kubenswrapper[4878]: I1202 19:54:52.731051 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf_36734ca4-54d4-4d24-a3ba-bbd876acbe2e/pull/0.log" Dec 02 19:54:52 crc kubenswrapper[4878]: I1202 19:54:52.735776 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf_36734ca4-54d4-4d24-a3ba-bbd876acbe2e/pull/0.log" Dec 
02 19:54:52 crc kubenswrapper[4878]: I1202 19:54:52.964992 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf_36734ca4-54d4-4d24-a3ba-bbd876acbe2e/util/0.log" Dec 02 19:54:53 crc kubenswrapper[4878]: I1202 19:54:53.036428 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf_36734ca4-54d4-4d24-a3ba-bbd876acbe2e/extract/0.log" Dec 02 19:54:53 crc kubenswrapper[4878]: I1202 19:54:53.056609 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf_36734ca4-54d4-4d24-a3ba-bbd876acbe2e/pull/0.log" Dec 02 19:54:53 crc kubenswrapper[4878]: I1202 19:54:53.199711 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-rgj5c_81eba8a0-84f6-4456-9484-dfa84dda8e10/kube-rbac-proxy/0.log" Dec 02 19:54:53 crc kubenswrapper[4878]: I1202 19:54:53.281890 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-rgj5c_81eba8a0-84f6-4456-9484-dfa84dda8e10/manager/0.log" Dec 02 19:54:53 crc kubenswrapper[4878]: I1202 19:54:53.346957 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-ds92l_0e516f0b-2b62-4d60-b1bd-07404ffcdea9/kube-rbac-proxy/0.log" Dec 02 19:54:53 crc kubenswrapper[4878]: I1202 19:54:53.469801 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-ds92l_0e516f0b-2b62-4d60-b1bd-07404ffcdea9/manager/0.log" Dec 02 19:54:53 crc kubenswrapper[4878]: I1202 19:54:53.536437 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-kwqck_77e2f2de-8d3f-437b-8f32-7b76ea70ccda/kube-rbac-proxy/0.log" Dec 02 19:54:53 crc kubenswrapper[4878]: I1202 19:54:53.570226 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-kwqck_77e2f2de-8d3f-437b-8f32-7b76ea70ccda/manager/0.log" Dec 02 19:54:53 crc kubenswrapper[4878]: I1202 19:54:53.742011 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 19:54:53 crc kubenswrapper[4878]: I1202 19:54:53.742421 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 19:54:53 crc kubenswrapper[4878]: I1202 19:54:53.742580 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 19:54:53 crc kubenswrapper[4878]: I1202 19:54:53.743747 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25"} pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 19:54:53 crc kubenswrapper[4878]: I1202 19:54:53.743918 4878 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" gracePeriod=600 Dec 02 19:54:53 crc kubenswrapper[4878]: I1202 19:54:53.757975 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-4gsxd_fc38a188-1850-41eb-a958-fd1fe01270c7/kube-rbac-proxy/0.log" Dec 02 19:54:53 crc kubenswrapper[4878]: I1202 19:54:53.811903 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-4gsxd_fc38a188-1850-41eb-a958-fd1fe01270c7/manager/0.log" Dec 02 19:54:53 crc kubenswrapper[4878]: E1202 19:54:53.862396 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:54:53 crc kubenswrapper[4878]: I1202 19:54:53.995505 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-fx64f_a128e6b1-604f-4d2d-9b31-1567ade115df/kube-rbac-proxy/0.log" Dec 02 19:54:54 crc kubenswrapper[4878]: I1202 19:54:54.060992 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-fx64f_a128e6b1-604f-4d2d-9b31-1567ade115df/manager/0.log" Dec 02 19:54:54 crc kubenswrapper[4878]: I1202 19:54:54.083969 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-vc67f_d7143587-e348-48b5-9164-a4d477b4a259/kube-rbac-proxy/0.log" Dec 02 19:54:54 crc kubenswrapper[4878]: I1202 19:54:54.123433 4878 generic.go:334] "Generic (PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" exitCode=0 Dec 02 19:54:54 crc kubenswrapper[4878]: I1202 19:54:54.123481 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25"} Dec 02 19:54:54 crc kubenswrapper[4878]: I1202 19:54:54.123514 4878 scope.go:117] "RemoveContainer" containerID="fb932a3ddd875b5f5a1d387638664611ad7b68fb4301c89bc7ca42c4dbff3eda" Dec 02 19:54:54 crc kubenswrapper[4878]: I1202 19:54:54.123940 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:54:54 crc kubenswrapper[4878]: E1202 19:54:54.124194 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:54:54 crc kubenswrapper[4878]: I1202 19:54:54.306870 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-vc67f_d7143587-e348-48b5-9164-a4d477b4a259/manager/0.log" Dec 02 19:54:54 crc kubenswrapper[4878]: I1202 19:54:54.328318 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-cmzpg_3028ad1d-cba5-4197-964f-6405fb1cc1c3/kube-rbac-proxy/0.log" Dec 02 19:54:54 crc kubenswrapper[4878]: I1202 19:54:54.555002 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-cmzpg_3028ad1d-cba5-4197-964f-6405fb1cc1c3/manager/0.log" Dec 02 19:54:54 crc kubenswrapper[4878]: I1202 19:54:54.566589 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-rqzqn_7dfef70a-0da9-4ad6-9fda-1cac674c9ddb/kube-rbac-proxy/0.log" Dec 02 19:54:54 crc kubenswrapper[4878]: I1202 19:54:54.635342 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-rqzqn_7dfef70a-0da9-4ad6-9fda-1cac674c9ddb/manager/0.log" Dec 02 19:54:54 crc kubenswrapper[4878]: I1202 19:54:54.852733 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-kmkzd_e51c89b2-70ae-4c1d-81b0-8aba6e211dd0/kube-rbac-proxy/0.log" Dec 02 19:54:54 crc kubenswrapper[4878]: I1202 19:54:54.890001 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-kmkzd_e51c89b2-70ae-4c1d-81b0-8aba6e211dd0/manager/0.log" Dec 02 19:54:55 crc kubenswrapper[4878]: I1202 19:54:55.011821 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-4c7fq_5cee25c6-1e94-400c-afd8-c1e75f31e619/manager/0.log" Dec 02 19:54:55 crc kubenswrapper[4878]: I1202 19:54:55.020561 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-4c7fq_5cee25c6-1e94-400c-afd8-c1e75f31e619/kube-rbac-proxy/0.log" Dec 02 19:54:55 crc kubenswrapper[4878]: I1202 19:54:55.135363 
4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-952z9_492dff42-bb87-4c30-8f81-02406308904c/kube-rbac-proxy/0.log" Dec 02 19:54:55 crc kubenswrapper[4878]: I1202 19:54:55.227606 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-952z9_492dff42-bb87-4c30-8f81-02406308904c/manager/0.log" Dec 02 19:54:55 crc kubenswrapper[4878]: I1202 19:54:55.345720 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-f58x6_ed599489-c1f6-440f-aaf3-339f424cbcdf/kube-rbac-proxy/0.log" Dec 02 19:54:55 crc kubenswrapper[4878]: I1202 19:54:55.416458 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-f58x6_ed599489-c1f6-440f-aaf3-339f424cbcdf/manager/0.log" Dec 02 19:54:55 crc kubenswrapper[4878]: I1202 19:54:55.477639 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-xzpbx_84d9833f-9760-47ea-ba43-f385b24a3e57/kube-rbac-proxy/0.log" Dec 02 19:54:55 crc kubenswrapper[4878]: I1202 19:54:55.644671 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-xzpbx_84d9833f-9760-47ea-ba43-f385b24a3e57/manager/0.log" Dec 02 19:54:55 crc kubenswrapper[4878]: I1202 19:54:55.719719 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-9stj8_46cfeb81-3e48-4f16-ae55-aabe49810afb/manager/0.log" Dec 02 19:54:55 crc kubenswrapper[4878]: I1202 19:54:55.751764 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-9stj8_46cfeb81-3e48-4f16-ae55-aabe49810afb/kube-rbac-proxy/0.log" Dec 02 19:54:55 crc 
kubenswrapper[4878]: I1202 19:54:55.843997 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk_095b515f-0144-4f7b-b3ab-9ca3440921db/kube-rbac-proxy/0.log" Dec 02 19:54:55 crc kubenswrapper[4878]: I1202 19:54:55.976436 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk_095b515f-0144-4f7b-b3ab-9ca3440921db/manager/0.log" Dec 02 19:54:56 crc kubenswrapper[4878]: I1202 19:54:56.510301 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-dfbvm_73939207-5e4c-4ef0-ba00-efa6b403e4c7/registry-server/0.log" Dec 02 19:54:56 crc kubenswrapper[4878]: I1202 19:54:56.624057 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6db445db9f-m4wkz_ce506db4-fc2d-45a6-b9c1-23d22cc536cc/operator/0.log" Dec 02 19:54:56 crc kubenswrapper[4878]: I1202 19:54:56.720527 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-w99w9_56f1dd36-8cc9-4026-8976-8816940217a4/kube-rbac-proxy/0.log" Dec 02 19:54:56 crc kubenswrapper[4878]: I1202 19:54:56.971623 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-w99w9_56f1dd36-8cc9-4026-8976-8816940217a4/manager/0.log" Dec 02 19:54:57 crc kubenswrapper[4878]: I1202 19:54:57.027747 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-6qznx_98571e6d-dae6-4e83-8d08-e44e8609188f/manager/0.log" Dec 02 19:54:57 crc kubenswrapper[4878]: I1202 19:54:57.035835 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-6qznx_98571e6d-dae6-4e83-8d08-e44e8609188f/kube-rbac-proxy/0.log" Dec 02 19:54:57 crc kubenswrapper[4878]: I1202 19:54:57.280436 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-24n6c_5691d664-31ad-44c8-ab51-11bcf8f9d4c2/operator/0.log" Dec 02 19:54:57 crc kubenswrapper[4878]: I1202 19:54:57.296980 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-bb64db99c-xtzmk_e18472ba-dc06-4e34-99a7-974d9af72c0a/manager/0.log" Dec 02 19:54:57 crc kubenswrapper[4878]: I1202 19:54:57.327401 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-j4dz5_5a683e44-012a-41ec-98db-36bcd5646959/kube-rbac-proxy/0.log" Dec 02 19:54:57 crc kubenswrapper[4878]: I1202 19:54:57.370535 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-j4dz5_5a683e44-012a-41ec-98db-36bcd5646959/manager/0.log" Dec 02 19:54:57 crc kubenswrapper[4878]: I1202 19:54:57.510950 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-546f978c55-mvlfw_44a363d4-d9a3-44df-8a8f-902cb14a0443/kube-rbac-proxy/0.log" Dec 02 19:54:57 crc kubenswrapper[4878]: I1202 19:54:57.598535 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-fhrfj_572eebdd-2dc7-4327-a339-4f92e3971d59/kube-rbac-proxy/0.log" Dec 02 19:54:57 crc kubenswrapper[4878]: I1202 19:54:57.753497 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-fhrfj_572eebdd-2dc7-4327-a339-4f92e3971d59/manager/0.log" Dec 02 19:54:57 crc kubenswrapper[4878]: I1202 19:54:57.815162 
4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-546f978c55-mvlfw_44a363d4-d9a3-44df-8a8f-902cb14a0443/manager/0.log" Dec 02 19:54:57 crc kubenswrapper[4878]: I1202 19:54:57.839038 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-hndnh_bfa1dc17-f042-46f8-8bc8-3f8d9e135073/kube-rbac-proxy/0.log" Dec 02 19:54:57 crc kubenswrapper[4878]: I1202 19:54:57.898955 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-hndnh_bfa1dc17-f042-46f8-8bc8-3f8d9e135073/manager/0.log" Dec 02 19:55:07 crc kubenswrapper[4878]: I1202 19:55:07.940957 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:55:07 crc kubenswrapper[4878]: E1202 19:55:07.941788 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:55:18 crc kubenswrapper[4878]: I1202 19:55:18.578534 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-b6fbw_a8e39ea4-f7e1-4df7-997b-2c3923f9ad6b/control-plane-machine-set-operator/0.log" Dec 02 19:55:18 crc kubenswrapper[4878]: I1202 19:55:18.782776 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ngj62_ea820fcf-7d34-4381-bafa-cbc53d3f7c86/kube-rbac-proxy/0.log" Dec 02 19:55:18 crc kubenswrapper[4878]: I1202 19:55:18.824951 4878 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ngj62_ea820fcf-7d34-4381-bafa-cbc53d3f7c86/machine-api-operator/0.log" Dec 02 19:55:22 crc kubenswrapper[4878]: I1202 19:55:22.938420 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:55:22 crc kubenswrapper[4878]: E1202 19:55:22.939465 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:55:31 crc kubenswrapper[4878]: I1202 19:55:31.671050 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-7mgrc_4311fcfc-1cf4-4bab-b946-40efef5b8c10/cert-manager-controller/0.log" Dec 02 19:55:31 crc kubenswrapper[4878]: I1202 19:55:31.859207 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-xhjz5_c5196963-9d92-4f0a-ab8c-47f4b86a685f/cert-manager-cainjector/0.log" Dec 02 19:55:31 crc kubenswrapper[4878]: I1202 19:55:31.902209 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-qfhjc_bb76342f-0435-4589-826f-3a7cee8cc419/cert-manager-webhook/0.log" Dec 02 19:55:33 crc kubenswrapper[4878]: I1202 19:55:33.939206 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:55:33 crc kubenswrapper[4878]: E1202 19:55:33.941867 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:55:45 crc kubenswrapper[4878]: I1202 19:55:45.811213 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-88jcq_91633707-7b72-4b18-a516-a6b327dc44f1/nmstate-console-plugin/0.log" Dec 02 19:55:46 crc kubenswrapper[4878]: I1202 19:55:46.190093 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-mzs4g_788e4f72-26a7-455f-b805-32b1b519726c/kube-rbac-proxy/0.log" Dec 02 19:55:46 crc kubenswrapper[4878]: I1202 19:55:46.194972 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-v45r2_5a7dc067-d2fd-4145-b79c-33fac3675cdd/nmstate-handler/0.log" Dec 02 19:55:46 crc kubenswrapper[4878]: I1202 19:55:46.288680 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-mzs4g_788e4f72-26a7-455f-b805-32b1b519726c/nmstate-metrics/0.log" Dec 02 19:55:46 crc kubenswrapper[4878]: I1202 19:55:46.426779 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-67sqq_e9a04710-7320-4cde-9863-e05e65f54671/nmstate-operator/0.log" Dec 02 19:55:46 crc kubenswrapper[4878]: I1202 19:55:46.477135 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-h4wxh_b86218c7-3d62-4631-8f95-e70b1f304615/nmstate-webhook/0.log" Dec 02 19:55:46 crc kubenswrapper[4878]: I1202 19:55:46.938319 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:55:46 crc kubenswrapper[4878]: E1202 19:55:46.938936 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:55:58 crc kubenswrapper[4878]: I1202 19:55:58.941961 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:55:58 crc kubenswrapper[4878]: E1202 19:55:58.943011 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:56:00 crc kubenswrapper[4878]: I1202 19:56:00.275863 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-846f878689-bhh7m_6ace3da2-70e9-4d80-a8ad-5a8e1bb062df/kube-rbac-proxy/0.log" Dec 02 19:56:00 crc kubenswrapper[4878]: I1202 19:56:00.406158 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-846f878689-bhh7m_6ace3da2-70e9-4d80-a8ad-5a8e1bb062df/manager/0.log" Dec 02 19:56:11 crc kubenswrapper[4878]: I1202 19:56:11.938963 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:56:11 crc kubenswrapper[4878]: E1202 19:56:11.940177 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:56:16 crc kubenswrapper[4878]: I1202 19:56:16.467323 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-ff9846bd-wpfmg_056447cf-55f2-4ddd-bf36-7f0f637f5ca5/cluster-logging-operator/0.log" Dec 02 19:56:16 crc kubenswrapper[4878]: I1202 19:56:16.667217 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-jjcjt_089919f9-be76-4956-a7d3-92fa66aa19ef/collector/0.log" Dec 02 19:56:16 crc kubenswrapper[4878]: I1202 19:56:16.695702 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a/loki-compactor/0.log" Dec 02 19:56:17 crc kubenswrapper[4878]: I1202 19:56:17.137099 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-76cc67bf56-tl222_7f86bb98-b2df-4776-97a4-7a45b69972b8/loki-distributor/0.log" Dec 02 19:56:17 crc kubenswrapper[4878]: I1202 19:56:17.180478 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-85bc84b7b8-89mkg_6bb468bd-bb35-4a4e-b41c-ed2a8f964d77/gateway/0.log" Dec 02 19:56:17 crc kubenswrapper[4878]: I1202 19:56:17.265708 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-85bc84b7b8-89mkg_6bb468bd-bb35-4a4e-b41c-ed2a8f964d77/opa/0.log" Dec 02 19:56:17 crc kubenswrapper[4878]: I1202 19:56:17.372646 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-85bc84b7b8-zjfmm_92b9d963-d156-45ca-89fd-3f992d10d24e/opa/0.log" Dec 02 19:56:17 crc kubenswrapper[4878]: I1202 19:56:17.390666 4878 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-logging_logging-loki-gateway-85bc84b7b8-zjfmm_92b9d963-d156-45ca-89fd-3f992d10d24e/gateway/0.log" Dec 02 19:56:17 crc kubenswrapper[4878]: I1202 19:56:17.557470 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_ef6cf23f-200e-43b7-81ea-b13382391ad0/loki-index-gateway/0.log" Dec 02 19:56:17 crc kubenswrapper[4878]: I1202 19:56:17.670139 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_1a9f06ce-f976-42a6-9393-55ac1c7ca894/loki-ingester/0.log" Dec 02 19:56:17 crc kubenswrapper[4878]: I1202 19:56:17.800031 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5895d59bb8-2d5dl_b1de2d7e-c37c-4464-bd35-337650bd62bf/loki-querier/0.log" Dec 02 19:56:17 crc kubenswrapper[4878]: I1202 19:56:17.875499 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-84558f7c9f-xwgzp_316a18ce-1717-4e23-8750-17b4ec2e553c/loki-query-frontend/0.log" Dec 02 19:56:25 crc kubenswrapper[4878]: I1202 19:56:25.938217 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:56:25 crc kubenswrapper[4878]: E1202 19:56:25.939341 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:56:32 crc kubenswrapper[4878]: I1202 19:56:32.675830 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-jmg7l_f49a3efc-d73c-4b26-b668-8abf574eb6a9/kube-rbac-proxy/0.log" Dec 02 
19:56:32 crc kubenswrapper[4878]: I1202 19:56:32.894687 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-jmg7l_f49a3efc-d73c-4b26-b668-8abf574eb6a9/controller/0.log" Dec 02 19:56:32 crc kubenswrapper[4878]: I1202 19:56:32.985711 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-frr-files/0.log" Dec 02 19:56:33 crc kubenswrapper[4878]: I1202 19:56:33.315365 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-reloader/0.log" Dec 02 19:56:33 crc kubenswrapper[4878]: I1202 19:56:33.340834 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-frr-files/0.log" Dec 02 19:56:33 crc kubenswrapper[4878]: I1202 19:56:33.354594 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-metrics/0.log" Dec 02 19:56:33 crc kubenswrapper[4878]: I1202 19:56:33.408615 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-reloader/0.log" Dec 02 19:56:33 crc kubenswrapper[4878]: I1202 19:56:33.597019 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-reloader/0.log" Dec 02 19:56:33 crc kubenswrapper[4878]: I1202 19:56:33.599297 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-frr-files/0.log" Dec 02 19:56:33 crc kubenswrapper[4878]: I1202 19:56:33.616741 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-metrics/0.log" Dec 02 19:56:33 crc kubenswrapper[4878]: I1202 19:56:33.636664 4878 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-metrics/0.log" Dec 02 19:56:33 crc kubenswrapper[4878]: I1202 19:56:33.871358 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-reloader/0.log" Dec 02 19:56:33 crc kubenswrapper[4878]: I1202 19:56:33.871864 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-frr-files/0.log" Dec 02 19:56:33 crc kubenswrapper[4878]: I1202 19:56:33.882877 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-metrics/0.log" Dec 02 19:56:33 crc kubenswrapper[4878]: I1202 19:56:33.907638 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/controller/0.log" Dec 02 19:56:34 crc kubenswrapper[4878]: I1202 19:56:34.037598 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/frr-metrics/0.log" Dec 02 19:56:34 crc kubenswrapper[4878]: I1202 19:56:34.073027 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/kube-rbac-proxy/0.log" Dec 02 19:56:34 crc kubenswrapper[4878]: I1202 19:56:34.134387 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/kube-rbac-proxy-frr/0.log" Dec 02 19:56:34 crc kubenswrapper[4878]: I1202 19:56:34.272420 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/reloader/0.log" Dec 02 19:56:34 crc kubenswrapper[4878]: I1202 19:56:34.372936 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-p7f4t_bb4be508-3a9c-48d3-a995-124bf91a4128/frr-k8s-webhook-server/0.log" Dec 02 19:56:34 crc kubenswrapper[4878]: I1202 19:56:34.541679 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b867f79f6-k2zm6_cda42159-8d1b-460a-b92e-a02db29c88e9/manager/0.log" Dec 02 19:56:34 crc kubenswrapper[4878]: I1202 19:56:34.775026 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-fd8d8f689-k4m6l_239083bd-7b88-46e8-b5e6-b1fdb9abc032/webhook-server/0.log" Dec 02 19:56:34 crc kubenswrapper[4878]: I1202 19:56:34.874182 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4q4nt_73bda1d1-d063-44f4-8b13-20af22c61540/kube-rbac-proxy/0.log" Dec 02 19:56:36 crc kubenswrapper[4878]: I1202 19:56:36.053283 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4q4nt_73bda1d1-d063-44f4-8b13-20af22c61540/speaker/0.log" Dec 02 19:56:36 crc kubenswrapper[4878]: I1202 19:56:36.282251 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/frr/0.log" Dec 02 19:56:40 crc kubenswrapper[4878]: I1202 19:56:40.945131 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:56:40 crc kubenswrapper[4878]: E1202 19:56:40.946057 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:56:45 crc kubenswrapper[4878]: I1202 19:56:45.408587 
4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lrrh2"] Dec 02 19:56:45 crc kubenswrapper[4878]: E1202 19:56:45.409534 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="173fca40-e2b4-41cc-a4b9-3e0b2afa1a78" containerName="container-00" Dec 02 19:56:45 crc kubenswrapper[4878]: I1202 19:56:45.409547 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="173fca40-e2b4-41cc-a4b9-3e0b2afa1a78" containerName="container-00" Dec 02 19:56:45 crc kubenswrapper[4878]: I1202 19:56:45.409845 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="173fca40-e2b4-41cc-a4b9-3e0b2afa1a78" containerName="container-00" Dec 02 19:56:45 crc kubenswrapper[4878]: I1202 19:56:45.411593 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lrrh2" Dec 02 19:56:45 crc kubenswrapper[4878]: I1202 19:56:45.424763 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lrrh2"] Dec 02 19:56:45 crc kubenswrapper[4878]: I1202 19:56:45.579106 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0df355b1-ad96-47dc-b237-33b01ceb5ad3-utilities\") pod \"certified-operators-lrrh2\" (UID: \"0df355b1-ad96-47dc-b237-33b01ceb5ad3\") " pod="openshift-marketplace/certified-operators-lrrh2" Dec 02 19:56:45 crc kubenswrapper[4878]: I1202 19:56:45.579510 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0df355b1-ad96-47dc-b237-33b01ceb5ad3-catalog-content\") pod \"certified-operators-lrrh2\" (UID: \"0df355b1-ad96-47dc-b237-33b01ceb5ad3\") " pod="openshift-marketplace/certified-operators-lrrh2" Dec 02 19:56:45 crc kubenswrapper[4878]: I1202 19:56:45.579668 4878 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8g42\" (UniqueName: \"kubernetes.io/projected/0df355b1-ad96-47dc-b237-33b01ceb5ad3-kube-api-access-b8g42\") pod \"certified-operators-lrrh2\" (UID: \"0df355b1-ad96-47dc-b237-33b01ceb5ad3\") " pod="openshift-marketplace/certified-operators-lrrh2" Dec 02 19:56:45 crc kubenswrapper[4878]: I1202 19:56:45.681864 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8g42\" (UniqueName: \"kubernetes.io/projected/0df355b1-ad96-47dc-b237-33b01ceb5ad3-kube-api-access-b8g42\") pod \"certified-operators-lrrh2\" (UID: \"0df355b1-ad96-47dc-b237-33b01ceb5ad3\") " pod="openshift-marketplace/certified-operators-lrrh2" Dec 02 19:56:45 crc kubenswrapper[4878]: I1202 19:56:45.682096 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0df355b1-ad96-47dc-b237-33b01ceb5ad3-utilities\") pod \"certified-operators-lrrh2\" (UID: \"0df355b1-ad96-47dc-b237-33b01ceb5ad3\") " pod="openshift-marketplace/certified-operators-lrrh2" Dec 02 19:56:45 crc kubenswrapper[4878]: I1202 19:56:45.682132 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0df355b1-ad96-47dc-b237-33b01ceb5ad3-catalog-content\") pod \"certified-operators-lrrh2\" (UID: \"0df355b1-ad96-47dc-b237-33b01ceb5ad3\") " pod="openshift-marketplace/certified-operators-lrrh2" Dec 02 19:56:45 crc kubenswrapper[4878]: I1202 19:56:45.682910 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0df355b1-ad96-47dc-b237-33b01ceb5ad3-catalog-content\") pod \"certified-operators-lrrh2\" (UID: \"0df355b1-ad96-47dc-b237-33b01ceb5ad3\") " pod="openshift-marketplace/certified-operators-lrrh2" Dec 02 19:56:45 crc kubenswrapper[4878]: I1202 19:56:45.682934 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0df355b1-ad96-47dc-b237-33b01ceb5ad3-utilities\") pod \"certified-operators-lrrh2\" (UID: \"0df355b1-ad96-47dc-b237-33b01ceb5ad3\") " pod="openshift-marketplace/certified-operators-lrrh2" Dec 02 19:56:45 crc kubenswrapper[4878]: I1202 19:56:45.720804 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8g42\" (UniqueName: \"kubernetes.io/projected/0df355b1-ad96-47dc-b237-33b01ceb5ad3-kube-api-access-b8g42\") pod \"certified-operators-lrrh2\" (UID: \"0df355b1-ad96-47dc-b237-33b01ceb5ad3\") " pod="openshift-marketplace/certified-operators-lrrh2" Dec 02 19:56:45 crc kubenswrapper[4878]: I1202 19:56:45.781229 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lrrh2" Dec 02 19:56:46 crc kubenswrapper[4878]: I1202 19:56:46.586509 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lrrh2"] Dec 02 19:56:46 crc kubenswrapper[4878]: W1202 19:56:46.596374 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0df355b1_ad96_47dc_b237_33b01ceb5ad3.slice/crio-40d8da8032a2828e56f366459544fce4f907fe951284454c8dac1538dda19c44 WatchSource:0}: Error finding container 40d8da8032a2828e56f366459544fce4f907fe951284454c8dac1538dda19c44: Status 404 returned error can't find the container with id 40d8da8032a2828e56f366459544fce4f907fe951284454c8dac1538dda19c44 Dec 02 19:56:47 crc kubenswrapper[4878]: I1202 19:56:47.418983 4878 generic.go:334] "Generic (PLEG): container finished" podID="0df355b1-ad96-47dc-b237-33b01ceb5ad3" containerID="80fd3f1db6682c796b839bbec813624abea3d35c568144e0b21e09e7a2b48517" exitCode=0 Dec 02 19:56:47 crc kubenswrapper[4878]: I1202 19:56:47.419079 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-lrrh2" event={"ID":"0df355b1-ad96-47dc-b237-33b01ceb5ad3","Type":"ContainerDied","Data":"80fd3f1db6682c796b839bbec813624abea3d35c568144e0b21e09e7a2b48517"} Dec 02 19:56:47 crc kubenswrapper[4878]: I1202 19:56:47.419523 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lrrh2" event={"ID":"0df355b1-ad96-47dc-b237-33b01ceb5ad3","Type":"ContainerStarted","Data":"40d8da8032a2828e56f366459544fce4f907fe951284454c8dac1538dda19c44"} Dec 02 19:56:47 crc kubenswrapper[4878]: I1202 19:56:47.422887 4878 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 19:56:47 crc kubenswrapper[4878]: I1202 19:56:47.972277 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mc2ph"] Dec 02 19:56:47 crc kubenswrapper[4878]: I1202 19:56:47.974933 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mc2ph" Dec 02 19:56:48 crc kubenswrapper[4878]: I1202 19:56:48.014573 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mc2ph"] Dec 02 19:56:48 crc kubenswrapper[4878]: I1202 19:56:48.138818 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgrqm\" (UniqueName: \"kubernetes.io/projected/60001ff0-557c-4234-b34d-b6850dc9ba70-kube-api-access-pgrqm\") pod \"community-operators-mc2ph\" (UID: \"60001ff0-557c-4234-b34d-b6850dc9ba70\") " pod="openshift-marketplace/community-operators-mc2ph" Dec 02 19:56:48 crc kubenswrapper[4878]: I1202 19:56:48.138872 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60001ff0-557c-4234-b34d-b6850dc9ba70-catalog-content\") pod \"community-operators-mc2ph\" (UID: 
\"60001ff0-557c-4234-b34d-b6850dc9ba70\") " pod="openshift-marketplace/community-operators-mc2ph" Dec 02 19:56:48 crc kubenswrapper[4878]: I1202 19:56:48.138980 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60001ff0-557c-4234-b34d-b6850dc9ba70-utilities\") pod \"community-operators-mc2ph\" (UID: \"60001ff0-557c-4234-b34d-b6850dc9ba70\") " pod="openshift-marketplace/community-operators-mc2ph" Dec 02 19:56:48 crc kubenswrapper[4878]: I1202 19:56:48.241771 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60001ff0-557c-4234-b34d-b6850dc9ba70-utilities\") pod \"community-operators-mc2ph\" (UID: \"60001ff0-557c-4234-b34d-b6850dc9ba70\") " pod="openshift-marketplace/community-operators-mc2ph" Dec 02 19:56:48 crc kubenswrapper[4878]: I1202 19:56:48.242147 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgrqm\" (UniqueName: \"kubernetes.io/projected/60001ff0-557c-4234-b34d-b6850dc9ba70-kube-api-access-pgrqm\") pod \"community-operators-mc2ph\" (UID: \"60001ff0-557c-4234-b34d-b6850dc9ba70\") " pod="openshift-marketplace/community-operators-mc2ph" Dec 02 19:56:48 crc kubenswrapper[4878]: I1202 19:56:48.242206 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60001ff0-557c-4234-b34d-b6850dc9ba70-catalog-content\") pod \"community-operators-mc2ph\" (UID: \"60001ff0-557c-4234-b34d-b6850dc9ba70\") " pod="openshift-marketplace/community-operators-mc2ph" Dec 02 19:56:48 crc kubenswrapper[4878]: I1202 19:56:48.242821 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60001ff0-557c-4234-b34d-b6850dc9ba70-catalog-content\") pod \"community-operators-mc2ph\" (UID: 
\"60001ff0-557c-4234-b34d-b6850dc9ba70\") " pod="openshift-marketplace/community-operators-mc2ph" Dec 02 19:56:48 crc kubenswrapper[4878]: I1202 19:56:48.243096 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60001ff0-557c-4234-b34d-b6850dc9ba70-utilities\") pod \"community-operators-mc2ph\" (UID: \"60001ff0-557c-4234-b34d-b6850dc9ba70\") " pod="openshift-marketplace/community-operators-mc2ph" Dec 02 19:56:48 crc kubenswrapper[4878]: I1202 19:56:48.259832 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgrqm\" (UniqueName: \"kubernetes.io/projected/60001ff0-557c-4234-b34d-b6850dc9ba70-kube-api-access-pgrqm\") pod \"community-operators-mc2ph\" (UID: \"60001ff0-557c-4234-b34d-b6850dc9ba70\") " pod="openshift-marketplace/community-operators-mc2ph" Dec 02 19:56:48 crc kubenswrapper[4878]: I1202 19:56:48.307689 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mc2ph" Dec 02 19:56:48 crc kubenswrapper[4878]: I1202 19:56:48.952102 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mc2ph"] Dec 02 19:56:49 crc kubenswrapper[4878]: I1202 19:56:49.348751 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn_bf6b244f-0cc7-4f5d-9522-9fe4e65897a6/util/0.log" Dec 02 19:56:49 crc kubenswrapper[4878]: I1202 19:56:49.447855 4878 generic.go:334] "Generic (PLEG): container finished" podID="0df355b1-ad96-47dc-b237-33b01ceb5ad3" containerID="bb0a13f6ff3ac676d34507b2ea0585b7e7e9f4a28a5f69c3f677be8c33bfaeae" exitCode=0 Dec 02 19:56:49 crc kubenswrapper[4878]: I1202 19:56:49.447939 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lrrh2" 
event={"ID":"0df355b1-ad96-47dc-b237-33b01ceb5ad3","Type":"ContainerDied","Data":"bb0a13f6ff3ac676d34507b2ea0585b7e7e9f4a28a5f69c3f677be8c33bfaeae"} Dec 02 19:56:49 crc kubenswrapper[4878]: I1202 19:56:49.450840 4878 generic.go:334] "Generic (PLEG): container finished" podID="60001ff0-557c-4234-b34d-b6850dc9ba70" containerID="fb45017562961b28ce60a02ee92a4de3baf38dc0c48307ad42d768e2f5e447c7" exitCode=0 Dec 02 19:56:49 crc kubenswrapper[4878]: I1202 19:56:49.450916 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc2ph" event={"ID":"60001ff0-557c-4234-b34d-b6850dc9ba70","Type":"ContainerDied","Data":"fb45017562961b28ce60a02ee92a4de3baf38dc0c48307ad42d768e2f5e447c7"} Dec 02 19:56:49 crc kubenswrapper[4878]: I1202 19:56:49.450976 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc2ph" event={"ID":"60001ff0-557c-4234-b34d-b6850dc9ba70","Type":"ContainerStarted","Data":"b645df1c36be3353f6f1f3b44e8d04082f09d4a311d77e7b861a9875b897db20"} Dec 02 19:56:49 crc kubenswrapper[4878]: I1202 19:56:49.630065 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn_bf6b244f-0cc7-4f5d-9522-9fe4e65897a6/util/0.log" Dec 02 19:56:49 crc kubenswrapper[4878]: I1202 19:56:49.725101 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn_bf6b244f-0cc7-4f5d-9522-9fe4e65897a6/pull/0.log" Dec 02 19:56:49 crc kubenswrapper[4878]: I1202 19:56:49.727775 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn_bf6b244f-0cc7-4f5d-9522-9fe4e65897a6/pull/0.log" Dec 02 19:56:49 crc kubenswrapper[4878]: I1202 19:56:49.856303 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn_bf6b244f-0cc7-4f5d-9522-9fe4e65897a6/pull/0.log" Dec 02 19:56:49 crc kubenswrapper[4878]: I1202 19:56:49.914141 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn_bf6b244f-0cc7-4f5d-9522-9fe4e65897a6/extract/0.log" Dec 02 19:56:49 crc kubenswrapper[4878]: I1202 19:56:49.976601 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn_bf6b244f-0cc7-4f5d-9522-9fe4e65897a6/util/0.log" Dec 02 19:56:50 crc kubenswrapper[4878]: I1202 19:56:50.088671 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs_2a985616-194f-418e-ba7c-ee6fe105df8c/util/0.log" Dec 02 19:56:50 crc kubenswrapper[4878]: I1202 19:56:50.301386 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs_2a985616-194f-418e-ba7c-ee6fe105df8c/pull/0.log" Dec 02 19:56:50 crc kubenswrapper[4878]: I1202 19:56:50.305532 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs_2a985616-194f-418e-ba7c-ee6fe105df8c/pull/0.log" Dec 02 19:56:50 crc kubenswrapper[4878]: I1202 19:56:50.461749 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lrrh2" event={"ID":"0df355b1-ad96-47dc-b237-33b01ceb5ad3","Type":"ContainerStarted","Data":"b324dc4c02278ad1f349cf6aa86a304c6bdc36e14f6c5b8951e8919e5960bed0"} Dec 02 19:56:50 crc kubenswrapper[4878]: I1202 19:56:50.463476 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc2ph" 
event={"ID":"60001ff0-557c-4234-b34d-b6850dc9ba70","Type":"ContainerStarted","Data":"27dc50ab1bff833a47504007de631dd52c7f3d1ad8bc373d5e0fd149d4be8a51"} Dec 02 19:56:50 crc kubenswrapper[4878]: I1202 19:56:50.484517 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs_2a985616-194f-418e-ba7c-ee6fe105df8c/util/0.log" Dec 02 19:56:50 crc kubenswrapper[4878]: I1202 19:56:50.484635 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs_2a985616-194f-418e-ba7c-ee6fe105df8c/util/0.log" Dec 02 19:56:50 crc kubenswrapper[4878]: I1202 19:56:50.493892 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lrrh2" podStartSLOduration=3.037149845 podStartE2EDuration="5.493870691s" podCreationTimestamp="2025-12-02 19:56:45 +0000 UTC" firstStartedPulling="2025-12-02 19:56:47.421715055 +0000 UTC m=+6117.111333936" lastFinishedPulling="2025-12-02 19:56:49.878435901 +0000 UTC m=+6119.568054782" observedRunningTime="2025-12-02 19:56:50.487192812 +0000 UTC m=+6120.176811693" watchObservedRunningTime="2025-12-02 19:56:50.493870691 +0000 UTC m=+6120.183489572" Dec 02 19:56:50 crc kubenswrapper[4878]: I1202 19:56:50.543428 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs_2a985616-194f-418e-ba7c-ee6fe105df8c/extract/0.log" Dec 02 19:56:50 crc kubenswrapper[4878]: I1202 19:56:50.651587 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs_2a985616-194f-418e-ba7c-ee6fe105df8c/pull/0.log" Dec 02 19:56:50 crc kubenswrapper[4878]: I1202 19:56:50.834772 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb_28d45a6b-6f64-487d-bac3-60ce7a1321f7/util/0.log" Dec 02 19:56:51 crc kubenswrapper[4878]: I1202 19:56:51.026005 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb_28d45a6b-6f64-487d-bac3-60ce7a1321f7/util/0.log" Dec 02 19:56:51 crc kubenswrapper[4878]: I1202 19:56:51.085153 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb_28d45a6b-6f64-487d-bac3-60ce7a1321f7/pull/0.log" Dec 02 19:56:51 crc kubenswrapper[4878]: I1202 19:56:51.088842 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb_28d45a6b-6f64-487d-bac3-60ce7a1321f7/pull/0.log" Dec 02 19:56:51 crc kubenswrapper[4878]: I1202 19:56:51.272276 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb_28d45a6b-6f64-487d-bac3-60ce7a1321f7/extract/0.log" Dec 02 19:56:51 crc kubenswrapper[4878]: I1202 19:56:51.310460 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb_28d45a6b-6f64-487d-bac3-60ce7a1321f7/util/0.log" Dec 02 19:56:51 crc kubenswrapper[4878]: I1202 19:56:51.339912 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb_28d45a6b-6f64-487d-bac3-60ce7a1321f7/pull/0.log" Dec 02 19:56:51 crc kubenswrapper[4878]: I1202 19:56:51.455062 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r_abd35f96-61cf-48b3-b66d-c3d54414bb74/util/0.log" Dec 02 
19:56:51 crc kubenswrapper[4878]: I1202 19:56:51.480100 4878 generic.go:334] "Generic (PLEG): container finished" podID="60001ff0-557c-4234-b34d-b6850dc9ba70" containerID="27dc50ab1bff833a47504007de631dd52c7f3d1ad8bc373d5e0fd149d4be8a51" exitCode=0 Dec 02 19:56:51 crc kubenswrapper[4878]: I1202 19:56:51.480167 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc2ph" event={"ID":"60001ff0-557c-4234-b34d-b6850dc9ba70","Type":"ContainerDied","Data":"27dc50ab1bff833a47504007de631dd52c7f3d1ad8bc373d5e0fd149d4be8a51"} Dec 02 19:56:51 crc kubenswrapper[4878]: I1202 19:56:51.924745 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r_abd35f96-61cf-48b3-b66d-c3d54414bb74/util/0.log" Dec 02 19:56:51 crc kubenswrapper[4878]: I1202 19:56:51.970440 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r_abd35f96-61cf-48b3-b66d-c3d54414bb74/pull/0.log" Dec 02 19:56:52 crc kubenswrapper[4878]: I1202 19:56:52.012990 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r_abd35f96-61cf-48b3-b66d-c3d54414bb74/pull/0.log" Dec 02 19:56:52 crc kubenswrapper[4878]: I1202 19:56:52.217763 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r_abd35f96-61cf-48b3-b66d-c3d54414bb74/extract/0.log" Dec 02 19:56:52 crc kubenswrapper[4878]: I1202 19:56:52.249219 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r_abd35f96-61cf-48b3-b66d-c3d54414bb74/util/0.log" Dec 02 19:56:52 crc kubenswrapper[4878]: I1202 19:56:52.282099 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r_abd35f96-61cf-48b3-b66d-c3d54414bb74/pull/0.log" Dec 02 19:56:52 crc kubenswrapper[4878]: I1202 19:56:52.403901 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt_caa9ae90-89de-4467-a6e1-1043d45ed9e8/util/0.log" Dec 02 19:56:52 crc kubenswrapper[4878]: I1202 19:56:52.492201 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc2ph" event={"ID":"60001ff0-557c-4234-b34d-b6850dc9ba70","Type":"ContainerStarted","Data":"df94e8307ad94ffa8f523df4f9fbc2895a84814c402598f11f347baa37914c78"} Dec 02 19:56:52 crc kubenswrapper[4878]: I1202 19:56:52.514014 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mc2ph" podStartSLOduration=2.935311914 podStartE2EDuration="5.513998344s" podCreationTimestamp="2025-12-02 19:56:47 +0000 UTC" firstStartedPulling="2025-12-02 19:56:49.452486452 +0000 UTC m=+6119.142105343" lastFinishedPulling="2025-12-02 19:56:52.031172892 +0000 UTC m=+6121.720791773" observedRunningTime="2025-12-02 19:56:52.507193593 +0000 UTC m=+6122.196812484" watchObservedRunningTime="2025-12-02 19:56:52.513998344 +0000 UTC m=+6122.203617225" Dec 02 19:56:52 crc kubenswrapper[4878]: I1202 19:56:52.649429 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt_caa9ae90-89de-4467-a6e1-1043d45ed9e8/pull/0.log" Dec 02 19:56:52 crc kubenswrapper[4878]: I1202 19:56:52.653480 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt_caa9ae90-89de-4467-a6e1-1043d45ed9e8/pull/0.log" Dec 02 19:56:52 crc kubenswrapper[4878]: I1202 19:56:52.688805 4878 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt_caa9ae90-89de-4467-a6e1-1043d45ed9e8/util/0.log" Dec 02 19:56:52 crc kubenswrapper[4878]: I1202 19:56:52.964184 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt_caa9ae90-89de-4467-a6e1-1043d45ed9e8/pull/0.log" Dec 02 19:56:52 crc kubenswrapper[4878]: I1202 19:56:52.992865 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt_caa9ae90-89de-4467-a6e1-1043d45ed9e8/util/0.log" Dec 02 19:56:53 crc kubenswrapper[4878]: I1202 19:56:53.009370 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt_caa9ae90-89de-4467-a6e1-1043d45ed9e8/extract/0.log" Dec 02 19:56:53 crc kubenswrapper[4878]: I1202 19:56:53.168424 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lrrh2_0df355b1-ad96-47dc-b237-33b01ceb5ad3/extract-utilities/0.log" Dec 02 19:56:53 crc kubenswrapper[4878]: I1202 19:56:53.362692 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lrrh2_0df355b1-ad96-47dc-b237-33b01ceb5ad3/extract-content/0.log" Dec 02 19:56:53 crc kubenswrapper[4878]: I1202 19:56:53.402118 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lrrh2_0df355b1-ad96-47dc-b237-33b01ceb5ad3/extract-utilities/0.log" Dec 02 19:56:53 crc kubenswrapper[4878]: I1202 19:56:53.423791 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lrrh2_0df355b1-ad96-47dc-b237-33b01ceb5ad3/extract-content/0.log" Dec 02 19:56:53 crc kubenswrapper[4878]: I1202 19:56:53.638124 4878 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-marketplace_certified-operators-lrrh2_0df355b1-ad96-47dc-b237-33b01ceb5ad3/extract-content/0.log" Dec 02 19:56:53 crc kubenswrapper[4878]: I1202 19:56:53.683556 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lrrh2_0df355b1-ad96-47dc-b237-33b01ceb5ad3/extract-utilities/0.log" Dec 02 19:56:53 crc kubenswrapper[4878]: I1202 19:56:53.723282 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lrrh2_0df355b1-ad96-47dc-b237-33b01ceb5ad3/registry-server/0.log" Dec 02 19:56:53 crc kubenswrapper[4878]: I1202 19:56:53.843394 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxh8k_169b29b3-0fc5-499b-b05c-83469da6c269/extract-utilities/0.log" Dec 02 19:56:53 crc kubenswrapper[4878]: I1202 19:56:53.938367 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:56:53 crc kubenswrapper[4878]: E1202 19:56:53.938660 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:56:53 crc kubenswrapper[4878]: I1202 19:56:53.982479 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxh8k_169b29b3-0fc5-499b-b05c-83469da6c269/extract-content/0.log" Dec 02 19:56:54 crc kubenswrapper[4878]: I1202 19:56:54.013896 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxh8k_169b29b3-0fc5-499b-b05c-83469da6c269/extract-content/0.log" Dec 02 
19:56:54 crc kubenswrapper[4878]: I1202 19:56:54.016113 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxh8k_169b29b3-0fc5-499b-b05c-83469da6c269/extract-utilities/0.log" Dec 02 19:56:54 crc kubenswrapper[4878]: I1202 19:56:54.228805 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxh8k_169b29b3-0fc5-499b-b05c-83469da6c269/extract-content/0.log" Dec 02 19:56:54 crc kubenswrapper[4878]: I1202 19:56:54.300766 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-46kr4_2b1d265e-ec12-4052-83ec-5db3bd68b034/extract-utilities/0.log" Dec 02 19:56:54 crc kubenswrapper[4878]: I1202 19:56:54.310371 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxh8k_169b29b3-0fc5-499b-b05c-83469da6c269/extract-utilities/0.log" Dec 02 19:56:54 crc kubenswrapper[4878]: I1202 19:56:54.502844 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-46kr4_2b1d265e-ec12-4052-83ec-5db3bd68b034/extract-utilities/0.log" Dec 02 19:56:54 crc kubenswrapper[4878]: I1202 19:56:54.591886 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-46kr4_2b1d265e-ec12-4052-83ec-5db3bd68b034/extract-content/0.log" Dec 02 19:56:54 crc kubenswrapper[4878]: I1202 19:56:54.634161 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-46kr4_2b1d265e-ec12-4052-83ec-5db3bd68b034/extract-content/0.log" Dec 02 19:56:54 crc kubenswrapper[4878]: I1202 19:56:54.900469 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-46kr4_2b1d265e-ec12-4052-83ec-5db3bd68b034/extract-content/0.log" Dec 02 19:56:54 crc kubenswrapper[4878]: I1202 19:56:54.902063 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-46kr4_2b1d265e-ec12-4052-83ec-5db3bd68b034/extract-utilities/0.log" Dec 02 19:56:55 crc kubenswrapper[4878]: I1202 19:56:55.115079 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mc2ph_60001ff0-557c-4234-b34d-b6850dc9ba70/extract-utilities/0.log" Dec 02 19:56:55 crc kubenswrapper[4878]: I1202 19:56:55.335306 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mc2ph_60001ff0-557c-4234-b34d-b6850dc9ba70/extract-utilities/0.log" Dec 02 19:56:55 crc kubenswrapper[4878]: I1202 19:56:55.353002 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mc2ph_60001ff0-557c-4234-b34d-b6850dc9ba70/extract-content/0.log" Dec 02 19:56:55 crc kubenswrapper[4878]: I1202 19:56:55.426647 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mc2ph_60001ff0-557c-4234-b34d-b6850dc9ba70/extract-content/0.log" Dec 02 19:56:55 crc kubenswrapper[4878]: I1202 19:56:55.576793 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxh8k_169b29b3-0fc5-499b-b05c-83469da6c269/registry-server/0.log" Dec 02 19:56:55 crc kubenswrapper[4878]: I1202 19:56:55.608757 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mc2ph_60001ff0-557c-4234-b34d-b6850dc9ba70/extract-utilities/0.log" Dec 02 19:56:55 crc kubenswrapper[4878]: I1202 19:56:55.617381 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mc2ph_60001ff0-557c-4234-b34d-b6850dc9ba70/extract-content/0.log" Dec 02 19:56:55 crc kubenswrapper[4878]: I1202 19:56:55.664629 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-mc2ph_60001ff0-557c-4234-b34d-b6850dc9ba70/registry-server/0.log" Dec 02 19:56:55 crc kubenswrapper[4878]: I1202 19:56:55.781750 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lrrh2" Dec 02 19:56:55 crc kubenswrapper[4878]: I1202 19:56:55.782575 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lrrh2" Dec 02 19:56:55 crc kubenswrapper[4878]: I1202 19:56:55.798695 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8xkxg_11c10f5a-0137-467d-a749-1bce1c6210ed/marketplace-operator/0.log" Dec 02 19:56:56 crc kubenswrapper[4878]: I1202 19:56:56.004201 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-46kr4_2b1d265e-ec12-4052-83ec-5db3bd68b034/registry-server/0.log" Dec 02 19:56:56 crc kubenswrapper[4878]: I1202 19:56:56.008192 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pq9p9_c55a4df1-0e5a-43aa-9de4-e216aeb40407/extract-utilities/0.log" Dec 02 19:56:56 crc kubenswrapper[4878]: I1202 19:56:56.205760 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pq9p9_c55a4df1-0e5a-43aa-9de4-e216aeb40407/extract-utilities/0.log" Dec 02 19:56:56 crc kubenswrapper[4878]: I1202 19:56:56.224266 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pq9p9_c55a4df1-0e5a-43aa-9de4-e216aeb40407/extract-content/0.log" Dec 02 19:56:56 crc kubenswrapper[4878]: I1202 19:56:56.225650 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pq9p9_c55a4df1-0e5a-43aa-9de4-e216aeb40407/extract-content/0.log" Dec 02 19:56:56 crc kubenswrapper[4878]: I1202 19:56:56.561329 
4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pq9p9_c55a4df1-0e5a-43aa-9de4-e216aeb40407/extract-utilities/0.log" Dec 02 19:56:56 crc kubenswrapper[4878]: I1202 19:56:56.576703 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pq9p9_c55a4df1-0e5a-43aa-9de4-e216aeb40407/extract-content/0.log" Dec 02 19:56:56 crc kubenswrapper[4878]: I1202 19:56:56.600409 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pq9p9_c55a4df1-0e5a-43aa-9de4-e216aeb40407/registry-server/0.log" Dec 02 19:56:56 crc kubenswrapper[4878]: I1202 19:56:56.622557 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j4f2d_651270f6-1566-4359-b11c-561ee744e88f/extract-utilities/0.log" Dec 02 19:56:56 crc kubenswrapper[4878]: I1202 19:56:56.806583 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j4f2d_651270f6-1566-4359-b11c-561ee744e88f/extract-utilities/0.log" Dec 02 19:56:56 crc kubenswrapper[4878]: I1202 19:56:56.806644 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j4f2d_651270f6-1566-4359-b11c-561ee744e88f/extract-content/0.log" Dec 02 19:56:56 crc kubenswrapper[4878]: I1202 19:56:56.821356 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j4f2d_651270f6-1566-4359-b11c-561ee744e88f/extract-content/0.log" Dec 02 19:56:56 crc kubenswrapper[4878]: I1202 19:56:56.837454 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-lrrh2" podUID="0df355b1-ad96-47dc-b237-33b01ceb5ad3" containerName="registry-server" probeResult="failure" output=< Dec 02 19:56:56 crc kubenswrapper[4878]: timeout: failed to connect service ":50051" within 1s Dec 02 19:56:56 crc kubenswrapper[4878]: > Dec 02 
19:56:56 crc kubenswrapper[4878]: I1202 19:56:56.981219 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j4f2d_651270f6-1566-4359-b11c-561ee744e88f/extract-utilities/0.log" Dec 02 19:56:57 crc kubenswrapper[4878]: I1202 19:56:57.006612 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j4f2d_651270f6-1566-4359-b11c-561ee744e88f/extract-content/0.log" Dec 02 19:56:57 crc kubenswrapper[4878]: I1202 19:56:57.786190 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j4f2d_651270f6-1566-4359-b11c-561ee744e88f/registry-server/0.log" Dec 02 19:56:58 crc kubenswrapper[4878]: I1202 19:56:58.308071 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mc2ph" Dec 02 19:56:58 crc kubenswrapper[4878]: I1202 19:56:58.308130 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mc2ph" Dec 02 19:56:58 crc kubenswrapper[4878]: I1202 19:56:58.368113 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mc2ph" Dec 02 19:56:58 crc kubenswrapper[4878]: I1202 19:56:58.628874 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mc2ph" Dec 02 19:56:58 crc kubenswrapper[4878]: I1202 19:56:58.700132 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mc2ph"] Dec 02 19:57:00 crc kubenswrapper[4878]: I1202 19:57:00.587921 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mc2ph" podUID="60001ff0-557c-4234-b34d-b6850dc9ba70" containerName="registry-server" containerID="cri-o://df94e8307ad94ffa8f523df4f9fbc2895a84814c402598f11f347baa37914c78" gracePeriod=2 Dec 
02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.145122 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mc2ph" Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.250131 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgrqm\" (UniqueName: \"kubernetes.io/projected/60001ff0-557c-4234-b34d-b6850dc9ba70-kube-api-access-pgrqm\") pod \"60001ff0-557c-4234-b34d-b6850dc9ba70\" (UID: \"60001ff0-557c-4234-b34d-b6850dc9ba70\") " Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.250373 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60001ff0-557c-4234-b34d-b6850dc9ba70-utilities\") pod \"60001ff0-557c-4234-b34d-b6850dc9ba70\" (UID: \"60001ff0-557c-4234-b34d-b6850dc9ba70\") " Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.250524 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60001ff0-557c-4234-b34d-b6850dc9ba70-catalog-content\") pod \"60001ff0-557c-4234-b34d-b6850dc9ba70\" (UID: \"60001ff0-557c-4234-b34d-b6850dc9ba70\") " Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.251384 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60001ff0-557c-4234-b34d-b6850dc9ba70-utilities" (OuterVolumeSpecName: "utilities") pod "60001ff0-557c-4234-b34d-b6850dc9ba70" (UID: "60001ff0-557c-4234-b34d-b6850dc9ba70"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.259010 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60001ff0-557c-4234-b34d-b6850dc9ba70-kube-api-access-pgrqm" (OuterVolumeSpecName: "kube-api-access-pgrqm") pod "60001ff0-557c-4234-b34d-b6850dc9ba70" (UID: "60001ff0-557c-4234-b34d-b6850dc9ba70"). InnerVolumeSpecName "kube-api-access-pgrqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.310034 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60001ff0-557c-4234-b34d-b6850dc9ba70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60001ff0-557c-4234-b34d-b6850dc9ba70" (UID: "60001ff0-557c-4234-b34d-b6850dc9ba70"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.353412 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60001ff0-557c-4234-b34d-b6850dc9ba70-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.353459 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgrqm\" (UniqueName: \"kubernetes.io/projected/60001ff0-557c-4234-b34d-b6850dc9ba70-kube-api-access-pgrqm\") on node \"crc\" DevicePath \"\"" Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.353471 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60001ff0-557c-4234-b34d-b6850dc9ba70-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.604260 4878 generic.go:334] "Generic (PLEG): container finished" podID="60001ff0-557c-4234-b34d-b6850dc9ba70" 
containerID="df94e8307ad94ffa8f523df4f9fbc2895a84814c402598f11f347baa37914c78" exitCode=0 Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.604339 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc2ph" event={"ID":"60001ff0-557c-4234-b34d-b6850dc9ba70","Type":"ContainerDied","Data":"df94e8307ad94ffa8f523df4f9fbc2895a84814c402598f11f347baa37914c78"} Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.604372 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mc2ph" Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.604396 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc2ph" event={"ID":"60001ff0-557c-4234-b34d-b6850dc9ba70","Type":"ContainerDied","Data":"b645df1c36be3353f6f1f3b44e8d04082f09d4a311d77e7b861a9875b897db20"} Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.604431 4878 scope.go:117] "RemoveContainer" containerID="df94e8307ad94ffa8f523df4f9fbc2895a84814c402598f11f347baa37914c78" Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.636399 4878 scope.go:117] "RemoveContainer" containerID="27dc50ab1bff833a47504007de631dd52c7f3d1ad8bc373d5e0fd149d4be8a51" Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.659972 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mc2ph"] Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.671118 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mc2ph"] Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.682799 4878 scope.go:117] "RemoveContainer" containerID="fb45017562961b28ce60a02ee92a4de3baf38dc0c48307ad42d768e2f5e447c7" Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.742174 4878 scope.go:117] "RemoveContainer" containerID="df94e8307ad94ffa8f523df4f9fbc2895a84814c402598f11f347baa37914c78" Dec 02 
19:57:01 crc kubenswrapper[4878]: E1202 19:57:01.742995 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df94e8307ad94ffa8f523df4f9fbc2895a84814c402598f11f347baa37914c78\": container with ID starting with df94e8307ad94ffa8f523df4f9fbc2895a84814c402598f11f347baa37914c78 not found: ID does not exist" containerID="df94e8307ad94ffa8f523df4f9fbc2895a84814c402598f11f347baa37914c78" Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.743026 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df94e8307ad94ffa8f523df4f9fbc2895a84814c402598f11f347baa37914c78"} err="failed to get container status \"df94e8307ad94ffa8f523df4f9fbc2895a84814c402598f11f347baa37914c78\": rpc error: code = NotFound desc = could not find container \"df94e8307ad94ffa8f523df4f9fbc2895a84814c402598f11f347baa37914c78\": container with ID starting with df94e8307ad94ffa8f523df4f9fbc2895a84814c402598f11f347baa37914c78 not found: ID does not exist" Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.743047 4878 scope.go:117] "RemoveContainer" containerID="27dc50ab1bff833a47504007de631dd52c7f3d1ad8bc373d5e0fd149d4be8a51" Dec 02 19:57:01 crc kubenswrapper[4878]: E1202 19:57:01.743377 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27dc50ab1bff833a47504007de631dd52c7f3d1ad8bc373d5e0fd149d4be8a51\": container with ID starting with 27dc50ab1bff833a47504007de631dd52c7f3d1ad8bc373d5e0fd149d4be8a51 not found: ID does not exist" containerID="27dc50ab1bff833a47504007de631dd52c7f3d1ad8bc373d5e0fd149d4be8a51" Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.743400 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27dc50ab1bff833a47504007de631dd52c7f3d1ad8bc373d5e0fd149d4be8a51"} err="failed to get container status 
\"27dc50ab1bff833a47504007de631dd52c7f3d1ad8bc373d5e0fd149d4be8a51\": rpc error: code = NotFound desc = could not find container \"27dc50ab1bff833a47504007de631dd52c7f3d1ad8bc373d5e0fd149d4be8a51\": container with ID starting with 27dc50ab1bff833a47504007de631dd52c7f3d1ad8bc373d5e0fd149d4be8a51 not found: ID does not exist" Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.743412 4878 scope.go:117] "RemoveContainer" containerID="fb45017562961b28ce60a02ee92a4de3baf38dc0c48307ad42d768e2f5e447c7" Dec 02 19:57:01 crc kubenswrapper[4878]: E1202 19:57:01.743707 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb45017562961b28ce60a02ee92a4de3baf38dc0c48307ad42d768e2f5e447c7\": container with ID starting with fb45017562961b28ce60a02ee92a4de3baf38dc0c48307ad42d768e2f5e447c7 not found: ID does not exist" containerID="fb45017562961b28ce60a02ee92a4de3baf38dc0c48307ad42d768e2f5e447c7" Dec 02 19:57:01 crc kubenswrapper[4878]: I1202 19:57:01.743728 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb45017562961b28ce60a02ee92a4de3baf38dc0c48307ad42d768e2f5e447c7"} err="failed to get container status \"fb45017562961b28ce60a02ee92a4de3baf38dc0c48307ad42d768e2f5e447c7\": rpc error: code = NotFound desc = could not find container \"fb45017562961b28ce60a02ee92a4de3baf38dc0c48307ad42d768e2f5e447c7\": container with ID starting with fb45017562961b28ce60a02ee92a4de3baf38dc0c48307ad42d768e2f5e447c7 not found: ID does not exist" Dec 02 19:57:02 crc kubenswrapper[4878]: I1202 19:57:02.951677 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60001ff0-557c-4234-b34d-b6850dc9ba70" path="/var/lib/kubelet/pods/60001ff0-557c-4234-b34d-b6850dc9ba70/volumes" Dec 02 19:57:05 crc kubenswrapper[4878]: I1202 19:57:05.834668 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-lrrh2" Dec 02 19:57:05 crc kubenswrapper[4878]: I1202 19:57:05.885301 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lrrh2" Dec 02 19:57:05 crc kubenswrapper[4878]: I1202 19:57:05.937836 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:57:05 crc kubenswrapper[4878]: E1202 19:57:05.938299 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:57:06 crc kubenswrapper[4878]: I1202 19:57:06.075624 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lrrh2"] Dec 02 19:57:07 crc kubenswrapper[4878]: I1202 19:57:07.692190 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lrrh2" podUID="0df355b1-ad96-47dc-b237-33b01ceb5ad3" containerName="registry-server" containerID="cri-o://b324dc4c02278ad1f349cf6aa86a304c6bdc36e14f6c5b8951e8919e5960bed0" gracePeriod=2 Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.246816 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lrrh2" Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.354701 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0df355b1-ad96-47dc-b237-33b01ceb5ad3-catalog-content\") pod \"0df355b1-ad96-47dc-b237-33b01ceb5ad3\" (UID: \"0df355b1-ad96-47dc-b237-33b01ceb5ad3\") " Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.356574 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0df355b1-ad96-47dc-b237-33b01ceb5ad3-utilities\") pod \"0df355b1-ad96-47dc-b237-33b01ceb5ad3\" (UID: \"0df355b1-ad96-47dc-b237-33b01ceb5ad3\") " Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.356781 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8g42\" (UniqueName: \"kubernetes.io/projected/0df355b1-ad96-47dc-b237-33b01ceb5ad3-kube-api-access-b8g42\") pod \"0df355b1-ad96-47dc-b237-33b01ceb5ad3\" (UID: \"0df355b1-ad96-47dc-b237-33b01ceb5ad3\") " Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.357257 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0df355b1-ad96-47dc-b237-33b01ceb5ad3-utilities" (OuterVolumeSpecName: "utilities") pod "0df355b1-ad96-47dc-b237-33b01ceb5ad3" (UID: "0df355b1-ad96-47dc-b237-33b01ceb5ad3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.358197 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0df355b1-ad96-47dc-b237-33b01ceb5ad3-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.365085 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0df355b1-ad96-47dc-b237-33b01ceb5ad3-kube-api-access-b8g42" (OuterVolumeSpecName: "kube-api-access-b8g42") pod "0df355b1-ad96-47dc-b237-33b01ceb5ad3" (UID: "0df355b1-ad96-47dc-b237-33b01ceb5ad3"). InnerVolumeSpecName "kube-api-access-b8g42". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.411618 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0df355b1-ad96-47dc-b237-33b01ceb5ad3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0df355b1-ad96-47dc-b237-33b01ceb5ad3" (UID: "0df355b1-ad96-47dc-b237-33b01ceb5ad3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.461773 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0df355b1-ad96-47dc-b237-33b01ceb5ad3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.461809 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8g42\" (UniqueName: \"kubernetes.io/projected/0df355b1-ad96-47dc-b237-33b01ceb5ad3-kube-api-access-b8g42\") on node \"crc\" DevicePath \"\"" Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.704454 4878 generic.go:334] "Generic (PLEG): container finished" podID="0df355b1-ad96-47dc-b237-33b01ceb5ad3" containerID="b324dc4c02278ad1f349cf6aa86a304c6bdc36e14f6c5b8951e8919e5960bed0" exitCode=0 Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.704498 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lrrh2" event={"ID":"0df355b1-ad96-47dc-b237-33b01ceb5ad3","Type":"ContainerDied","Data":"b324dc4c02278ad1f349cf6aa86a304c6bdc36e14f6c5b8951e8919e5960bed0"} Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.704507 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lrrh2" Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.704528 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lrrh2" event={"ID":"0df355b1-ad96-47dc-b237-33b01ceb5ad3","Type":"ContainerDied","Data":"40d8da8032a2828e56f366459544fce4f907fe951284454c8dac1538dda19c44"} Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.704544 4878 scope.go:117] "RemoveContainer" containerID="b324dc4c02278ad1f349cf6aa86a304c6bdc36e14f6c5b8951e8919e5960bed0" Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.736417 4878 scope.go:117] "RemoveContainer" containerID="bb0a13f6ff3ac676d34507b2ea0585b7e7e9f4a28a5f69c3f677be8c33bfaeae" Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.750608 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lrrh2"] Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.763904 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lrrh2"] Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.772183 4878 scope.go:117] "RemoveContainer" containerID="80fd3f1db6682c796b839bbec813624abea3d35c568144e0b21e09e7a2b48517" Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.837020 4878 scope.go:117] "RemoveContainer" containerID="b324dc4c02278ad1f349cf6aa86a304c6bdc36e14f6c5b8951e8919e5960bed0" Dec 02 19:57:08 crc kubenswrapper[4878]: E1202 19:57:08.838839 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b324dc4c02278ad1f349cf6aa86a304c6bdc36e14f6c5b8951e8919e5960bed0\": container with ID starting with b324dc4c02278ad1f349cf6aa86a304c6bdc36e14f6c5b8951e8919e5960bed0 not found: ID does not exist" containerID="b324dc4c02278ad1f349cf6aa86a304c6bdc36e14f6c5b8951e8919e5960bed0" Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.838875 4878 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b324dc4c02278ad1f349cf6aa86a304c6bdc36e14f6c5b8951e8919e5960bed0"} err="failed to get container status \"b324dc4c02278ad1f349cf6aa86a304c6bdc36e14f6c5b8951e8919e5960bed0\": rpc error: code = NotFound desc = could not find container \"b324dc4c02278ad1f349cf6aa86a304c6bdc36e14f6c5b8951e8919e5960bed0\": container with ID starting with b324dc4c02278ad1f349cf6aa86a304c6bdc36e14f6c5b8951e8919e5960bed0 not found: ID does not exist" Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.838901 4878 scope.go:117] "RemoveContainer" containerID="bb0a13f6ff3ac676d34507b2ea0585b7e7e9f4a28a5f69c3f677be8c33bfaeae" Dec 02 19:57:08 crc kubenswrapper[4878]: E1202 19:57:08.839277 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb0a13f6ff3ac676d34507b2ea0585b7e7e9f4a28a5f69c3f677be8c33bfaeae\": container with ID starting with bb0a13f6ff3ac676d34507b2ea0585b7e7e9f4a28a5f69c3f677be8c33bfaeae not found: ID does not exist" containerID="bb0a13f6ff3ac676d34507b2ea0585b7e7e9f4a28a5f69c3f677be8c33bfaeae" Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.839297 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb0a13f6ff3ac676d34507b2ea0585b7e7e9f4a28a5f69c3f677be8c33bfaeae"} err="failed to get container status \"bb0a13f6ff3ac676d34507b2ea0585b7e7e9f4a28a5f69c3f677be8c33bfaeae\": rpc error: code = NotFound desc = could not find container \"bb0a13f6ff3ac676d34507b2ea0585b7e7e9f4a28a5f69c3f677be8c33bfaeae\": container with ID starting with bb0a13f6ff3ac676d34507b2ea0585b7e7e9f4a28a5f69c3f677be8c33bfaeae not found: ID does not exist" Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.839309 4878 scope.go:117] "RemoveContainer" containerID="80fd3f1db6682c796b839bbec813624abea3d35c568144e0b21e09e7a2b48517" Dec 02 19:57:08 crc kubenswrapper[4878]: E1202 
19:57:08.839639 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80fd3f1db6682c796b839bbec813624abea3d35c568144e0b21e09e7a2b48517\": container with ID starting with 80fd3f1db6682c796b839bbec813624abea3d35c568144e0b21e09e7a2b48517 not found: ID does not exist" containerID="80fd3f1db6682c796b839bbec813624abea3d35c568144e0b21e09e7a2b48517" Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.839657 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80fd3f1db6682c796b839bbec813624abea3d35c568144e0b21e09e7a2b48517"} err="failed to get container status \"80fd3f1db6682c796b839bbec813624abea3d35c568144e0b21e09e7a2b48517\": rpc error: code = NotFound desc = could not find container \"80fd3f1db6682c796b839bbec813624abea3d35c568144e0b21e09e7a2b48517\": container with ID starting with 80fd3f1db6682c796b839bbec813624abea3d35c568144e0b21e09e7a2b48517 not found: ID does not exist" Dec 02 19:57:08 crc kubenswrapper[4878]: I1202 19:57:08.984587 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0df355b1-ad96-47dc-b237-33b01ceb5ad3" path="/var/lib/kubelet/pods/0df355b1-ad96-47dc-b237-33b01ceb5ad3/volumes" Dec 02 19:57:10 crc kubenswrapper[4878]: I1202 19:57:10.501508 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-ffbv7_302b5d81-5163-4052-a986-6fbdda49e9cf/prometheus-operator/0.log" Dec 02 19:57:10 crc kubenswrapper[4878]: I1202 19:57:10.667805 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-55bb47f485-749dx_baa1d367-077f-4aa3-8dca-5a56cff08838/prometheus-operator-admission-webhook/0.log" Dec 02 19:57:10 crc kubenswrapper[4878]: I1202 19:57:10.676960 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl_95fa4f57-a446-402b-9de4-5ff0d8109802/prometheus-operator-admission-webhook/0.log" Dec 02 19:57:10 crc kubenswrapper[4878]: I1202 19:57:10.909392 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-ctrvg_fee86c2f-ee2c-49c4-a96c-f59e7ef28524/operator/0.log" Dec 02 19:57:10 crc kubenswrapper[4878]: I1202 19:57:10.950078 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7d5fb4cbfb-zhq9q_528e5b70-2773-48c2-8382-d4e2ec45933d/observability-ui-dashboards/0.log" Dec 02 19:57:11 crc kubenswrapper[4878]: I1202 19:57:11.090939 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-s7xc9_161358a3-71af-4def-b6a0-0ba9b5f2a7b3/perses-operator/0.log" Dec 02 19:57:18 crc kubenswrapper[4878]: I1202 19:57:18.939892 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:57:18 crc kubenswrapper[4878]: E1202 19:57:18.941109 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:57:24 crc kubenswrapper[4878]: I1202 19:57:24.082738 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-846f878689-bhh7m_6ace3da2-70e9-4d80-a8ad-5a8e1bb062df/kube-rbac-proxy/0.log" Dec 02 19:57:24 crc kubenswrapper[4878]: I1202 19:57:24.110123 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-846f878689-bhh7m_6ace3da2-70e9-4d80-a8ad-5a8e1bb062df/manager/0.log" Dec 02 19:57:31 crc kubenswrapper[4878]: I1202 19:57:31.938414 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:57:31 crc kubenswrapper[4878]: E1202 19:57:31.939169 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:57:45 crc kubenswrapper[4878]: I1202 19:57:45.938701 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:57:45 crc kubenswrapper[4878]: E1202 19:57:45.939465 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:57:58 crc kubenswrapper[4878]: I1202 19:57:58.940938 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:57:58 crc kubenswrapper[4878]: E1202 19:57:58.942313 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:58:11 crc kubenswrapper[4878]: I1202 19:58:11.938238 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:58:11 crc kubenswrapper[4878]: E1202 19:58:11.939587 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:58:24 crc kubenswrapper[4878]: I1202 19:58:24.938788 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:58:24 crc kubenswrapper[4878]: E1202 19:58:24.940337 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:58:35 crc kubenswrapper[4878]: I1202 19:58:35.939114 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:58:36 crc kubenswrapper[4878]: E1202 19:58:35.941732 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:58:48 crc kubenswrapper[4878]: I1202 19:58:48.942175 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:58:48 crc kubenswrapper[4878]: E1202 19:58:48.943260 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:59:01 crc kubenswrapper[4878]: I1202 19:59:01.938529 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:59:01 crc kubenswrapper[4878]: E1202 19:59:01.939749 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:59:13 crc kubenswrapper[4878]: I1202 19:59:13.329613 4878 generic.go:334] "Generic (PLEG): container finished" podID="8810f12f-a3e8-4168-bc6f-75395a2845fb" containerID="56c8b049e11d9d998df776c72449d4b7e241e1ffebcdc6279bde6285bbd5ba73" exitCode=0 Dec 02 19:59:13 crc kubenswrapper[4878]: I1202 19:59:13.329758 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-6qmbf/must-gather-ntpcx" event={"ID":"8810f12f-a3e8-4168-bc6f-75395a2845fb","Type":"ContainerDied","Data":"56c8b049e11d9d998df776c72449d4b7e241e1ffebcdc6279bde6285bbd5ba73"} Dec 02 19:59:13 crc kubenswrapper[4878]: I1202 19:59:13.330897 4878 scope.go:117] "RemoveContainer" containerID="56c8b049e11d9d998df776c72449d4b7e241e1ffebcdc6279bde6285bbd5ba73" Dec 02 19:59:14 crc kubenswrapper[4878]: I1202 19:59:14.297551 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6qmbf_must-gather-ntpcx_8810f12f-a3e8-4168-bc6f-75395a2845fb/gather/0.log" Dec 02 19:59:14 crc kubenswrapper[4878]: I1202 19:59:14.938145 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:59:14 crc kubenswrapper[4878]: E1202 19:59:14.938536 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:59:22 crc kubenswrapper[4878]: I1202 19:59:22.278792 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6qmbf/must-gather-ntpcx"] Dec 02 19:59:22 crc kubenswrapper[4878]: I1202 19:59:22.279513 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-6qmbf/must-gather-ntpcx" podUID="8810f12f-a3e8-4168-bc6f-75395a2845fb" containerName="copy" containerID="cri-o://86ff891616fd4e933569d0705663ae73cbdcaa6d3674f7027ff135016ffaf279" gracePeriod=2 Dec 02 19:59:22 crc kubenswrapper[4878]: I1202 19:59:22.292311 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6qmbf/must-gather-ntpcx"] Dec 02 19:59:22 crc 
kubenswrapper[4878]: I1202 19:59:22.457809 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6qmbf_must-gather-ntpcx_8810f12f-a3e8-4168-bc6f-75395a2845fb/copy/0.log" Dec 02 19:59:22 crc kubenswrapper[4878]: I1202 19:59:22.458471 4878 generic.go:334] "Generic (PLEG): container finished" podID="8810f12f-a3e8-4168-bc6f-75395a2845fb" containerID="86ff891616fd4e933569d0705663ae73cbdcaa6d3674f7027ff135016ffaf279" exitCode=143 Dec 02 19:59:22 crc kubenswrapper[4878]: I1202 19:59:22.764307 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6qmbf_must-gather-ntpcx_8810f12f-a3e8-4168-bc6f-75395a2845fb/copy/0.log" Dec 02 19:59:22 crc kubenswrapper[4878]: I1202 19:59:22.764828 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6qmbf/must-gather-ntpcx" Dec 02 19:59:22 crc kubenswrapper[4878]: I1202 19:59:22.865559 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w98vj\" (UniqueName: \"kubernetes.io/projected/8810f12f-a3e8-4168-bc6f-75395a2845fb-kube-api-access-w98vj\") pod \"8810f12f-a3e8-4168-bc6f-75395a2845fb\" (UID: \"8810f12f-a3e8-4168-bc6f-75395a2845fb\") " Dec 02 19:59:22 crc kubenswrapper[4878]: I1202 19:59:22.865697 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8810f12f-a3e8-4168-bc6f-75395a2845fb-must-gather-output\") pod \"8810f12f-a3e8-4168-bc6f-75395a2845fb\" (UID: \"8810f12f-a3e8-4168-bc6f-75395a2845fb\") " Dec 02 19:59:22 crc kubenswrapper[4878]: I1202 19:59:22.876520 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8810f12f-a3e8-4168-bc6f-75395a2845fb-kube-api-access-w98vj" (OuterVolumeSpecName: "kube-api-access-w98vj") pod "8810f12f-a3e8-4168-bc6f-75395a2845fb" (UID: "8810f12f-a3e8-4168-bc6f-75395a2845fb"). 
InnerVolumeSpecName "kube-api-access-w98vj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:59:22 crc kubenswrapper[4878]: I1202 19:59:22.968841 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w98vj\" (UniqueName: \"kubernetes.io/projected/8810f12f-a3e8-4168-bc6f-75395a2845fb-kube-api-access-w98vj\") on node \"crc\" DevicePath \"\"" Dec 02 19:59:23 crc kubenswrapper[4878]: I1202 19:59:23.051382 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8810f12f-a3e8-4168-bc6f-75395a2845fb-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8810f12f-a3e8-4168-bc6f-75395a2845fb" (UID: "8810f12f-a3e8-4168-bc6f-75395a2845fb"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:59:23 crc kubenswrapper[4878]: I1202 19:59:23.070925 4878 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8810f12f-a3e8-4168-bc6f-75395a2845fb-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 02 19:59:23 crc kubenswrapper[4878]: I1202 19:59:23.477936 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6qmbf_must-gather-ntpcx_8810f12f-a3e8-4168-bc6f-75395a2845fb/copy/0.log" Dec 02 19:59:23 crc kubenswrapper[4878]: I1202 19:59:23.479996 4878 scope.go:117] "RemoveContainer" containerID="86ff891616fd4e933569d0705663ae73cbdcaa6d3674f7027ff135016ffaf279" Dec 02 19:59:23 crc kubenswrapper[4878]: I1202 19:59:23.480217 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6qmbf/must-gather-ntpcx" Dec 02 19:59:23 crc kubenswrapper[4878]: I1202 19:59:23.518812 4878 scope.go:117] "RemoveContainer" containerID="56c8b049e11d9d998df776c72449d4b7e241e1ffebcdc6279bde6285bbd5ba73" Dec 02 19:59:24 crc kubenswrapper[4878]: I1202 19:59:24.963168 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8810f12f-a3e8-4168-bc6f-75395a2845fb" path="/var/lib/kubelet/pods/8810f12f-a3e8-4168-bc6f-75395a2845fb/volumes" Dec 02 19:59:29 crc kubenswrapper[4878]: I1202 19:59:29.938203 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:59:29 crc kubenswrapper[4878]: E1202 19:59:29.939386 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:59:40 crc kubenswrapper[4878]: I1202 19:59:40.949196 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:59:40 crc kubenswrapper[4878]: E1202 19:59:40.950192 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 19:59:52 crc kubenswrapper[4878]: I1202 19:59:52.938077 4878 scope.go:117] "RemoveContainer" 
containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 19:59:52 crc kubenswrapper[4878]: E1202 19:59:52.938951 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.176734 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411760-tqfsq"] Dec 02 20:00:00 crc kubenswrapper[4878]: E1202 20:00:00.178325 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60001ff0-557c-4234-b34d-b6850dc9ba70" containerName="registry-server" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.178341 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="60001ff0-557c-4234-b34d-b6850dc9ba70" containerName="registry-server" Dec 02 20:00:00 crc kubenswrapper[4878]: E1202 20:00:00.178376 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0df355b1-ad96-47dc-b237-33b01ceb5ad3" containerName="extract-utilities" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.178382 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="0df355b1-ad96-47dc-b237-33b01ceb5ad3" containerName="extract-utilities" Dec 02 20:00:00 crc kubenswrapper[4878]: E1202 20:00:00.178404 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60001ff0-557c-4234-b34d-b6850dc9ba70" containerName="extract-utilities" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.178411 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="60001ff0-557c-4234-b34d-b6850dc9ba70" containerName="extract-utilities" Dec 02 20:00:00 crc 
kubenswrapper[4878]: E1202 20:00:00.178439 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8810f12f-a3e8-4168-bc6f-75395a2845fb" containerName="gather" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.178445 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8810f12f-a3e8-4168-bc6f-75395a2845fb" containerName="gather" Dec 02 20:00:00 crc kubenswrapper[4878]: E1202 20:00:00.178475 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0df355b1-ad96-47dc-b237-33b01ceb5ad3" containerName="extract-content" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.178481 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="0df355b1-ad96-47dc-b237-33b01ceb5ad3" containerName="extract-content" Dec 02 20:00:00 crc kubenswrapper[4878]: E1202 20:00:00.178503 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60001ff0-557c-4234-b34d-b6850dc9ba70" containerName="extract-content" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.178509 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="60001ff0-557c-4234-b34d-b6850dc9ba70" containerName="extract-content" Dec 02 20:00:00 crc kubenswrapper[4878]: E1202 20:00:00.178525 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0df355b1-ad96-47dc-b237-33b01ceb5ad3" containerName="registry-server" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.178531 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="0df355b1-ad96-47dc-b237-33b01ceb5ad3" containerName="registry-server" Dec 02 20:00:00 crc kubenswrapper[4878]: E1202 20:00:00.178543 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8810f12f-a3e8-4168-bc6f-75395a2845fb" containerName="copy" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.178549 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="8810f12f-a3e8-4168-bc6f-75395a2845fb" containerName="copy" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.179026 4878 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0df355b1-ad96-47dc-b237-33b01ceb5ad3" containerName="registry-server" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.179056 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="8810f12f-a3e8-4168-bc6f-75395a2845fb" containerName="copy" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.179073 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="8810f12f-a3e8-4168-bc6f-75395a2845fb" containerName="gather" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.179098 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="60001ff0-557c-4234-b34d-b6850dc9ba70" containerName="registry-server" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.180212 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-tqfsq" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.199548 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.202564 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411760-tqfsq"] Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.207488 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.315832 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt87m\" (UniqueName: \"kubernetes.io/projected/a758e6ac-2069-4f62-acc8-aaf83eece904-kube-api-access-qt87m\") pod \"collect-profiles-29411760-tqfsq\" (UID: \"a758e6ac-2069-4f62-acc8-aaf83eece904\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-tqfsq" Dec 02 
20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.316017 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a758e6ac-2069-4f62-acc8-aaf83eece904-secret-volume\") pod \"collect-profiles-29411760-tqfsq\" (UID: \"a758e6ac-2069-4f62-acc8-aaf83eece904\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-tqfsq" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.316114 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a758e6ac-2069-4f62-acc8-aaf83eece904-config-volume\") pod \"collect-profiles-29411760-tqfsq\" (UID: \"a758e6ac-2069-4f62-acc8-aaf83eece904\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-tqfsq" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.418555 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a758e6ac-2069-4f62-acc8-aaf83eece904-config-volume\") pod \"collect-profiles-29411760-tqfsq\" (UID: \"a758e6ac-2069-4f62-acc8-aaf83eece904\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-tqfsq" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.418721 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt87m\" (UniqueName: \"kubernetes.io/projected/a758e6ac-2069-4f62-acc8-aaf83eece904-kube-api-access-qt87m\") pod \"collect-profiles-29411760-tqfsq\" (UID: \"a758e6ac-2069-4f62-acc8-aaf83eece904\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-tqfsq" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.418875 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a758e6ac-2069-4f62-acc8-aaf83eece904-secret-volume\") pod 
\"collect-profiles-29411760-tqfsq\" (UID: \"a758e6ac-2069-4f62-acc8-aaf83eece904\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-tqfsq" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.419998 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a758e6ac-2069-4f62-acc8-aaf83eece904-config-volume\") pod \"collect-profiles-29411760-tqfsq\" (UID: \"a758e6ac-2069-4f62-acc8-aaf83eece904\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-tqfsq" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.424971 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a758e6ac-2069-4f62-acc8-aaf83eece904-secret-volume\") pod \"collect-profiles-29411760-tqfsq\" (UID: \"a758e6ac-2069-4f62-acc8-aaf83eece904\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-tqfsq" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.441620 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt87m\" (UniqueName: \"kubernetes.io/projected/a758e6ac-2069-4f62-acc8-aaf83eece904-kube-api-access-qt87m\") pod \"collect-profiles-29411760-tqfsq\" (UID: \"a758e6ac-2069-4f62-acc8-aaf83eece904\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-tqfsq" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.524040 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-tqfsq" Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.836710 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411760-tqfsq"] Dec 02 20:00:00 crc kubenswrapper[4878]: I1202 20:00:00.923867 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-tqfsq" event={"ID":"a758e6ac-2069-4f62-acc8-aaf83eece904","Type":"ContainerStarted","Data":"62ebfdd3295eda662fde3ba804f6df7726ff8d5a463d9af83e1da8dc6b70e38f"} Dec 02 20:00:01 crc kubenswrapper[4878]: I1202 20:00:01.992835 4878 generic.go:334] "Generic (PLEG): container finished" podID="a758e6ac-2069-4f62-acc8-aaf83eece904" containerID="a1f12df31f77076a8db47f2294c52e237048a4ffd8d50f0bdb51ca7edfb90cb7" exitCode=0 Dec 02 20:00:01 crc kubenswrapper[4878]: I1202 20:00:01.992906 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-tqfsq" event={"ID":"a758e6ac-2069-4f62-acc8-aaf83eece904","Type":"ContainerDied","Data":"a1f12df31f77076a8db47f2294c52e237048a4ffd8d50f0bdb51ca7edfb90cb7"} Dec 02 20:00:03 crc kubenswrapper[4878]: I1202 20:00:03.506685 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-tqfsq" Dec 02 20:00:03 crc kubenswrapper[4878]: I1202 20:00:03.655781 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a758e6ac-2069-4f62-acc8-aaf83eece904-config-volume\") pod \"a758e6ac-2069-4f62-acc8-aaf83eece904\" (UID: \"a758e6ac-2069-4f62-acc8-aaf83eece904\") " Dec 02 20:00:03 crc kubenswrapper[4878]: I1202 20:00:03.655925 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt87m\" (UniqueName: \"kubernetes.io/projected/a758e6ac-2069-4f62-acc8-aaf83eece904-kube-api-access-qt87m\") pod \"a758e6ac-2069-4f62-acc8-aaf83eece904\" (UID: \"a758e6ac-2069-4f62-acc8-aaf83eece904\") " Dec 02 20:00:03 crc kubenswrapper[4878]: I1202 20:00:03.656105 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a758e6ac-2069-4f62-acc8-aaf83eece904-secret-volume\") pod \"a758e6ac-2069-4f62-acc8-aaf83eece904\" (UID: \"a758e6ac-2069-4f62-acc8-aaf83eece904\") " Dec 02 20:00:03 crc kubenswrapper[4878]: I1202 20:00:03.656477 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a758e6ac-2069-4f62-acc8-aaf83eece904-config-volume" (OuterVolumeSpecName: "config-volume") pod "a758e6ac-2069-4f62-acc8-aaf83eece904" (UID: "a758e6ac-2069-4f62-acc8-aaf83eece904"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:00:03 crc kubenswrapper[4878]: I1202 20:00:03.656898 4878 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a758e6ac-2069-4f62-acc8-aaf83eece904-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 20:00:03 crc kubenswrapper[4878]: I1202 20:00:03.669695 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a758e6ac-2069-4f62-acc8-aaf83eece904-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a758e6ac-2069-4f62-acc8-aaf83eece904" (UID: "a758e6ac-2069-4f62-acc8-aaf83eece904"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:00:03 crc kubenswrapper[4878]: I1202 20:00:03.672572 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a758e6ac-2069-4f62-acc8-aaf83eece904-kube-api-access-qt87m" (OuterVolumeSpecName: "kube-api-access-qt87m") pod "a758e6ac-2069-4f62-acc8-aaf83eece904" (UID: "a758e6ac-2069-4f62-acc8-aaf83eece904"). InnerVolumeSpecName "kube-api-access-qt87m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:00:03 crc kubenswrapper[4878]: I1202 20:00:03.758853 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt87m\" (UniqueName: \"kubernetes.io/projected/a758e6ac-2069-4f62-acc8-aaf83eece904-kube-api-access-qt87m\") on node \"crc\" DevicePath \"\"" Dec 02 20:00:03 crc kubenswrapper[4878]: I1202 20:00:03.758898 4878 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a758e6ac-2069-4f62-acc8-aaf83eece904-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 20:00:04 crc kubenswrapper[4878]: I1202 20:00:04.019003 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-tqfsq" event={"ID":"a758e6ac-2069-4f62-acc8-aaf83eece904","Type":"ContainerDied","Data":"62ebfdd3295eda662fde3ba804f6df7726ff8d5a463d9af83e1da8dc6b70e38f"} Dec 02 20:00:04 crc kubenswrapper[4878]: I1202 20:00:04.019061 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62ebfdd3295eda662fde3ba804f6df7726ff8d5a463d9af83e1da8dc6b70e38f" Dec 02 20:00:04 crc kubenswrapper[4878]: I1202 20:00:04.019077 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-tqfsq" Dec 02 20:00:04 crc kubenswrapper[4878]: I1202 20:00:04.611278 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl"] Dec 02 20:00:04 crc kubenswrapper[4878]: I1202 20:00:04.623391 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411715-799pl"] Dec 02 20:00:04 crc kubenswrapper[4878]: I1202 20:00:04.955014 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b65c1eb-b94e-4808-a886-ebf6d4452d04" path="/var/lib/kubelet/pods/2b65c1eb-b94e-4808-a886-ebf6d4452d04/volumes" Dec 02 20:00:05 crc kubenswrapper[4878]: I1202 20:00:05.938028 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 20:00:06 crc kubenswrapper[4878]: I1202 20:00:06.815942 4878 scope.go:117] "RemoveContainer" containerID="1cf202fb7efe6833ae1025b7d6a76d4839ad26a6e95a5f1c54502c511bcde11a" Dec 02 20:00:06 crc kubenswrapper[4878]: I1202 20:00:06.851626 4878 scope.go:117] "RemoveContainer" containerID="f2b5a0d9b5a0464eef32d58a5307e1853a054205d673e8a5780d5e07f056529d" Dec 02 20:00:07 crc kubenswrapper[4878]: I1202 20:00:07.066335 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"f659bcb967b3ab0df4defdcfba9bdeebb874762a37661bdd5221e1047bf9dd43"} Dec 02 20:01:00 crc kubenswrapper[4878]: I1202 20:01:00.170883 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29411761-jk55l"] Dec 02 20:01:00 crc kubenswrapper[4878]: E1202 20:01:00.172128 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a758e6ac-2069-4f62-acc8-aaf83eece904" containerName="collect-profiles" Dec 02 20:01:00 crc 
kubenswrapper[4878]: I1202 20:01:00.172145 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="a758e6ac-2069-4f62-acc8-aaf83eece904" containerName="collect-profiles" Dec 02 20:01:00 crc kubenswrapper[4878]: I1202 20:01:00.172477 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="a758e6ac-2069-4f62-acc8-aaf83eece904" containerName="collect-profiles" Dec 02 20:01:00 crc kubenswrapper[4878]: I1202 20:01:00.173874 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29411761-jk55l" Dec 02 20:01:00 crc kubenswrapper[4878]: I1202 20:01:00.188508 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29411761-jk55l"] Dec 02 20:01:00 crc kubenswrapper[4878]: I1202 20:01:00.289609 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng9tg\" (UniqueName: \"kubernetes.io/projected/803db126-60e5-4c26-9411-fe06a9b4c9cd-kube-api-access-ng9tg\") pod \"keystone-cron-29411761-jk55l\" (UID: \"803db126-60e5-4c26-9411-fe06a9b4c9cd\") " pod="openstack/keystone-cron-29411761-jk55l" Dec 02 20:01:00 crc kubenswrapper[4878]: I1202 20:01:00.289678 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803db126-60e5-4c26-9411-fe06a9b4c9cd-combined-ca-bundle\") pod \"keystone-cron-29411761-jk55l\" (UID: \"803db126-60e5-4c26-9411-fe06a9b4c9cd\") " pod="openstack/keystone-cron-29411761-jk55l" Dec 02 20:01:00 crc kubenswrapper[4878]: I1202 20:01:00.289802 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/803db126-60e5-4c26-9411-fe06a9b4c9cd-fernet-keys\") pod \"keystone-cron-29411761-jk55l\" (UID: \"803db126-60e5-4c26-9411-fe06a9b4c9cd\") " pod="openstack/keystone-cron-29411761-jk55l" Dec 02 20:01:00 crc kubenswrapper[4878]: 
I1202 20:01:00.289879 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/803db126-60e5-4c26-9411-fe06a9b4c9cd-config-data\") pod \"keystone-cron-29411761-jk55l\" (UID: \"803db126-60e5-4c26-9411-fe06a9b4c9cd\") " pod="openstack/keystone-cron-29411761-jk55l" Dec 02 20:01:00 crc kubenswrapper[4878]: I1202 20:01:00.391453 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng9tg\" (UniqueName: \"kubernetes.io/projected/803db126-60e5-4c26-9411-fe06a9b4c9cd-kube-api-access-ng9tg\") pod \"keystone-cron-29411761-jk55l\" (UID: \"803db126-60e5-4c26-9411-fe06a9b4c9cd\") " pod="openstack/keystone-cron-29411761-jk55l" Dec 02 20:01:00 crc kubenswrapper[4878]: I1202 20:01:00.391537 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803db126-60e5-4c26-9411-fe06a9b4c9cd-combined-ca-bundle\") pod \"keystone-cron-29411761-jk55l\" (UID: \"803db126-60e5-4c26-9411-fe06a9b4c9cd\") " pod="openstack/keystone-cron-29411761-jk55l" Dec 02 20:01:00 crc kubenswrapper[4878]: I1202 20:01:00.391627 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/803db126-60e5-4c26-9411-fe06a9b4c9cd-fernet-keys\") pod \"keystone-cron-29411761-jk55l\" (UID: \"803db126-60e5-4c26-9411-fe06a9b4c9cd\") " pod="openstack/keystone-cron-29411761-jk55l" Dec 02 20:01:00 crc kubenswrapper[4878]: I1202 20:01:00.391708 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/803db126-60e5-4c26-9411-fe06a9b4c9cd-config-data\") pod \"keystone-cron-29411761-jk55l\" (UID: \"803db126-60e5-4c26-9411-fe06a9b4c9cd\") " pod="openstack/keystone-cron-29411761-jk55l" Dec 02 20:01:00 crc kubenswrapper[4878]: I1202 20:01:00.397864 4878 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803db126-60e5-4c26-9411-fe06a9b4c9cd-combined-ca-bundle\") pod \"keystone-cron-29411761-jk55l\" (UID: \"803db126-60e5-4c26-9411-fe06a9b4c9cd\") " pod="openstack/keystone-cron-29411761-jk55l" Dec 02 20:01:00 crc kubenswrapper[4878]: I1202 20:01:00.399606 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/803db126-60e5-4c26-9411-fe06a9b4c9cd-fernet-keys\") pod \"keystone-cron-29411761-jk55l\" (UID: \"803db126-60e5-4c26-9411-fe06a9b4c9cd\") " pod="openstack/keystone-cron-29411761-jk55l" Dec 02 20:01:00 crc kubenswrapper[4878]: I1202 20:01:00.404950 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/803db126-60e5-4c26-9411-fe06a9b4c9cd-config-data\") pod \"keystone-cron-29411761-jk55l\" (UID: \"803db126-60e5-4c26-9411-fe06a9b4c9cd\") " pod="openstack/keystone-cron-29411761-jk55l" Dec 02 20:01:00 crc kubenswrapper[4878]: I1202 20:01:00.413761 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng9tg\" (UniqueName: \"kubernetes.io/projected/803db126-60e5-4c26-9411-fe06a9b4c9cd-kube-api-access-ng9tg\") pod \"keystone-cron-29411761-jk55l\" (UID: \"803db126-60e5-4c26-9411-fe06a9b4c9cd\") " pod="openstack/keystone-cron-29411761-jk55l" Dec 02 20:01:00 crc kubenswrapper[4878]: I1202 20:01:00.502388 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29411761-jk55l" Dec 02 20:01:01 crc kubenswrapper[4878]: I1202 20:01:01.035566 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29411761-jk55l"] Dec 02 20:01:01 crc kubenswrapper[4878]: I1202 20:01:01.737722 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411761-jk55l" event={"ID":"803db126-60e5-4c26-9411-fe06a9b4c9cd","Type":"ContainerStarted","Data":"53b735432c7b03f96a9fd3e39337a04d70813eb94518281037bdcbfab2defdbd"} Dec 02 20:01:01 crc kubenswrapper[4878]: I1202 20:01:01.739681 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411761-jk55l" event={"ID":"803db126-60e5-4c26-9411-fe06a9b4c9cd","Type":"ContainerStarted","Data":"b686e47c58d633c69283740e11c14cd6cd9369a5c6cd71b1726aa64dfd264822"} Dec 02 20:01:01 crc kubenswrapper[4878]: I1202 20:01:01.766850 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29411761-jk55l" podStartSLOduration=1.766817192 podStartE2EDuration="1.766817192s" podCreationTimestamp="2025-12-02 20:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:01:01.753808295 +0000 UTC m=+6371.443427216" watchObservedRunningTime="2025-12-02 20:01:01.766817192 +0000 UTC m=+6371.456436113" Dec 02 20:01:03 crc kubenswrapper[4878]: I1202 20:01:03.779354 4878 generic.go:334] "Generic (PLEG): container finished" podID="803db126-60e5-4c26-9411-fe06a9b4c9cd" containerID="53b735432c7b03f96a9fd3e39337a04d70813eb94518281037bdcbfab2defdbd" exitCode=0 Dec 02 20:01:03 crc kubenswrapper[4878]: I1202 20:01:03.779457 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411761-jk55l" 
event={"ID":"803db126-60e5-4c26-9411-fe06a9b4c9cd","Type":"ContainerDied","Data":"53b735432c7b03f96a9fd3e39337a04d70813eb94518281037bdcbfab2defdbd"} Dec 02 20:01:05 crc kubenswrapper[4878]: I1202 20:01:05.274010 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29411761-jk55l" Dec 02 20:01:05 crc kubenswrapper[4878]: I1202 20:01:05.446719 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/803db126-60e5-4c26-9411-fe06a9b4c9cd-fernet-keys\") pod \"803db126-60e5-4c26-9411-fe06a9b4c9cd\" (UID: \"803db126-60e5-4c26-9411-fe06a9b4c9cd\") " Dec 02 20:01:05 crc kubenswrapper[4878]: I1202 20:01:05.446843 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/803db126-60e5-4c26-9411-fe06a9b4c9cd-config-data\") pod \"803db126-60e5-4c26-9411-fe06a9b4c9cd\" (UID: \"803db126-60e5-4c26-9411-fe06a9b4c9cd\") " Dec 02 20:01:05 crc kubenswrapper[4878]: I1202 20:01:05.447050 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803db126-60e5-4c26-9411-fe06a9b4c9cd-combined-ca-bundle\") pod \"803db126-60e5-4c26-9411-fe06a9b4c9cd\" (UID: \"803db126-60e5-4c26-9411-fe06a9b4c9cd\") " Dec 02 20:01:05 crc kubenswrapper[4878]: I1202 20:01:05.447170 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng9tg\" (UniqueName: \"kubernetes.io/projected/803db126-60e5-4c26-9411-fe06a9b4c9cd-kube-api-access-ng9tg\") pod \"803db126-60e5-4c26-9411-fe06a9b4c9cd\" (UID: \"803db126-60e5-4c26-9411-fe06a9b4c9cd\") " Dec 02 20:01:05 crc kubenswrapper[4878]: I1202 20:01:05.457637 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803db126-60e5-4c26-9411-fe06a9b4c9cd-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "803db126-60e5-4c26-9411-fe06a9b4c9cd" (UID: "803db126-60e5-4c26-9411-fe06a9b4c9cd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:01:05 crc kubenswrapper[4878]: I1202 20:01:05.457768 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/803db126-60e5-4c26-9411-fe06a9b4c9cd-kube-api-access-ng9tg" (OuterVolumeSpecName: "kube-api-access-ng9tg") pod "803db126-60e5-4c26-9411-fe06a9b4c9cd" (UID: "803db126-60e5-4c26-9411-fe06a9b4c9cd"). InnerVolumeSpecName "kube-api-access-ng9tg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:01:05 crc kubenswrapper[4878]: I1202 20:01:05.486099 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803db126-60e5-4c26-9411-fe06a9b4c9cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "803db126-60e5-4c26-9411-fe06a9b4c9cd" (UID: "803db126-60e5-4c26-9411-fe06a9b4c9cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:01:05 crc kubenswrapper[4878]: I1202 20:01:05.538717 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803db126-60e5-4c26-9411-fe06a9b4c9cd-config-data" (OuterVolumeSpecName: "config-data") pod "803db126-60e5-4c26-9411-fe06a9b4c9cd" (UID: "803db126-60e5-4c26-9411-fe06a9b4c9cd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:01:05 crc kubenswrapper[4878]: I1202 20:01:05.550613 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng9tg\" (UniqueName: \"kubernetes.io/projected/803db126-60e5-4c26-9411-fe06a9b4c9cd-kube-api-access-ng9tg\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:05 crc kubenswrapper[4878]: I1202 20:01:05.551214 4878 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/803db126-60e5-4c26-9411-fe06a9b4c9cd-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:05 crc kubenswrapper[4878]: I1202 20:01:05.551427 4878 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/803db126-60e5-4c26-9411-fe06a9b4c9cd-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:05 crc kubenswrapper[4878]: I1202 20:01:05.551568 4878 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803db126-60e5-4c26-9411-fe06a9b4c9cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:05 crc kubenswrapper[4878]: I1202 20:01:05.805827 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411761-jk55l" event={"ID":"803db126-60e5-4c26-9411-fe06a9b4c9cd","Type":"ContainerDied","Data":"b686e47c58d633c69283740e11c14cd6cd9369a5c6cd71b1726aa64dfd264822"} Dec 02 20:01:05 crc kubenswrapper[4878]: I1202 20:01:05.805874 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29411761-jk55l" Dec 02 20:01:05 crc kubenswrapper[4878]: I1202 20:01:05.805875 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b686e47c58d633c69283740e11c14cd6cd9369a5c6cd71b1726aa64dfd264822" Dec 02 20:01:51 crc kubenswrapper[4878]: I1202 20:01:51.292212 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vz8qb"] Dec 02 20:01:51 crc kubenswrapper[4878]: E1202 20:01:51.294624 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803db126-60e5-4c26-9411-fe06a9b4c9cd" containerName="keystone-cron" Dec 02 20:01:51 crc kubenswrapper[4878]: I1202 20:01:51.294665 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="803db126-60e5-4c26-9411-fe06a9b4c9cd" containerName="keystone-cron" Dec 02 20:01:51 crc kubenswrapper[4878]: I1202 20:01:51.295367 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="803db126-60e5-4c26-9411-fe06a9b4c9cd" containerName="keystone-cron" Dec 02 20:01:51 crc kubenswrapper[4878]: I1202 20:01:51.301121 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vz8qb" Dec 02 20:01:51 crc kubenswrapper[4878]: I1202 20:01:51.303957 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vz8qb"] Dec 02 20:01:51 crc kubenswrapper[4878]: I1202 20:01:51.438952 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4bcbf04-6f94-417d-a660-5ae12df453f6-catalog-content\") pod \"redhat-operators-vz8qb\" (UID: \"b4bcbf04-6f94-417d-a660-5ae12df453f6\") " pod="openshift-marketplace/redhat-operators-vz8qb" Dec 02 20:01:51 crc kubenswrapper[4878]: I1202 20:01:51.439031 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4bcbf04-6f94-417d-a660-5ae12df453f6-utilities\") pod \"redhat-operators-vz8qb\" (UID: \"b4bcbf04-6f94-417d-a660-5ae12df453f6\") " pod="openshift-marketplace/redhat-operators-vz8qb" Dec 02 20:01:51 crc kubenswrapper[4878]: I1202 20:01:51.440283 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzzjj\" (UniqueName: \"kubernetes.io/projected/b4bcbf04-6f94-417d-a660-5ae12df453f6-kube-api-access-kzzjj\") pod \"redhat-operators-vz8qb\" (UID: \"b4bcbf04-6f94-417d-a660-5ae12df453f6\") " pod="openshift-marketplace/redhat-operators-vz8qb" Dec 02 20:01:51 crc kubenswrapper[4878]: I1202 20:01:51.542442 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzzjj\" (UniqueName: \"kubernetes.io/projected/b4bcbf04-6f94-417d-a660-5ae12df453f6-kube-api-access-kzzjj\") pod \"redhat-operators-vz8qb\" (UID: \"b4bcbf04-6f94-417d-a660-5ae12df453f6\") " pod="openshift-marketplace/redhat-operators-vz8qb" Dec 02 20:01:51 crc kubenswrapper[4878]: I1202 20:01:51.542835 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4bcbf04-6f94-417d-a660-5ae12df453f6-catalog-content\") pod \"redhat-operators-vz8qb\" (UID: \"b4bcbf04-6f94-417d-a660-5ae12df453f6\") " pod="openshift-marketplace/redhat-operators-vz8qb" Dec 02 20:01:51 crc kubenswrapper[4878]: I1202 20:01:51.542957 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4bcbf04-6f94-417d-a660-5ae12df453f6-utilities\") pod \"redhat-operators-vz8qb\" (UID: \"b4bcbf04-6f94-417d-a660-5ae12df453f6\") " pod="openshift-marketplace/redhat-operators-vz8qb" Dec 02 20:01:51 crc kubenswrapper[4878]: I1202 20:01:51.543679 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4bcbf04-6f94-417d-a660-5ae12df453f6-utilities\") pod \"redhat-operators-vz8qb\" (UID: \"b4bcbf04-6f94-417d-a660-5ae12df453f6\") " pod="openshift-marketplace/redhat-operators-vz8qb" Dec 02 20:01:51 crc kubenswrapper[4878]: I1202 20:01:51.543899 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4bcbf04-6f94-417d-a660-5ae12df453f6-catalog-content\") pod \"redhat-operators-vz8qb\" (UID: \"b4bcbf04-6f94-417d-a660-5ae12df453f6\") " pod="openshift-marketplace/redhat-operators-vz8qb" Dec 02 20:01:51 crc kubenswrapper[4878]: I1202 20:01:51.565763 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzzjj\" (UniqueName: \"kubernetes.io/projected/b4bcbf04-6f94-417d-a660-5ae12df453f6-kube-api-access-kzzjj\") pod \"redhat-operators-vz8qb\" (UID: \"b4bcbf04-6f94-417d-a660-5ae12df453f6\") " pod="openshift-marketplace/redhat-operators-vz8qb" Dec 02 20:01:51 crc kubenswrapper[4878]: I1202 20:01:51.631299 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vz8qb" Dec 02 20:01:52 crc kubenswrapper[4878]: I1202 20:01:52.119427 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vz8qb"] Dec 02 20:01:52 crc kubenswrapper[4878]: I1202 20:01:52.394032 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vz8qb" event={"ID":"b4bcbf04-6f94-417d-a660-5ae12df453f6","Type":"ContainerStarted","Data":"548a6cff817fd8dd58e3475a49e84a7329d7a9857a2c330d76232c146d190b6b"} Dec 02 20:01:53 crc kubenswrapper[4878]: I1202 20:01:53.409263 4878 generic.go:334] "Generic (PLEG): container finished" podID="b4bcbf04-6f94-417d-a660-5ae12df453f6" containerID="c45adaec8f263746f16c2db519d1353d675ec28f3edf571bf3d8ba5e00157a48" exitCode=0 Dec 02 20:01:53 crc kubenswrapper[4878]: I1202 20:01:53.409788 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vz8qb" event={"ID":"b4bcbf04-6f94-417d-a660-5ae12df453f6","Type":"ContainerDied","Data":"c45adaec8f263746f16c2db519d1353d675ec28f3edf571bf3d8ba5e00157a48"} Dec 02 20:01:53 crc kubenswrapper[4878]: I1202 20:01:53.412480 4878 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 20:01:55 crc kubenswrapper[4878]: I1202 20:01:55.429487 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vz8qb" event={"ID":"b4bcbf04-6f94-417d-a660-5ae12df453f6","Type":"ContainerStarted","Data":"9f2d78e310fff0fb4537162fc204e1dded3bb91da454a2ae2e1ed0f2b3c81c04"} Dec 02 20:02:00 crc kubenswrapper[4878]: I1202 20:02:00.482460 4878 generic.go:334] "Generic (PLEG): container finished" podID="b4bcbf04-6f94-417d-a660-5ae12df453f6" containerID="9f2d78e310fff0fb4537162fc204e1dded3bb91da454a2ae2e1ed0f2b3c81c04" exitCode=0 Dec 02 20:02:00 crc kubenswrapper[4878]: I1202 20:02:00.482530 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-vz8qb" event={"ID":"b4bcbf04-6f94-417d-a660-5ae12df453f6","Type":"ContainerDied","Data":"9f2d78e310fff0fb4537162fc204e1dded3bb91da454a2ae2e1ed0f2b3c81c04"} Dec 02 20:02:04 crc kubenswrapper[4878]: I1202 20:02:04.538187 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vz8qb" event={"ID":"b4bcbf04-6f94-417d-a660-5ae12df453f6","Type":"ContainerStarted","Data":"2629cb7a720681506027aee3264a9a6f6375214155cc5510173aa5a50738857d"} Dec 02 20:02:04 crc kubenswrapper[4878]: I1202 20:02:04.572400 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vz8qb" podStartSLOduration=3.520401982 podStartE2EDuration="13.572376653s" podCreationTimestamp="2025-12-02 20:01:51 +0000 UTC" firstStartedPulling="2025-12-02 20:01:53.412115024 +0000 UTC m=+6423.101733915" lastFinishedPulling="2025-12-02 20:02:03.464089705 +0000 UTC m=+6433.153708586" observedRunningTime="2025-12-02 20:02:04.558337725 +0000 UTC m=+6434.247956646" watchObservedRunningTime="2025-12-02 20:02:04.572376653 +0000 UTC m=+6434.261995544" Dec 02 20:02:11 crc kubenswrapper[4878]: I1202 20:02:11.631672 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vz8qb" Dec 02 20:02:11 crc kubenswrapper[4878]: I1202 20:02:11.632223 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vz8qb" Dec 02 20:02:12 crc kubenswrapper[4878]: I1202 20:02:12.688488 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vz8qb" podUID="b4bcbf04-6f94-417d-a660-5ae12df453f6" containerName="registry-server" probeResult="failure" output=< Dec 02 20:02:12 crc kubenswrapper[4878]: timeout: failed to connect service ":50051" within 1s Dec 02 20:02:12 crc kubenswrapper[4878]: > Dec 02 20:02:21 crc kubenswrapper[4878]: I1202 
20:02:21.687814 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vz8qb" Dec 02 20:02:21 crc kubenswrapper[4878]: I1202 20:02:21.759422 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vz8qb" Dec 02 20:02:22 crc kubenswrapper[4878]: I1202 20:02:22.489663 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vz8qb"] Dec 02 20:02:22 crc kubenswrapper[4878]: I1202 20:02:22.797725 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vz8qb" podUID="b4bcbf04-6f94-417d-a660-5ae12df453f6" containerName="registry-server" containerID="cri-o://2629cb7a720681506027aee3264a9a6f6375214155cc5510173aa5a50738857d" gracePeriod=2 Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.332328 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vz8qb" Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.502502 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4bcbf04-6f94-417d-a660-5ae12df453f6-catalog-content\") pod \"b4bcbf04-6f94-417d-a660-5ae12df453f6\" (UID: \"b4bcbf04-6f94-417d-a660-5ae12df453f6\") " Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.502782 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzzjj\" (UniqueName: \"kubernetes.io/projected/b4bcbf04-6f94-417d-a660-5ae12df453f6-kube-api-access-kzzjj\") pod \"b4bcbf04-6f94-417d-a660-5ae12df453f6\" (UID: \"b4bcbf04-6f94-417d-a660-5ae12df453f6\") " Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.502880 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b4bcbf04-6f94-417d-a660-5ae12df453f6-utilities\") pod \"b4bcbf04-6f94-417d-a660-5ae12df453f6\" (UID: \"b4bcbf04-6f94-417d-a660-5ae12df453f6\") " Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.503559 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4bcbf04-6f94-417d-a660-5ae12df453f6-utilities" (OuterVolumeSpecName: "utilities") pod "b4bcbf04-6f94-417d-a660-5ae12df453f6" (UID: "b4bcbf04-6f94-417d-a660-5ae12df453f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.503966 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4bcbf04-6f94-417d-a660-5ae12df453f6-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.509486 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4bcbf04-6f94-417d-a660-5ae12df453f6-kube-api-access-kzzjj" (OuterVolumeSpecName: "kube-api-access-kzzjj") pod "b4bcbf04-6f94-417d-a660-5ae12df453f6" (UID: "b4bcbf04-6f94-417d-a660-5ae12df453f6"). InnerVolumeSpecName "kube-api-access-kzzjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.606174 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzzjj\" (UniqueName: \"kubernetes.io/projected/b4bcbf04-6f94-417d-a660-5ae12df453f6-kube-api-access-kzzjj\") on node \"crc\" DevicePath \"\"" Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.652307 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4bcbf04-6f94-417d-a660-5ae12df453f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4bcbf04-6f94-417d-a660-5ae12df453f6" (UID: "b4bcbf04-6f94-417d-a660-5ae12df453f6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.707445 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4bcbf04-6f94-417d-a660-5ae12df453f6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.742470 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.742535 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.810975 4878 generic.go:334] "Generic (PLEG): container finished" podID="b4bcbf04-6f94-417d-a660-5ae12df453f6" containerID="2629cb7a720681506027aee3264a9a6f6375214155cc5510173aa5a50738857d" exitCode=0 Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.811018 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vz8qb" event={"ID":"b4bcbf04-6f94-417d-a660-5ae12df453f6","Type":"ContainerDied","Data":"2629cb7a720681506027aee3264a9a6f6375214155cc5510173aa5a50738857d"} Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.811042 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vz8qb" event={"ID":"b4bcbf04-6f94-417d-a660-5ae12df453f6","Type":"ContainerDied","Data":"548a6cff817fd8dd58e3475a49e84a7329d7a9857a2c330d76232c146d190b6b"} Dec 02 20:02:23 crc 
kubenswrapper[4878]: I1202 20:02:23.811058 4878 scope.go:117] "RemoveContainer" containerID="2629cb7a720681506027aee3264a9a6f6375214155cc5510173aa5a50738857d" Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.811075 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vz8qb" Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.851439 4878 scope.go:117] "RemoveContainer" containerID="9f2d78e310fff0fb4537162fc204e1dded3bb91da454a2ae2e1ed0f2b3c81c04" Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.854326 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vz8qb"] Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.867344 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vz8qb"] Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.878120 4878 scope.go:117] "RemoveContainer" containerID="c45adaec8f263746f16c2db519d1353d675ec28f3edf571bf3d8ba5e00157a48" Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.948939 4878 scope.go:117] "RemoveContainer" containerID="2629cb7a720681506027aee3264a9a6f6375214155cc5510173aa5a50738857d" Dec 02 20:02:23 crc kubenswrapper[4878]: E1202 20:02:23.950847 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2629cb7a720681506027aee3264a9a6f6375214155cc5510173aa5a50738857d\": container with ID starting with 2629cb7a720681506027aee3264a9a6f6375214155cc5510173aa5a50738857d not found: ID does not exist" containerID="2629cb7a720681506027aee3264a9a6f6375214155cc5510173aa5a50738857d" Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.950883 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2629cb7a720681506027aee3264a9a6f6375214155cc5510173aa5a50738857d"} err="failed to get container status 
\"2629cb7a720681506027aee3264a9a6f6375214155cc5510173aa5a50738857d\": rpc error: code = NotFound desc = could not find container \"2629cb7a720681506027aee3264a9a6f6375214155cc5510173aa5a50738857d\": container with ID starting with 2629cb7a720681506027aee3264a9a6f6375214155cc5510173aa5a50738857d not found: ID does not exist" Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.950910 4878 scope.go:117] "RemoveContainer" containerID="9f2d78e310fff0fb4537162fc204e1dded3bb91da454a2ae2e1ed0f2b3c81c04" Dec 02 20:02:23 crc kubenswrapper[4878]: E1202 20:02:23.951310 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f2d78e310fff0fb4537162fc204e1dded3bb91da454a2ae2e1ed0f2b3c81c04\": container with ID starting with 9f2d78e310fff0fb4537162fc204e1dded3bb91da454a2ae2e1ed0f2b3c81c04 not found: ID does not exist" containerID="9f2d78e310fff0fb4537162fc204e1dded3bb91da454a2ae2e1ed0f2b3c81c04" Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.951369 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f2d78e310fff0fb4537162fc204e1dded3bb91da454a2ae2e1ed0f2b3c81c04"} err="failed to get container status \"9f2d78e310fff0fb4537162fc204e1dded3bb91da454a2ae2e1ed0f2b3c81c04\": rpc error: code = NotFound desc = could not find container \"9f2d78e310fff0fb4537162fc204e1dded3bb91da454a2ae2e1ed0f2b3c81c04\": container with ID starting with 9f2d78e310fff0fb4537162fc204e1dded3bb91da454a2ae2e1ed0f2b3c81c04 not found: ID does not exist" Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.951404 4878 scope.go:117] "RemoveContainer" containerID="c45adaec8f263746f16c2db519d1353d675ec28f3edf571bf3d8ba5e00157a48" Dec 02 20:02:23 crc kubenswrapper[4878]: E1202 20:02:23.951859 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c45adaec8f263746f16c2db519d1353d675ec28f3edf571bf3d8ba5e00157a48\": container with ID starting with c45adaec8f263746f16c2db519d1353d675ec28f3edf571bf3d8ba5e00157a48 not found: ID does not exist" containerID="c45adaec8f263746f16c2db519d1353d675ec28f3edf571bf3d8ba5e00157a48" Dec 02 20:02:23 crc kubenswrapper[4878]: I1202 20:02:23.951895 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c45adaec8f263746f16c2db519d1353d675ec28f3edf571bf3d8ba5e00157a48"} err="failed to get container status \"c45adaec8f263746f16c2db519d1353d675ec28f3edf571bf3d8ba5e00157a48\": rpc error: code = NotFound desc = could not find container \"c45adaec8f263746f16c2db519d1353d675ec28f3edf571bf3d8ba5e00157a48\": container with ID starting with c45adaec8f263746f16c2db519d1353d675ec28f3edf571bf3d8ba5e00157a48 not found: ID does not exist" Dec 02 20:02:24 crc kubenswrapper[4878]: I1202 20:02:24.954194 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4bcbf04-6f94-417d-a660-5ae12df453f6" path="/var/lib/kubelet/pods/b4bcbf04-6f94-417d-a660-5ae12df453f6/volumes" Dec 02 20:02:39 crc kubenswrapper[4878]: I1202 20:02:39.329059 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8sbrm/must-gather-jktfm"] Dec 02 20:02:39 crc kubenswrapper[4878]: E1202 20:02:39.330041 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4bcbf04-6f94-417d-a660-5ae12df453f6" containerName="registry-server" Dec 02 20:02:39 crc kubenswrapper[4878]: I1202 20:02:39.330057 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4bcbf04-6f94-417d-a660-5ae12df453f6" containerName="registry-server" Dec 02 20:02:39 crc kubenswrapper[4878]: E1202 20:02:39.330113 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4bcbf04-6f94-417d-a660-5ae12df453f6" containerName="extract-content" Dec 02 20:02:39 crc kubenswrapper[4878]: I1202 20:02:39.330122 4878 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b4bcbf04-6f94-417d-a660-5ae12df453f6" containerName="extract-content" Dec 02 20:02:39 crc kubenswrapper[4878]: E1202 20:02:39.330147 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4bcbf04-6f94-417d-a660-5ae12df453f6" containerName="extract-utilities" Dec 02 20:02:39 crc kubenswrapper[4878]: I1202 20:02:39.330156 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4bcbf04-6f94-417d-a660-5ae12df453f6" containerName="extract-utilities" Dec 02 20:02:39 crc kubenswrapper[4878]: I1202 20:02:39.330398 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4bcbf04-6f94-417d-a660-5ae12df453f6" containerName="registry-server" Dec 02 20:02:39 crc kubenswrapper[4878]: I1202 20:02:39.334761 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8sbrm/must-gather-jktfm" Dec 02 20:02:39 crc kubenswrapper[4878]: I1202 20:02:39.344274 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8sbrm"/"kube-root-ca.crt" Dec 02 20:02:39 crc kubenswrapper[4878]: I1202 20:02:39.344274 4878 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8sbrm"/"openshift-service-ca.crt" Dec 02 20:02:39 crc kubenswrapper[4878]: I1202 20:02:39.368729 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8sbrm/must-gather-jktfm"] Dec 02 20:02:39 crc kubenswrapper[4878]: I1202 20:02:39.434147 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jj42\" (UniqueName: \"kubernetes.io/projected/1c170e0c-9765-42e1-b77c-6c881ef202fa-kube-api-access-9jj42\") pod \"must-gather-jktfm\" (UID: \"1c170e0c-9765-42e1-b77c-6c881ef202fa\") " pod="openshift-must-gather-8sbrm/must-gather-jktfm" Dec 02 20:02:39 crc kubenswrapper[4878]: I1202 20:02:39.434470 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1c170e0c-9765-42e1-b77c-6c881ef202fa-must-gather-output\") pod \"must-gather-jktfm\" (UID: \"1c170e0c-9765-42e1-b77c-6c881ef202fa\") " pod="openshift-must-gather-8sbrm/must-gather-jktfm" Dec 02 20:02:39 crc kubenswrapper[4878]: I1202 20:02:39.536206 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jj42\" (UniqueName: \"kubernetes.io/projected/1c170e0c-9765-42e1-b77c-6c881ef202fa-kube-api-access-9jj42\") pod \"must-gather-jktfm\" (UID: \"1c170e0c-9765-42e1-b77c-6c881ef202fa\") " pod="openshift-must-gather-8sbrm/must-gather-jktfm" Dec 02 20:02:39 crc kubenswrapper[4878]: I1202 20:02:39.536405 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1c170e0c-9765-42e1-b77c-6c881ef202fa-must-gather-output\") pod \"must-gather-jktfm\" (UID: \"1c170e0c-9765-42e1-b77c-6c881ef202fa\") " pod="openshift-must-gather-8sbrm/must-gather-jktfm" Dec 02 20:02:39 crc kubenswrapper[4878]: I1202 20:02:39.536923 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1c170e0c-9765-42e1-b77c-6c881ef202fa-must-gather-output\") pod \"must-gather-jktfm\" (UID: \"1c170e0c-9765-42e1-b77c-6c881ef202fa\") " pod="openshift-must-gather-8sbrm/must-gather-jktfm" Dec 02 20:02:39 crc kubenswrapper[4878]: I1202 20:02:39.563520 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jj42\" (UniqueName: \"kubernetes.io/projected/1c170e0c-9765-42e1-b77c-6c881ef202fa-kube-api-access-9jj42\") pod \"must-gather-jktfm\" (UID: \"1c170e0c-9765-42e1-b77c-6c881ef202fa\") " pod="openshift-must-gather-8sbrm/must-gather-jktfm" Dec 02 20:02:39 crc kubenswrapper[4878]: I1202 20:02:39.690596 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8sbrm/must-gather-jktfm" Dec 02 20:02:40 crc kubenswrapper[4878]: I1202 20:02:40.205350 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8sbrm/must-gather-jktfm"] Dec 02 20:02:41 crc kubenswrapper[4878]: I1202 20:02:41.049353 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8sbrm/must-gather-jktfm" event={"ID":"1c170e0c-9765-42e1-b77c-6c881ef202fa","Type":"ContainerStarted","Data":"a357e1c624f7d1e857b65cefcd06cf00bae1b5dedab7c8507a18578fa887e059"} Dec 02 20:02:41 crc kubenswrapper[4878]: I1202 20:02:41.049752 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8sbrm/must-gather-jktfm" event={"ID":"1c170e0c-9765-42e1-b77c-6c881ef202fa","Type":"ContainerStarted","Data":"f5a8bb270a202990193c9664b336ea6b1623fe01a4fccfbb87d5e9e62ad3437b"} Dec 02 20:02:41 crc kubenswrapper[4878]: I1202 20:02:41.049761 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8sbrm/must-gather-jktfm" event={"ID":"1c170e0c-9765-42e1-b77c-6c881ef202fa","Type":"ContainerStarted","Data":"99e963eb8182328c30c1755ec33eb1a95febc2cfb601608b8d4b9a6bae68a693"} Dec 02 20:02:41 crc kubenswrapper[4878]: I1202 20:02:41.068091 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8sbrm/must-gather-jktfm" podStartSLOduration=2.068070516 podStartE2EDuration="2.068070516s" podCreationTimestamp="2025-12-02 20:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:02:41.067198519 +0000 UTC m=+6470.756817430" watchObservedRunningTime="2025-12-02 20:02:41.068070516 +0000 UTC m=+6470.757689397" Dec 02 20:02:44 crc kubenswrapper[4878]: E1202 20:02:44.078916 4878 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.159:48786->38.102.83.159:41745: write tcp 
38.102.83.159:48786->38.102.83.159:41745: write: broken pipe Dec 02 20:02:44 crc kubenswrapper[4878]: I1202 20:02:44.917001 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8sbrm/crc-debug-fxj6f"] Dec 02 20:02:44 crc kubenswrapper[4878]: I1202 20:02:44.919473 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8sbrm/crc-debug-fxj6f" Dec 02 20:02:44 crc kubenswrapper[4878]: I1202 20:02:44.926206 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8sbrm"/"default-dockercfg-bm7gc" Dec 02 20:02:45 crc kubenswrapper[4878]: I1202 20:02:45.036795 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be9a5cd4-71e2-4db9-9301-9d02ab4afa86-host\") pod \"crc-debug-fxj6f\" (UID: \"be9a5cd4-71e2-4db9-9301-9d02ab4afa86\") " pod="openshift-must-gather-8sbrm/crc-debug-fxj6f" Dec 02 20:02:45 crc kubenswrapper[4878]: I1202 20:02:45.036855 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76ltl\" (UniqueName: \"kubernetes.io/projected/be9a5cd4-71e2-4db9-9301-9d02ab4afa86-kube-api-access-76ltl\") pod \"crc-debug-fxj6f\" (UID: \"be9a5cd4-71e2-4db9-9301-9d02ab4afa86\") " pod="openshift-must-gather-8sbrm/crc-debug-fxj6f" Dec 02 20:02:45 crc kubenswrapper[4878]: I1202 20:02:45.138900 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76ltl\" (UniqueName: \"kubernetes.io/projected/be9a5cd4-71e2-4db9-9301-9d02ab4afa86-kube-api-access-76ltl\") pod \"crc-debug-fxj6f\" (UID: \"be9a5cd4-71e2-4db9-9301-9d02ab4afa86\") " pod="openshift-must-gather-8sbrm/crc-debug-fxj6f" Dec 02 20:02:45 crc kubenswrapper[4878]: I1202 20:02:45.139152 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/be9a5cd4-71e2-4db9-9301-9d02ab4afa86-host\") pod \"crc-debug-fxj6f\" (UID: \"be9a5cd4-71e2-4db9-9301-9d02ab4afa86\") " pod="openshift-must-gather-8sbrm/crc-debug-fxj6f" Dec 02 20:02:45 crc kubenswrapper[4878]: I1202 20:02:45.139464 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be9a5cd4-71e2-4db9-9301-9d02ab4afa86-host\") pod \"crc-debug-fxj6f\" (UID: \"be9a5cd4-71e2-4db9-9301-9d02ab4afa86\") " pod="openshift-must-gather-8sbrm/crc-debug-fxj6f" Dec 02 20:02:45 crc kubenswrapper[4878]: I1202 20:02:45.158022 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76ltl\" (UniqueName: \"kubernetes.io/projected/be9a5cd4-71e2-4db9-9301-9d02ab4afa86-kube-api-access-76ltl\") pod \"crc-debug-fxj6f\" (UID: \"be9a5cd4-71e2-4db9-9301-9d02ab4afa86\") " pod="openshift-must-gather-8sbrm/crc-debug-fxj6f" Dec 02 20:02:45 crc kubenswrapper[4878]: I1202 20:02:45.326158 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8sbrm/crc-debug-fxj6f" Dec 02 20:02:46 crc kubenswrapper[4878]: I1202 20:02:46.106136 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8sbrm/crc-debug-fxj6f" event={"ID":"be9a5cd4-71e2-4db9-9301-9d02ab4afa86","Type":"ContainerStarted","Data":"e14b69342559563b96fb45fa41b6f8b0357962f2b7c8c6558ef8fa174432fec8"} Dec 02 20:02:46 crc kubenswrapper[4878]: I1202 20:02:46.107086 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8sbrm/crc-debug-fxj6f" event={"ID":"be9a5cd4-71e2-4db9-9301-9d02ab4afa86","Type":"ContainerStarted","Data":"353acd2ba2e16ba28a3e634d7fe87b11fd58da5d56cee109a09031c5c0146c0c"} Dec 02 20:02:46 crc kubenswrapper[4878]: I1202 20:02:46.128286 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8sbrm/crc-debug-fxj6f" podStartSLOduration=2.128268245 podStartE2EDuration="2.128268245s" podCreationTimestamp="2025-12-02 20:02:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:02:46.123371082 +0000 UTC m=+6475.812989963" watchObservedRunningTime="2025-12-02 20:02:46.128268245 +0000 UTC m=+6475.817887126" Dec 02 20:02:53 crc kubenswrapper[4878]: I1202 20:02:53.775117 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:02:53 crc kubenswrapper[4878]: I1202 20:02:53.775590 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 02 20:03:23 crc kubenswrapper[4878]: I1202 20:03:23.743264 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:03:23 crc kubenswrapper[4878]: I1202 20:03:23.743860 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:03:23 crc kubenswrapper[4878]: I1202 20:03:23.743926 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 20:03:23 crc kubenswrapper[4878]: I1202 20:03:23.745226 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f659bcb967b3ab0df4defdcfba9bdeebb874762a37661bdd5221e1047bf9dd43"} pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 20:03:23 crc kubenswrapper[4878]: I1202 20:03:23.745302 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://f659bcb967b3ab0df4defdcfba9bdeebb874762a37661bdd5221e1047bf9dd43" gracePeriod=600 Dec 02 20:03:24 crc kubenswrapper[4878]: I1202 20:03:24.551673 4878 generic.go:334] "Generic (PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" 
containerID="f659bcb967b3ab0df4defdcfba9bdeebb874762a37661bdd5221e1047bf9dd43" exitCode=0 Dec 02 20:03:24 crc kubenswrapper[4878]: I1202 20:03:24.551754 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"f659bcb967b3ab0df4defdcfba9bdeebb874762a37661bdd5221e1047bf9dd43"} Dec 02 20:03:24 crc kubenswrapper[4878]: I1202 20:03:24.552206 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798"} Dec 02 20:03:24 crc kubenswrapper[4878]: I1202 20:03:24.552253 4878 scope.go:117] "RemoveContainer" containerID="305830008e399872e43e36343e405c029b244a37076b46000d776be6fa81ca25" Dec 02 20:03:30 crc kubenswrapper[4878]: I1202 20:03:30.624375 4878 generic.go:334] "Generic (PLEG): container finished" podID="be9a5cd4-71e2-4db9-9301-9d02ab4afa86" containerID="e14b69342559563b96fb45fa41b6f8b0357962f2b7c8c6558ef8fa174432fec8" exitCode=0 Dec 02 20:03:30 crc kubenswrapper[4878]: I1202 20:03:30.624503 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8sbrm/crc-debug-fxj6f" event={"ID":"be9a5cd4-71e2-4db9-9301-9d02ab4afa86","Type":"ContainerDied","Data":"e14b69342559563b96fb45fa41b6f8b0357962f2b7c8c6558ef8fa174432fec8"} Dec 02 20:03:31 crc kubenswrapper[4878]: I1202 20:03:31.794099 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8sbrm/crc-debug-fxj6f" Dec 02 20:03:31 crc kubenswrapper[4878]: I1202 20:03:31.842283 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8sbrm/crc-debug-fxj6f"] Dec 02 20:03:31 crc kubenswrapper[4878]: I1202 20:03:31.863882 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8sbrm/crc-debug-fxj6f"] Dec 02 20:03:31 crc kubenswrapper[4878]: I1202 20:03:31.904213 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be9a5cd4-71e2-4db9-9301-9d02ab4afa86-host\") pod \"be9a5cd4-71e2-4db9-9301-9d02ab4afa86\" (UID: \"be9a5cd4-71e2-4db9-9301-9d02ab4afa86\") " Dec 02 20:03:31 crc kubenswrapper[4878]: I1202 20:03:31.904588 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76ltl\" (UniqueName: \"kubernetes.io/projected/be9a5cd4-71e2-4db9-9301-9d02ab4afa86-kube-api-access-76ltl\") pod \"be9a5cd4-71e2-4db9-9301-9d02ab4afa86\" (UID: \"be9a5cd4-71e2-4db9-9301-9d02ab4afa86\") " Dec 02 20:03:31 crc kubenswrapper[4878]: I1202 20:03:31.904822 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be9a5cd4-71e2-4db9-9301-9d02ab4afa86-host" (OuterVolumeSpecName: "host") pod "be9a5cd4-71e2-4db9-9301-9d02ab4afa86" (UID: "be9a5cd4-71e2-4db9-9301-9d02ab4afa86"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:03:31 crc kubenswrapper[4878]: I1202 20:03:31.905383 4878 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be9a5cd4-71e2-4db9-9301-9d02ab4afa86-host\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:31 crc kubenswrapper[4878]: I1202 20:03:31.916470 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be9a5cd4-71e2-4db9-9301-9d02ab4afa86-kube-api-access-76ltl" (OuterVolumeSpecName: "kube-api-access-76ltl") pod "be9a5cd4-71e2-4db9-9301-9d02ab4afa86" (UID: "be9a5cd4-71e2-4db9-9301-9d02ab4afa86"). InnerVolumeSpecName "kube-api-access-76ltl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:03:32 crc kubenswrapper[4878]: I1202 20:03:32.007535 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76ltl\" (UniqueName: \"kubernetes.io/projected/be9a5cd4-71e2-4db9-9301-9d02ab4afa86-kube-api-access-76ltl\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:32 crc kubenswrapper[4878]: I1202 20:03:32.651509 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="353acd2ba2e16ba28a3e634d7fe87b11fd58da5d56cee109a09031c5c0146c0c" Dec 02 20:03:32 crc kubenswrapper[4878]: I1202 20:03:32.651613 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8sbrm/crc-debug-fxj6f" Dec 02 20:03:32 crc kubenswrapper[4878]: I1202 20:03:32.953555 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be9a5cd4-71e2-4db9-9301-9d02ab4afa86" path="/var/lib/kubelet/pods/be9a5cd4-71e2-4db9-9301-9d02ab4afa86/volumes" Dec 02 20:03:33 crc kubenswrapper[4878]: I1202 20:03:33.229883 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8sbrm/crc-debug-llm2j"] Dec 02 20:03:33 crc kubenswrapper[4878]: E1202 20:03:33.230659 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be9a5cd4-71e2-4db9-9301-9d02ab4afa86" containerName="container-00" Dec 02 20:03:33 crc kubenswrapper[4878]: I1202 20:03:33.230759 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="be9a5cd4-71e2-4db9-9301-9d02ab4afa86" containerName="container-00" Dec 02 20:03:33 crc kubenswrapper[4878]: I1202 20:03:33.231089 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="be9a5cd4-71e2-4db9-9301-9d02ab4afa86" containerName="container-00" Dec 02 20:03:33 crc kubenswrapper[4878]: I1202 20:03:33.232015 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8sbrm/crc-debug-llm2j" Dec 02 20:03:33 crc kubenswrapper[4878]: I1202 20:03:33.238682 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8sbrm"/"default-dockercfg-bm7gc" Dec 02 20:03:33 crc kubenswrapper[4878]: I1202 20:03:33.337301 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j82r\" (UniqueName: \"kubernetes.io/projected/53de4204-a3c7-4c58-ab65-6cf6f641e7fe-kube-api-access-7j82r\") pod \"crc-debug-llm2j\" (UID: \"53de4204-a3c7-4c58-ab65-6cf6f641e7fe\") " pod="openshift-must-gather-8sbrm/crc-debug-llm2j" Dec 02 20:03:33 crc kubenswrapper[4878]: I1202 20:03:33.337723 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53de4204-a3c7-4c58-ab65-6cf6f641e7fe-host\") pod \"crc-debug-llm2j\" (UID: \"53de4204-a3c7-4c58-ab65-6cf6f641e7fe\") " pod="openshift-must-gather-8sbrm/crc-debug-llm2j" Dec 02 20:03:33 crc kubenswrapper[4878]: I1202 20:03:33.440312 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j82r\" (UniqueName: \"kubernetes.io/projected/53de4204-a3c7-4c58-ab65-6cf6f641e7fe-kube-api-access-7j82r\") pod \"crc-debug-llm2j\" (UID: \"53de4204-a3c7-4c58-ab65-6cf6f641e7fe\") " pod="openshift-must-gather-8sbrm/crc-debug-llm2j" Dec 02 20:03:33 crc kubenswrapper[4878]: I1202 20:03:33.440658 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53de4204-a3c7-4c58-ab65-6cf6f641e7fe-host\") pod \"crc-debug-llm2j\" (UID: \"53de4204-a3c7-4c58-ab65-6cf6f641e7fe\") " pod="openshift-must-gather-8sbrm/crc-debug-llm2j" Dec 02 20:03:33 crc kubenswrapper[4878]: I1202 20:03:33.440788 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/53de4204-a3c7-4c58-ab65-6cf6f641e7fe-host\") pod \"crc-debug-llm2j\" (UID: \"53de4204-a3c7-4c58-ab65-6cf6f641e7fe\") " pod="openshift-must-gather-8sbrm/crc-debug-llm2j" Dec 02 20:03:33 crc kubenswrapper[4878]: I1202 20:03:33.459695 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j82r\" (UniqueName: \"kubernetes.io/projected/53de4204-a3c7-4c58-ab65-6cf6f641e7fe-kube-api-access-7j82r\") pod \"crc-debug-llm2j\" (UID: \"53de4204-a3c7-4c58-ab65-6cf6f641e7fe\") " pod="openshift-must-gather-8sbrm/crc-debug-llm2j" Dec 02 20:03:33 crc kubenswrapper[4878]: I1202 20:03:33.553981 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8sbrm/crc-debug-llm2j" Dec 02 20:03:33 crc kubenswrapper[4878]: W1202 20:03:33.598972 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53de4204_a3c7_4c58_ab65_6cf6f641e7fe.slice/crio-02aeefa0f3b9deb1142f82214e1961277e5c0c7d4702845ecc8c6bb712675cc5 WatchSource:0}: Error finding container 02aeefa0f3b9deb1142f82214e1961277e5c0c7d4702845ecc8c6bb712675cc5: Status 404 returned error can't find the container with id 02aeefa0f3b9deb1142f82214e1961277e5c0c7d4702845ecc8c6bb712675cc5 Dec 02 20:03:33 crc kubenswrapper[4878]: I1202 20:03:33.683457 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8sbrm/crc-debug-llm2j" event={"ID":"53de4204-a3c7-4c58-ab65-6cf6f641e7fe","Type":"ContainerStarted","Data":"02aeefa0f3b9deb1142f82214e1961277e5c0c7d4702845ecc8c6bb712675cc5"} Dec 02 20:03:34 crc kubenswrapper[4878]: I1202 20:03:34.698528 4878 generic.go:334] "Generic (PLEG): container finished" podID="53de4204-a3c7-4c58-ab65-6cf6f641e7fe" containerID="ee1c282ca644bfd8eb01c1f50238086b1e4b7e7009ee14657278192b9764389b" exitCode=0 Dec 02 20:03:34 crc kubenswrapper[4878]: I1202 20:03:34.698626 4878 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-must-gather-8sbrm/crc-debug-llm2j" event={"ID":"53de4204-a3c7-4c58-ab65-6cf6f641e7fe","Type":"ContainerDied","Data":"ee1c282ca644bfd8eb01c1f50238086b1e4b7e7009ee14657278192b9764389b"} Dec 02 20:03:35 crc kubenswrapper[4878]: I1202 20:03:35.835816 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8sbrm/crc-debug-llm2j" Dec 02 20:03:35 crc kubenswrapper[4878]: I1202 20:03:35.913907 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53de4204-a3c7-4c58-ab65-6cf6f641e7fe-host\") pod \"53de4204-a3c7-4c58-ab65-6cf6f641e7fe\" (UID: \"53de4204-a3c7-4c58-ab65-6cf6f641e7fe\") " Dec 02 20:03:35 crc kubenswrapper[4878]: I1202 20:03:35.913952 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j82r\" (UniqueName: \"kubernetes.io/projected/53de4204-a3c7-4c58-ab65-6cf6f641e7fe-kube-api-access-7j82r\") pod \"53de4204-a3c7-4c58-ab65-6cf6f641e7fe\" (UID: \"53de4204-a3c7-4c58-ab65-6cf6f641e7fe\") " Dec 02 20:03:35 crc kubenswrapper[4878]: I1202 20:03:35.914094 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53de4204-a3c7-4c58-ab65-6cf6f641e7fe-host" (OuterVolumeSpecName: "host") pod "53de4204-a3c7-4c58-ab65-6cf6f641e7fe" (UID: "53de4204-a3c7-4c58-ab65-6cf6f641e7fe"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:03:35 crc kubenswrapper[4878]: I1202 20:03:35.914462 4878 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53de4204-a3c7-4c58-ab65-6cf6f641e7fe-host\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:35 crc kubenswrapper[4878]: I1202 20:03:35.919637 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53de4204-a3c7-4c58-ab65-6cf6f641e7fe-kube-api-access-7j82r" (OuterVolumeSpecName: "kube-api-access-7j82r") pod "53de4204-a3c7-4c58-ab65-6cf6f641e7fe" (UID: "53de4204-a3c7-4c58-ab65-6cf6f641e7fe"). InnerVolumeSpecName "kube-api-access-7j82r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:03:36 crc kubenswrapper[4878]: I1202 20:03:36.016021 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j82r\" (UniqueName: \"kubernetes.io/projected/53de4204-a3c7-4c58-ab65-6cf6f641e7fe-kube-api-access-7j82r\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:36 crc kubenswrapper[4878]: I1202 20:03:36.723676 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8sbrm/crc-debug-llm2j" event={"ID":"53de4204-a3c7-4c58-ab65-6cf6f641e7fe","Type":"ContainerDied","Data":"02aeefa0f3b9deb1142f82214e1961277e5c0c7d4702845ecc8c6bb712675cc5"} Dec 02 20:03:36 crc kubenswrapper[4878]: I1202 20:03:36.723727 4878 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02aeefa0f3b9deb1142f82214e1961277e5c0c7d4702845ecc8c6bb712675cc5" Dec 02 20:03:36 crc kubenswrapper[4878]: I1202 20:03:36.723740 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8sbrm/crc-debug-llm2j" Dec 02 20:03:37 crc kubenswrapper[4878]: I1202 20:03:37.092314 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8sbrm/crc-debug-llm2j"] Dec 02 20:03:37 crc kubenswrapper[4878]: I1202 20:03:37.106018 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8sbrm/crc-debug-llm2j"] Dec 02 20:03:38 crc kubenswrapper[4878]: I1202 20:03:38.288254 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8sbrm/crc-debug-zj6rm"] Dec 02 20:03:38 crc kubenswrapper[4878]: E1202 20:03:38.289690 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53de4204-a3c7-4c58-ab65-6cf6f641e7fe" containerName="container-00" Dec 02 20:03:38 crc kubenswrapper[4878]: I1202 20:03:38.289709 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="53de4204-a3c7-4c58-ab65-6cf6f641e7fe" containerName="container-00" Dec 02 20:03:38 crc kubenswrapper[4878]: I1202 20:03:38.290403 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="53de4204-a3c7-4c58-ab65-6cf6f641e7fe" containerName="container-00" Dec 02 20:03:38 crc kubenswrapper[4878]: I1202 20:03:38.292489 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8sbrm/crc-debug-zj6rm" Dec 02 20:03:38 crc kubenswrapper[4878]: I1202 20:03:38.297005 4878 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8sbrm"/"default-dockercfg-bm7gc" Dec 02 20:03:38 crc kubenswrapper[4878]: I1202 20:03:38.376795 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwgzz\" (UniqueName: \"kubernetes.io/projected/a49f90b3-ff52-445e-820c-37988aa9ed9b-kube-api-access-gwgzz\") pod \"crc-debug-zj6rm\" (UID: \"a49f90b3-ff52-445e-820c-37988aa9ed9b\") " pod="openshift-must-gather-8sbrm/crc-debug-zj6rm" Dec 02 20:03:38 crc kubenswrapper[4878]: I1202 20:03:38.376875 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a49f90b3-ff52-445e-820c-37988aa9ed9b-host\") pod \"crc-debug-zj6rm\" (UID: \"a49f90b3-ff52-445e-820c-37988aa9ed9b\") " pod="openshift-must-gather-8sbrm/crc-debug-zj6rm" Dec 02 20:03:38 crc kubenswrapper[4878]: I1202 20:03:38.479529 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwgzz\" (UniqueName: \"kubernetes.io/projected/a49f90b3-ff52-445e-820c-37988aa9ed9b-kube-api-access-gwgzz\") pod \"crc-debug-zj6rm\" (UID: \"a49f90b3-ff52-445e-820c-37988aa9ed9b\") " pod="openshift-must-gather-8sbrm/crc-debug-zj6rm" Dec 02 20:03:38 crc kubenswrapper[4878]: I1202 20:03:38.479605 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a49f90b3-ff52-445e-820c-37988aa9ed9b-host\") pod \"crc-debug-zj6rm\" (UID: \"a49f90b3-ff52-445e-820c-37988aa9ed9b\") " pod="openshift-must-gather-8sbrm/crc-debug-zj6rm" Dec 02 20:03:38 crc kubenswrapper[4878]: I1202 20:03:38.479774 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/a49f90b3-ff52-445e-820c-37988aa9ed9b-host\") pod \"crc-debug-zj6rm\" (UID: \"a49f90b3-ff52-445e-820c-37988aa9ed9b\") " pod="openshift-must-gather-8sbrm/crc-debug-zj6rm" Dec 02 20:03:38 crc kubenswrapper[4878]: I1202 20:03:38.498397 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwgzz\" (UniqueName: \"kubernetes.io/projected/a49f90b3-ff52-445e-820c-37988aa9ed9b-kube-api-access-gwgzz\") pod \"crc-debug-zj6rm\" (UID: \"a49f90b3-ff52-445e-820c-37988aa9ed9b\") " pod="openshift-must-gather-8sbrm/crc-debug-zj6rm" Dec 02 20:03:38 crc kubenswrapper[4878]: I1202 20:03:38.624740 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8sbrm/crc-debug-zj6rm" Dec 02 20:03:38 crc kubenswrapper[4878]: I1202 20:03:38.749804 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8sbrm/crc-debug-zj6rm" event={"ID":"a49f90b3-ff52-445e-820c-37988aa9ed9b","Type":"ContainerStarted","Data":"1d5804c27b07260d3e6a640dcfc64fedad9b93ff5cac08c559d932a12896fec9"} Dec 02 20:03:38 crc kubenswrapper[4878]: I1202 20:03:38.953563 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53de4204-a3c7-4c58-ab65-6cf6f641e7fe" path="/var/lib/kubelet/pods/53de4204-a3c7-4c58-ab65-6cf6f641e7fe/volumes" Dec 02 20:03:39 crc kubenswrapper[4878]: I1202 20:03:39.760865 4878 generic.go:334] "Generic (PLEG): container finished" podID="a49f90b3-ff52-445e-820c-37988aa9ed9b" containerID="5d4a6c0134b61bb4e5fa851903fad0478b212ea88d49a64bae6bbe90a1ea358b" exitCode=0 Dec 02 20:03:39 crc kubenswrapper[4878]: I1202 20:03:39.760916 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8sbrm/crc-debug-zj6rm" event={"ID":"a49f90b3-ff52-445e-820c-37988aa9ed9b","Type":"ContainerDied","Data":"5d4a6c0134b61bb4e5fa851903fad0478b212ea88d49a64bae6bbe90a1ea358b"} Dec 02 20:03:39 crc kubenswrapper[4878]: I1202 20:03:39.797886 4878 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-must-gather-8sbrm/crc-debug-zj6rm"] Dec 02 20:03:39 crc kubenswrapper[4878]: I1202 20:03:39.814598 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8sbrm/crc-debug-zj6rm"] Dec 02 20:03:40 crc kubenswrapper[4878]: I1202 20:03:40.917858 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8sbrm/crc-debug-zj6rm" Dec 02 20:03:41 crc kubenswrapper[4878]: I1202 20:03:41.035832 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a49f90b3-ff52-445e-820c-37988aa9ed9b-host\") pod \"a49f90b3-ff52-445e-820c-37988aa9ed9b\" (UID: \"a49f90b3-ff52-445e-820c-37988aa9ed9b\") " Dec 02 20:03:41 crc kubenswrapper[4878]: I1202 20:03:41.035899 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwgzz\" (UniqueName: \"kubernetes.io/projected/a49f90b3-ff52-445e-820c-37988aa9ed9b-kube-api-access-gwgzz\") pod \"a49f90b3-ff52-445e-820c-37988aa9ed9b\" (UID: \"a49f90b3-ff52-445e-820c-37988aa9ed9b\") " Dec 02 20:03:41 crc kubenswrapper[4878]: I1202 20:03:41.035980 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a49f90b3-ff52-445e-820c-37988aa9ed9b-host" (OuterVolumeSpecName: "host") pod "a49f90b3-ff52-445e-820c-37988aa9ed9b" (UID: "a49f90b3-ff52-445e-820c-37988aa9ed9b"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:03:41 crc kubenswrapper[4878]: I1202 20:03:41.036768 4878 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a49f90b3-ff52-445e-820c-37988aa9ed9b-host\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:41 crc kubenswrapper[4878]: I1202 20:03:41.041880 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a49f90b3-ff52-445e-820c-37988aa9ed9b-kube-api-access-gwgzz" (OuterVolumeSpecName: "kube-api-access-gwgzz") pod "a49f90b3-ff52-445e-820c-37988aa9ed9b" (UID: "a49f90b3-ff52-445e-820c-37988aa9ed9b"). InnerVolumeSpecName "kube-api-access-gwgzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:03:41 crc kubenswrapper[4878]: I1202 20:03:41.140352 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwgzz\" (UniqueName: \"kubernetes.io/projected/a49f90b3-ff52-445e-820c-37988aa9ed9b-kube-api-access-gwgzz\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:41 crc kubenswrapper[4878]: I1202 20:03:41.788865 4878 scope.go:117] "RemoveContainer" containerID="5d4a6c0134b61bb4e5fa851903fad0478b212ea88d49a64bae6bbe90a1ea358b" Dec 02 20:03:41 crc kubenswrapper[4878]: I1202 20:03:41.788928 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8sbrm/crc-debug-zj6rm" Dec 02 20:03:42 crc kubenswrapper[4878]: I1202 20:03:42.951406 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a49f90b3-ff52-445e-820c-37988aa9ed9b" path="/var/lib/kubelet/pods/a49f90b3-ff52-445e-820c-37988aa9ed9b/volumes" Dec 02 20:04:13 crc kubenswrapper[4878]: I1202 20:04:13.868756 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_09c21be8-b654-42b3-b5da-59d1afb0054b/aodh-api/0.log" Dec 02 20:04:14 crc kubenswrapper[4878]: I1202 20:04:14.020883 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_09c21be8-b654-42b3-b5da-59d1afb0054b/aodh-evaluator/0.log" Dec 02 20:04:14 crc kubenswrapper[4878]: I1202 20:04:14.091153 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_09c21be8-b654-42b3-b5da-59d1afb0054b/aodh-listener/0.log" Dec 02 20:04:14 crc kubenswrapper[4878]: I1202 20:04:14.158814 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_09c21be8-b654-42b3-b5da-59d1afb0054b/aodh-notifier/0.log" Dec 02 20:04:14 crc kubenswrapper[4878]: I1202 20:04:14.231609 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-58bf7d9584-2nldp_e21a97ae-d28a-4c3c-b669-bd186e06a311/barbican-api/0.log" Dec 02 20:04:14 crc kubenswrapper[4878]: I1202 20:04:14.310372 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-58bf7d9584-2nldp_e21a97ae-d28a-4c3c-b669-bd186e06a311/barbican-api-log/0.log" Dec 02 20:04:14 crc kubenswrapper[4878]: I1202 20:04:14.541839 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-69bfdb774b-284fr_e4828d24-fa12-4cf6-9e5b-8864d62c8536/barbican-keystone-listener/0.log" Dec 02 20:04:14 crc kubenswrapper[4878]: I1202 20:04:14.800560 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-69bfdb774b-284fr_e4828d24-fa12-4cf6-9e5b-8864d62c8536/barbican-keystone-listener-log/0.log" Dec 02 20:04:14 crc kubenswrapper[4878]: I1202 20:04:14.827022 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-54c9cb88c9-bzlhb_c992f2a3-c18e-470d-b4dc-168a9dcd8528/barbican-worker/0.log" Dec 02 20:04:14 crc kubenswrapper[4878]: I1202 20:04:14.921654 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-54c9cb88c9-bzlhb_c992f2a3-c18e-470d-b4dc-168a9dcd8528/barbican-worker-log/0.log" Dec 02 20:04:15 crc kubenswrapper[4878]: I1202 20:04:15.112815 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-76cvj_52939763-97c2-42f6-9aa4-56e99153e87f/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 20:04:15 crc kubenswrapper[4878]: I1202 20:04:15.210802 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_28b4921f-5e67-4490-83fc-eef206c05083/ceilometer-central-agent/0.log" Dec 02 20:04:15 crc kubenswrapper[4878]: I1202 20:04:15.329497 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_28b4921f-5e67-4490-83fc-eef206c05083/ceilometer-notification-agent/0.log" Dec 02 20:04:15 crc kubenswrapper[4878]: I1202 20:04:15.419721 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_28b4921f-5e67-4490-83fc-eef206c05083/proxy-httpd/0.log" Dec 02 20:04:15 crc kubenswrapper[4878]: I1202 20:04:15.429786 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_28b4921f-5e67-4490-83fc-eef206c05083/sg-core/0.log" Dec 02 20:04:15 crc kubenswrapper[4878]: I1202 20:04:15.721693 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_60d7f991-50e6-47fa-8b4b-137022c03671/cinder-api-log/0.log" Dec 02 20:04:15 crc kubenswrapper[4878]: I1202 
20:04:15.725847 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_60d7f991-50e6-47fa-8b4b-137022c03671/cinder-api/0.log" Dec 02 20:04:15 crc kubenswrapper[4878]: I1202 20:04:15.824399 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5cea7d1e-f0d6-4a27-9840-1ce77743b26d/cinder-scheduler/0.log" Dec 02 20:04:15 crc kubenswrapper[4878]: I1202 20:04:15.953298 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-cf87z_2f8b1f89-0eef-426b-8bb0-8700c42ede2e/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 20:04:15 crc kubenswrapper[4878]: I1202 20:04:15.997019 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5cea7d1e-f0d6-4a27-9840-1ce77743b26d/probe/0.log" Dec 02 20:04:16 crc kubenswrapper[4878]: I1202 20:04:16.175048 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-t7g94_8714a5c4-b7c2-4e8a-a112-2530648da63b/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 20:04:16 crc kubenswrapper[4878]: I1202 20:04:16.230308 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6d594564dc-vbhmw_66af733a-2f8b-4127-9c9c-00d137d8eb4e/init/0.log" Dec 02 20:04:16 crc kubenswrapper[4878]: I1202 20:04:16.477564 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6d594564dc-vbhmw_66af733a-2f8b-4127-9c9c-00d137d8eb4e/init/0.log" Dec 02 20:04:16 crc kubenswrapper[4878]: I1202 20:04:16.507820 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-l59tj_7b2e71d7-3e0b-4bfa-835c-374dcf03dd86/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 20:04:16 crc kubenswrapper[4878]: I1202 20:04:16.566620 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-6d594564dc-vbhmw_66af733a-2f8b-4127-9c9c-00d137d8eb4e/dnsmasq-dns/0.log" Dec 02 20:04:16 crc kubenswrapper[4878]: I1202 20:04:16.748738 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_dc42b137-3b4d-4673-8f42-e1fd55534c16/glance-log/0.log" Dec 02 20:04:16 crc kubenswrapper[4878]: I1202 20:04:16.757195 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_dc42b137-3b4d-4673-8f42-e1fd55534c16/glance-httpd/0.log" Dec 02 20:04:16 crc kubenswrapper[4878]: I1202 20:04:16.968514 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_59f6212e-501f-4a58-8d24-8f79f95dc992/glance-log/0.log" Dec 02 20:04:17 crc kubenswrapper[4878]: I1202 20:04:17.036455 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_59f6212e-501f-4a58-8d24-8f79f95dc992/glance-httpd/0.log" Dec 02 20:04:17 crc kubenswrapper[4878]: I1202 20:04:17.535354 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-f8bcd7564-vh2kf_88be3860-e9da-4f2b-baff-142f994127c4/heat-api/0.log" Dec 02 20:04:17 crc kubenswrapper[4878]: I1202 20:04:17.906310 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-757798f686-n6x4p_a845b7d1-1f82-4048-8bc9-56611020bcec/heat-engine/0.log" Dec 02 20:04:18 crc kubenswrapper[4878]: I1202 20:04:18.038421 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-hs9dj_7ca1da86-eacb-4ac4-a155-62de0292cbdf/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 20:04:18 crc kubenswrapper[4878]: I1202 20:04:18.133214 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-jh9ts_0112601b-e2a2-4547-bea0-5afad959f726/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 20:04:18 crc kubenswrapper[4878]: I1202 20:04:18.183520 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7546bcc475-2zz2q_a37933a9-8ddf-406e-9f40-b79fba21d5b5/heat-cfnapi/0.log" Dec 02 20:04:18 crc kubenswrapper[4878]: I1202 20:04:18.403101 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29411701-sf57f_08153e97-46ad-4405-b0ea-7f4606a82c6f/keystone-cron/0.log" Dec 02 20:04:18 crc kubenswrapper[4878]: I1202 20:04:18.659875 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_763df430-6f7f-4642-9452-1fcc5d47d283/kube-state-metrics/0.log" Dec 02 20:04:18 crc kubenswrapper[4878]: I1202 20:04:18.679850 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29411761-jk55l_803db126-60e5-4c26-9411-fe06a9b4c9cd/keystone-cron/0.log" Dec 02 20:04:18 crc kubenswrapper[4878]: I1202 20:04:18.757905 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-67bf9d8f54-s7vnk_917539f3-4a78-4c46-a2e3-0b95342fe994/keystone-api/0.log" Dec 02 20:04:18 crc kubenswrapper[4878]: I1202 20:04:18.885061 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-x2qm5_0b199858-7108-4b94-b3f9-692a11430c94/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 20:04:18 crc kubenswrapper[4878]: I1202 20:04:18.974869 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-vqdv9_a4fe74e0-9248-4fd9-8f5d-a82c1b8f72b6/logging-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 20:04:19 crc kubenswrapper[4878]: I1202 20:04:19.211738 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_mysqld-exporter-0_1bbb9528-81ba-487f-bf86-44276e8ac969/mysqld-exporter/0.log" Dec 02 20:04:19 crc kubenswrapper[4878]: I1202 20:04:19.601080 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-9d785686f-gqxnz_70e70d85-cdcd-43e7-b2c2-dbc6386665e3/neutron-api/0.log" Dec 02 20:04:19 crc kubenswrapper[4878]: I1202 20:04:19.633582 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-bdp6v_14ba7638-79c8-4806-9bc5-8b8c7ab029c3/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 20:04:19 crc kubenswrapper[4878]: I1202 20:04:19.731419 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-9d785686f-gqxnz_70e70d85-cdcd-43e7-b2c2-dbc6386665e3/neutron-httpd/0.log" Dec 02 20:04:20 crc kubenswrapper[4878]: I1202 20:04:20.403661 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_b5108475-8a24-4bce-a285-f4a26785d6f9/nova-cell0-conductor-conductor/0.log" Dec 02 20:04:20 crc kubenswrapper[4878]: I1202 20:04:20.621631 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4cfd0dfa-637d-432b-946d-753c5afa72dd/nova-api-log/0.log" Dec 02 20:04:20 crc kubenswrapper[4878]: I1202 20:04:20.650320 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_49118006-cedb-4f41-a752-c635108e2bf7/nova-cell1-conductor-conductor/0.log" Dec 02 20:04:20 crc kubenswrapper[4878]: I1202 20:04:20.996989 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-csjn8_53bb65b6-2ee5-42dd-8a1d-df8a04008975/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 20:04:21 crc kubenswrapper[4878]: I1202 20:04:21.031556 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_855f3d94-a64e-4661-8a2d-30b33a682633/nova-cell1-novncproxy-novncproxy/0.log" Dec 02 20:04:21 crc kubenswrapper[4878]: I1202 20:04:21.247719 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4cfd0dfa-637d-432b-946d-753c5afa72dd/nova-api-api/0.log" Dec 02 20:04:21 crc kubenswrapper[4878]: I1202 20:04:21.260689 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5217962f-8411-4be7-bbd0-93858938b746/nova-metadata-log/0.log" Dec 02 20:04:21 crc kubenswrapper[4878]: I1202 20:04:21.709522 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9cae31f5-acb4-423b-8a14-4136afb73062/mysql-bootstrap/0.log" Dec 02 20:04:21 crc kubenswrapper[4878]: I1202 20:04:21.716951 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_5dc0fa21-804e-42bf-a190-4e108c84df48/nova-scheduler-scheduler/0.log" Dec 02 20:04:21 crc kubenswrapper[4878]: I1202 20:04:21.826856 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9cae31f5-acb4-423b-8a14-4136afb73062/mysql-bootstrap/0.log" Dec 02 20:04:21 crc kubenswrapper[4878]: I1202 20:04:21.950911 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9cae31f5-acb4-423b-8a14-4136afb73062/galera/0.log" Dec 02 20:04:22 crc kubenswrapper[4878]: I1202 20:04:22.056791 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c436c198-1049-416f-9ab7-33261ff55ab4/mysql-bootstrap/0.log" Dec 02 20:04:22 crc kubenswrapper[4878]: I1202 20:04:22.256322 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c436c198-1049-416f-9ab7-33261ff55ab4/mysql-bootstrap/0.log" Dec 02 20:04:22 crc kubenswrapper[4878]: I1202 20:04:22.340736 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_c436c198-1049-416f-9ab7-33261ff55ab4/galera/0.log" Dec 02 20:04:22 crc kubenswrapper[4878]: I1202 20:04:22.458787 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_5caa14c8-5110-4246-a7d4-75ef3c6d5d00/openstackclient/0.log" Dec 02 20:04:22 crc kubenswrapper[4878]: I1202 20:04:22.593376 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-2pqpm_f833875c-c0f5-4654-b592-14d4a6161df6/openstack-network-exporter/0.log" Dec 02 20:04:22 crc kubenswrapper[4878]: I1202 20:04:22.792726 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsnc6_f1f92026-f0b1-470f-885e-914fece7f4e3/ovsdb-server-init/0.log" Dec 02 20:04:23 crc kubenswrapper[4878]: I1202 20:04:23.007820 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsnc6_f1f92026-f0b1-470f-885e-914fece7f4e3/ovsdb-server-init/0.log" Dec 02 20:04:23 crc kubenswrapper[4878]: I1202 20:04:23.065376 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsnc6_f1f92026-f0b1-470f-885e-914fece7f4e3/ovsdb-server/0.log" Dec 02 20:04:23 crc kubenswrapper[4878]: I1202 20:04:23.073488 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsnc6_f1f92026-f0b1-470f-885e-914fece7f4e3/ovs-vswitchd/0.log" Dec 02 20:04:23 crc kubenswrapper[4878]: I1202 20:04:23.272049 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-qgmlw_1f049ebe-547b-40a2-8468-932cfc5051ea/ovn-controller/0.log" Dec 02 20:04:23 crc kubenswrapper[4878]: I1202 20:04:23.523254 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-sm6fm_49eb68a8-72ac-4fb7-ab11-7e89f85e7f22/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 20:04:23 crc kubenswrapper[4878]: I1202 20:04:23.579323 4878 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_af53983f-772c-431c-95a1-af6b3d3c0edf/openstack-network-exporter/0.log" Dec 02 20:04:23 crc kubenswrapper[4878]: I1202 20:04:23.772066 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_af53983f-772c-431c-95a1-af6b3d3c0edf/ovn-northd/0.log" Dec 02 20:04:23 crc kubenswrapper[4878]: I1202 20:04:23.816825 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a6aad750-71cc-4815-906a-5f2a130875e8/openstack-network-exporter/0.log" Dec 02 20:04:23 crc kubenswrapper[4878]: I1202 20:04:23.833453 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5217962f-8411-4be7-bbd0-93858938b746/nova-metadata-metadata/0.log" Dec 02 20:04:23 crc kubenswrapper[4878]: I1202 20:04:23.979458 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a6aad750-71cc-4815-906a-5f2a130875e8/ovsdbserver-nb/0.log" Dec 02 20:04:24 crc kubenswrapper[4878]: I1202 20:04:24.253611 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4/openstack-network-exporter/0.log" Dec 02 20:04:24 crc kubenswrapper[4878]: I1202 20:04:24.380652 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3ac1cf4e-ce7b-4aca-99c7-93d4e12a0aa4/ovsdbserver-sb/0.log" Dec 02 20:04:24 crc kubenswrapper[4878]: I1202 20:04:24.609599 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5bbd45c784-zz4hz_5d63745b-034f-4f6f-b2f7-abeca299930b/placement-api/0.log" Dec 02 20:04:24 crc kubenswrapper[4878]: I1202 20:04:24.669406 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5bbd45c784-zz4hz_5d63745b-034f-4f6f-b2f7-abeca299930b/placement-log/0.log" Dec 02 20:04:24 crc kubenswrapper[4878]: I1202 20:04:24.686720 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_prometheus-metric-storage-0_80be1dfe-9477-4d5c-9b11-2664d4300eca/init-config-reloader/0.log" Dec 02 20:04:24 crc kubenswrapper[4878]: I1202 20:04:24.888299 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_80be1dfe-9477-4d5c-9b11-2664d4300eca/init-config-reloader/0.log" Dec 02 20:04:24 crc kubenswrapper[4878]: I1202 20:04:24.901614 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_80be1dfe-9477-4d5c-9b11-2664d4300eca/prometheus/0.log" Dec 02 20:04:24 crc kubenswrapper[4878]: I1202 20:04:24.930294 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_80be1dfe-9477-4d5c-9b11-2664d4300eca/thanos-sidecar/0.log" Dec 02 20:04:24 crc kubenswrapper[4878]: I1202 20:04:24.959420 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_80be1dfe-9477-4d5c-9b11-2664d4300eca/config-reloader/0.log" Dec 02 20:04:25 crc kubenswrapper[4878]: I1202 20:04:25.065955 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_eedb789b-6bed-4a82-82c1-977a633ed304/setup-container/0.log" Dec 02 20:04:25 crc kubenswrapper[4878]: I1202 20:04:25.326456 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_eedb789b-6bed-4a82-82c1-977a633ed304/setup-container/0.log" Dec 02 20:04:25 crc kubenswrapper[4878]: I1202 20:04:25.406170 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_eedb789b-6bed-4a82-82c1-977a633ed304/rabbitmq/0.log" Dec 02 20:04:25 crc kubenswrapper[4878]: I1202 20:04:25.433551 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da/setup-container/0.log" Dec 02 20:04:25 crc kubenswrapper[4878]: I1202 20:04:25.657432 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da/setup-container/0.log" Dec 02 20:04:25 crc kubenswrapper[4878]: I1202 20:04:25.727097 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3af9c36d-d2f2-4d8e-ae9a-54fd44ca47da/rabbitmq/0.log" Dec 02 20:04:25 crc kubenswrapper[4878]: I1202 20:04:25.739460 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-pq97n_f1a288af-a20b-4e48-a331-561e16e01989/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 20:04:25 crc kubenswrapper[4878]: I1202 20:04:25.997169 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-zgm9n_4c58063d-9ff4-43dd-9dec-17d14a541013/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 20:04:26 crc kubenswrapper[4878]: I1202 20:04:26.029890 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-dbhx8_8097a01b-4fab-4bac-839d-a1f937120beb/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 20:04:26 crc kubenswrapper[4878]: I1202 20:04:26.237172 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-t4tn6_8b071f6b-2ee2-41ea-ab08-219cf6ecfc0b/ssh-known-hosts-edpm-deployment/0.log" Dec 02 20:04:26 crc kubenswrapper[4878]: I1202 20:04:26.263306 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-zstf2_e32e5051-b0ff-4fee-9268-266c4cc38c68/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 20:04:26 crc kubenswrapper[4878]: I1202 20:04:26.590008 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-54fd7cfcc9-x4n56_c033144d-0cad-47bd-87b6-3715278cf5c1/proxy-server/0.log" Dec 02 20:04:26 crc kubenswrapper[4878]: I1202 20:04:26.858104 4878 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-ring-rebalance-rfcql_0e84c0f0-1e32-4c9b-b21d-f49bb06863fc/swift-ring-rebalance/0.log" Dec 02 20:04:26 crc kubenswrapper[4878]: I1202 20:04:26.868204 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-54fd7cfcc9-x4n56_c033144d-0cad-47bd-87b6-3715278cf5c1/proxy-httpd/0.log" Dec 02 20:04:26 crc kubenswrapper[4878]: I1202 20:04:26.871771 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/account-auditor/0.log" Dec 02 20:04:27 crc kubenswrapper[4878]: I1202 20:04:27.118388 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/account-server/0.log" Dec 02 20:04:27 crc kubenswrapper[4878]: I1202 20:04:27.128539 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/account-reaper/0.log" Dec 02 20:04:27 crc kubenswrapper[4878]: I1202 20:04:27.147501 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/container-auditor/0.log" Dec 02 20:04:27 crc kubenswrapper[4878]: I1202 20:04:27.162300 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/account-replicator/0.log" Dec 02 20:04:27 crc kubenswrapper[4878]: I1202 20:04:27.369324 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/container-server/0.log" Dec 02 20:04:27 crc kubenswrapper[4878]: I1202 20:04:27.377579 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/container-replicator/0.log" Dec 02 20:04:27 crc kubenswrapper[4878]: I1202 20:04:27.387588 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/container-updater/0.log" Dec 02 20:04:27 crc kubenswrapper[4878]: I1202 20:04:27.436179 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/object-auditor/0.log" Dec 02 20:04:27 crc kubenswrapper[4878]: I1202 20:04:27.551980 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/object-expirer/0.log" Dec 02 20:04:27 crc kubenswrapper[4878]: I1202 20:04:27.600678 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/object-server/0.log" Dec 02 20:04:27 crc kubenswrapper[4878]: I1202 20:04:27.637777 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/object-replicator/0.log" Dec 02 20:04:27 crc kubenswrapper[4878]: I1202 20:04:27.690635 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/object-updater/0.log" Dec 02 20:04:27 crc kubenswrapper[4878]: I1202 20:04:27.839288 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/swift-recon-cron/0.log" Dec 02 20:04:27 crc kubenswrapper[4878]: I1202 20:04:27.863464 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5c2a6cb2-97ad-418b-9132-1d17e3cd9c7f/rsync/0.log" Dec 02 20:04:27 crc kubenswrapper[4878]: I1202 20:04:27.979802 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-cvwgq_1d2ce001-5523-44c3-b911-47c3f44ffb77/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 20:04:28 crc kubenswrapper[4878]: I1202 20:04:28.196133 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-wvhxt_58ff5ba7-481c-48d8-bf39-0eb6665f23d7/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 20:04:28 crc kubenswrapper[4878]: I1202 20:04:28.394707 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_18ccf713-5eef-4b4b-b1ea-9f3b34639edb/test-operator-logs-container/0.log" Dec 02 20:04:28 crc kubenswrapper[4878]: I1202 20:04:28.645515 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-46pqj_ad5d3ca4-7255-4bd3-9976-0834ea7b94ee/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 20:04:29 crc kubenswrapper[4878]: I1202 20:04:29.180321 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_e5a4096d-c2be-4987-b0eb-fb47da8a9703/tempest-tests-tempest-tests-runner/0.log" Dec 02 20:04:31 crc kubenswrapper[4878]: I1202 20:04:31.095792 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_427bac6f-5bf8-4f40-a0f6-fea0cede315f/memcached/0.log" Dec 02 20:04:54 crc kubenswrapper[4878]: I1202 20:04:54.813518 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf_36734ca4-54d4-4d24-a3ba-bbd876acbe2e/util/0.log" Dec 02 20:04:54 crc kubenswrapper[4878]: I1202 20:04:54.925817 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf_36734ca4-54d4-4d24-a3ba-bbd876acbe2e/pull/0.log" Dec 02 20:04:54 crc kubenswrapper[4878]: I1202 20:04:54.944767 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf_36734ca4-54d4-4d24-a3ba-bbd876acbe2e/util/0.log" Dec 02 20:04:54 crc 
kubenswrapper[4878]: I1202 20:04:54.978015 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf_36734ca4-54d4-4d24-a3ba-bbd876acbe2e/pull/0.log" Dec 02 20:04:55 crc kubenswrapper[4878]: I1202 20:04:55.140354 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf_36734ca4-54d4-4d24-a3ba-bbd876acbe2e/pull/0.log" Dec 02 20:04:55 crc kubenswrapper[4878]: I1202 20:04:55.160254 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf_36734ca4-54d4-4d24-a3ba-bbd876acbe2e/util/0.log" Dec 02 20:04:55 crc kubenswrapper[4878]: I1202 20:04:55.194824 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a23582232292bbb143e963a95befbcb28366057d63d5ef8f216bd2f512c5dpf_36734ca4-54d4-4d24-a3ba-bbd876acbe2e/extract/0.log" Dec 02 20:04:55 crc kubenswrapper[4878]: I1202 20:04:55.392997 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-rgj5c_81eba8a0-84f6-4456-9484-dfa84dda8e10/kube-rbac-proxy/0.log" Dec 02 20:04:55 crc kubenswrapper[4878]: I1202 20:04:55.433190 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-rgj5c_81eba8a0-84f6-4456-9484-dfa84dda8e10/manager/0.log" Dec 02 20:04:55 crc kubenswrapper[4878]: I1202 20:04:55.449922 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-ds92l_0e516f0b-2b62-4d60-b1bd-07404ffcdea9/kube-rbac-proxy/0.log" Dec 02 20:04:55 crc kubenswrapper[4878]: I1202 20:04:55.589881 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-ds92l_0e516f0b-2b62-4d60-b1bd-07404ffcdea9/manager/0.log" Dec 02 20:04:55 crc kubenswrapper[4878]: I1202 20:04:55.638166 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-kwqck_77e2f2de-8d3f-437b-8f32-7b76ea70ccda/kube-rbac-proxy/0.log" Dec 02 20:04:55 crc kubenswrapper[4878]: I1202 20:04:55.666410 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-kwqck_77e2f2de-8d3f-437b-8f32-7b76ea70ccda/manager/0.log" Dec 02 20:04:55 crc kubenswrapper[4878]: I1202 20:04:55.872843 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-4gsxd_fc38a188-1850-41eb-a958-fd1fe01270c7/manager/0.log" Dec 02 20:04:55 crc kubenswrapper[4878]: I1202 20:04:55.916835 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-4gsxd_fc38a188-1850-41eb-a958-fd1fe01270c7/kube-rbac-proxy/0.log" Dec 02 20:04:56 crc kubenswrapper[4878]: I1202 20:04:56.102139 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-fx64f_a128e6b1-604f-4d2d-9b31-1567ade115df/manager/0.log" Dec 02 20:04:56 crc kubenswrapper[4878]: I1202 20:04:56.116400 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-fx64f_a128e6b1-604f-4d2d-9b31-1567ade115df/kube-rbac-proxy/0.log" Dec 02 20:04:56 crc kubenswrapper[4878]: I1202 20:04:56.204992 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-vc67f_d7143587-e348-48b5-9164-a4d477b4a259/kube-rbac-proxy/0.log" Dec 02 20:04:56 crc kubenswrapper[4878]: I1202 20:04:56.294162 
4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-vc67f_d7143587-e348-48b5-9164-a4d477b4a259/manager/0.log" Dec 02 20:04:56 crc kubenswrapper[4878]: I1202 20:04:56.333196 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-cmzpg_3028ad1d-cba5-4197-964f-6405fb1cc1c3/kube-rbac-proxy/0.log" Dec 02 20:04:56 crc kubenswrapper[4878]: I1202 20:04:56.535416 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-rqzqn_7dfef70a-0da9-4ad6-9fda-1cac674c9ddb/kube-rbac-proxy/0.log" Dec 02 20:04:56 crc kubenswrapper[4878]: I1202 20:04:56.570098 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-cmzpg_3028ad1d-cba5-4197-964f-6405fb1cc1c3/manager/0.log" Dec 02 20:04:56 crc kubenswrapper[4878]: I1202 20:04:56.616398 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-rqzqn_7dfef70a-0da9-4ad6-9fda-1cac674c9ddb/manager/0.log" Dec 02 20:04:56 crc kubenswrapper[4878]: I1202 20:04:56.765263 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-kmkzd_e51c89b2-70ae-4c1d-81b0-8aba6e211dd0/kube-rbac-proxy/0.log" Dec 02 20:04:56 crc kubenswrapper[4878]: I1202 20:04:56.882339 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-kmkzd_e51c89b2-70ae-4c1d-81b0-8aba6e211dd0/manager/0.log" Dec 02 20:04:56 crc kubenswrapper[4878]: I1202 20:04:56.916028 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-4c7fq_5cee25c6-1e94-400c-afd8-c1e75f31e619/kube-rbac-proxy/0.log" Dec 02 20:04:57 crc 
kubenswrapper[4878]: I1202 20:04:57.004655 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-4c7fq_5cee25c6-1e94-400c-afd8-c1e75f31e619/manager/0.log" Dec 02 20:04:57 crc kubenswrapper[4878]: I1202 20:04:57.078514 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-952z9_492dff42-bb87-4c30-8f81-02406308904c/kube-rbac-proxy/0.log" Dec 02 20:04:57 crc kubenswrapper[4878]: I1202 20:04:57.150135 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-952z9_492dff42-bb87-4c30-8f81-02406308904c/manager/0.log" Dec 02 20:04:57 crc kubenswrapper[4878]: I1202 20:04:57.284539 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-f58x6_ed599489-c1f6-440f-aaf3-339f424cbcdf/kube-rbac-proxy/0.log" Dec 02 20:04:57 crc kubenswrapper[4878]: I1202 20:04:57.314494 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-f58x6_ed599489-c1f6-440f-aaf3-339f424cbcdf/manager/0.log" Dec 02 20:04:57 crc kubenswrapper[4878]: I1202 20:04:57.476327 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-xzpbx_84d9833f-9760-47ea-ba43-f385b24a3e57/kube-rbac-proxy/0.log" Dec 02 20:04:57 crc kubenswrapper[4878]: I1202 20:04:57.538845 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-xzpbx_84d9833f-9760-47ea-ba43-f385b24a3e57/manager/0.log" Dec 02 20:04:57 crc kubenswrapper[4878]: I1202 20:04:57.605811 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-9stj8_46cfeb81-3e48-4f16-ae55-aabe49810afb/kube-rbac-proxy/0.log" Dec 02 20:04:57 crc kubenswrapper[4878]: I1202 20:04:57.668633 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-9stj8_46cfeb81-3e48-4f16-ae55-aabe49810afb/manager/0.log" Dec 02 20:04:57 crc kubenswrapper[4878]: I1202 20:04:57.841958 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk_095b515f-0144-4f7b-b3ab-9ca3440921db/kube-rbac-proxy/0.log" Dec 02 20:04:57 crc kubenswrapper[4878]: I1202 20:04:57.856086 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4k9gtk_095b515f-0144-4f7b-b3ab-9ca3440921db/manager/0.log" Dec 02 20:04:58 crc kubenswrapper[4878]: I1202 20:04:58.275442 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6db445db9f-m4wkz_ce506db4-fc2d-45a6-b9c1-23d22cc536cc/operator/0.log" Dec 02 20:04:58 crc kubenswrapper[4878]: I1202 20:04:58.400212 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-dfbvm_73939207-5e4c-4ef0-ba00-efa6b403e4c7/registry-server/0.log" Dec 02 20:04:58 crc kubenswrapper[4878]: I1202 20:04:58.515528 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-w99w9_56f1dd36-8cc9-4026-8976-8816940217a4/kube-rbac-proxy/0.log" Dec 02 20:04:58 crc kubenswrapper[4878]: I1202 20:04:58.725383 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-w99w9_56f1dd36-8cc9-4026-8976-8816940217a4/manager/0.log" Dec 02 20:04:58 crc kubenswrapper[4878]: I1202 20:04:58.752316 
4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-6qznx_98571e6d-dae6-4e83-8d08-e44e8609188f/kube-rbac-proxy/0.log" Dec 02 20:04:58 crc kubenswrapper[4878]: I1202 20:04:58.982012 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-6qznx_98571e6d-dae6-4e83-8d08-e44e8609188f/manager/0.log" Dec 02 20:04:59 crc kubenswrapper[4878]: I1202 20:04:59.055984 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-24n6c_5691d664-31ad-44c8-ab51-11bcf8f9d4c2/operator/0.log" Dec 02 20:04:59 crc kubenswrapper[4878]: I1202 20:04:59.208734 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-j4dz5_5a683e44-012a-41ec-98db-36bcd5646959/manager/0.log" Dec 02 20:04:59 crc kubenswrapper[4878]: I1202 20:04:59.246224 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-j4dz5_5a683e44-012a-41ec-98db-36bcd5646959/kube-rbac-proxy/0.log" Dec 02 20:04:59 crc kubenswrapper[4878]: I1202 20:04:59.310433 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-546f978c55-mvlfw_44a363d4-d9a3-44df-8a8f-902cb14a0443/kube-rbac-proxy/0.log" Dec 02 20:04:59 crc kubenswrapper[4878]: I1202 20:04:59.499673 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-fhrfj_572eebdd-2dc7-4327-a339-4f92e3971d59/kube-rbac-proxy/0.log" Dec 02 20:04:59 crc kubenswrapper[4878]: I1202 20:04:59.540069 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-fhrfj_572eebdd-2dc7-4327-a339-4f92e3971d59/manager/0.log" Dec 02 20:04:59 crc 
kubenswrapper[4878]: I1202 20:04:59.576343 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-bb64db99c-xtzmk_e18472ba-dc06-4e34-99a7-974d9af72c0a/manager/0.log" Dec 02 20:04:59 crc kubenswrapper[4878]: I1202 20:04:59.797628 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-546f978c55-mvlfw_44a363d4-d9a3-44df-8a8f-902cb14a0443/manager/0.log" Dec 02 20:04:59 crc kubenswrapper[4878]: I1202 20:04:59.822894 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-hndnh_bfa1dc17-f042-46f8-8bc8-3f8d9e135073/kube-rbac-proxy/0.log" Dec 02 20:04:59 crc kubenswrapper[4878]: I1202 20:04:59.831689 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-hndnh_bfa1dc17-f042-46f8-8bc8-3f8d9e135073/manager/0.log" Dec 02 20:05:20 crc kubenswrapper[4878]: I1202 20:05:20.446042 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-b6fbw_a8e39ea4-f7e1-4df7-997b-2c3923f9ad6b/control-plane-machine-set-operator/0.log" Dec 02 20:05:20 crc kubenswrapper[4878]: I1202 20:05:20.617990 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ngj62_ea820fcf-7d34-4381-bafa-cbc53d3f7c86/kube-rbac-proxy/0.log" Dec 02 20:05:20 crc kubenswrapper[4878]: I1202 20:05:20.672541 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ngj62_ea820fcf-7d34-4381-bafa-cbc53d3f7c86/machine-api-operator/0.log" Dec 02 20:05:35 crc kubenswrapper[4878]: I1202 20:05:35.606305 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-7mgrc_4311fcfc-1cf4-4bab-b946-40efef5b8c10/cert-manager-controller/0.log" Dec 02 20:05:35 crc kubenswrapper[4878]: I1202 20:05:35.836936 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-xhjz5_c5196963-9d92-4f0a-ab8c-47f4b86a685f/cert-manager-cainjector/0.log" Dec 02 20:05:35 crc kubenswrapper[4878]: I1202 20:05:35.939811 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-qfhjc_bb76342f-0435-4589-826f-3a7cee8cc419/cert-manager-webhook/0.log" Dec 02 20:05:50 crc kubenswrapper[4878]: I1202 20:05:50.513832 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-88jcq_91633707-7b72-4b18-a516-a6b327dc44f1/nmstate-console-plugin/0.log" Dec 02 20:05:50 crc kubenswrapper[4878]: I1202 20:05:50.671375 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-v45r2_5a7dc067-d2fd-4145-b79c-33fac3675cdd/nmstate-handler/0.log" Dec 02 20:05:50 crc kubenswrapper[4878]: I1202 20:05:50.759801 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-mzs4g_788e4f72-26a7-455f-b805-32b1b519726c/kube-rbac-proxy/0.log" Dec 02 20:05:50 crc kubenswrapper[4878]: I1202 20:05:50.763654 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-mzs4g_788e4f72-26a7-455f-b805-32b1b519726c/nmstate-metrics/0.log" Dec 02 20:05:50 crc kubenswrapper[4878]: I1202 20:05:50.967953 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-h4wxh_b86218c7-3d62-4631-8f95-e70b1f304615/nmstate-webhook/0.log" Dec 02 20:05:50 crc kubenswrapper[4878]: I1202 20:05:50.978254 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-67sqq_e9a04710-7320-4cde-9863-e05e65f54671/nmstate-operator/0.log" Dec 02 20:05:53 crc kubenswrapper[4878]: I1202 20:05:53.742297 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:05:53 crc kubenswrapper[4878]: I1202 20:05:53.742607 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:06:05 crc kubenswrapper[4878]: I1202 20:06:05.867281 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-846f878689-bhh7m_6ace3da2-70e9-4d80-a8ad-5a8e1bb062df/kube-rbac-proxy/0.log" Dec 02 20:06:05 crc kubenswrapper[4878]: I1202 20:06:05.922719 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-846f878689-bhh7m_6ace3da2-70e9-4d80-a8ad-5a8e1bb062df/manager/0.log" Dec 02 20:06:22 crc kubenswrapper[4878]: I1202 20:06:22.221884 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-ff9846bd-wpfmg_056447cf-55f2-4ddd-bf36-7f0f637f5ca5/cluster-logging-operator/0.log" Dec 02 20:06:22 crc kubenswrapper[4878]: I1202 20:06:22.439859 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-jjcjt_089919f9-be76-4956-a7d3-92fa66aa19ef/collector/0.log" Dec 02 20:06:22 crc kubenswrapper[4878]: I1202 20:06:22.496596 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-compactor-0_3abc051e-1bf7-4ac4-aa57-cb804ed0ff9a/loki-compactor/0.log" Dec 02 20:06:22 crc kubenswrapper[4878]: I1202 20:06:22.646290 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-76cc67bf56-tl222_7f86bb98-b2df-4776-97a4-7a45b69972b8/loki-distributor/0.log" Dec 02 20:06:22 crc kubenswrapper[4878]: I1202 20:06:22.695021 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-85bc84b7b8-89mkg_6bb468bd-bb35-4a4e-b41c-ed2a8f964d77/gateway/0.log" Dec 02 20:06:22 crc kubenswrapper[4878]: I1202 20:06:22.747760 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-85bc84b7b8-89mkg_6bb468bd-bb35-4a4e-b41c-ed2a8f964d77/opa/0.log" Dec 02 20:06:22 crc kubenswrapper[4878]: I1202 20:06:22.882402 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-85bc84b7b8-zjfmm_92b9d963-d156-45ca-89fd-3f992d10d24e/gateway/0.log" Dec 02 20:06:22 crc kubenswrapper[4878]: I1202 20:06:22.969774 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-85bc84b7b8-zjfmm_92b9d963-d156-45ca-89fd-3f992d10d24e/opa/0.log" Dec 02 20:06:23 crc kubenswrapper[4878]: I1202 20:06:23.036939 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_ef6cf23f-200e-43b7-81ea-b13382391ad0/loki-index-gateway/0.log" Dec 02 20:06:23 crc kubenswrapper[4878]: I1202 20:06:23.249909 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5895d59bb8-2d5dl_b1de2d7e-c37c-4464-bd35-337650bd62bf/loki-querier/0.log" Dec 02 20:06:23 crc kubenswrapper[4878]: I1202 20:06:23.283538 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-ingester-0_1a9f06ce-f976-42a6-9393-55ac1c7ca894/loki-ingester/0.log" Dec 02 20:06:23 crc kubenswrapper[4878]: I1202 20:06:23.348931 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-84558f7c9f-xwgzp_316a18ce-1717-4e23-8750-17b4ec2e553c/loki-query-frontend/0.log" Dec 02 20:06:23 crc kubenswrapper[4878]: I1202 20:06:23.741987 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:06:23 crc kubenswrapper[4878]: I1202 20:06:23.742393 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:06:39 crc kubenswrapper[4878]: I1202 20:06:39.732543 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-jmg7l_f49a3efc-d73c-4b26-b668-8abf574eb6a9/kube-rbac-proxy/0.log" Dec 02 20:06:39 crc kubenswrapper[4878]: I1202 20:06:39.823680 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-jmg7l_f49a3efc-d73c-4b26-b668-8abf574eb6a9/controller/0.log" Dec 02 20:06:39 crc kubenswrapper[4878]: I1202 20:06:39.918436 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-frr-files/0.log" Dec 02 20:06:40 crc kubenswrapper[4878]: I1202 20:06:40.096941 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-frr-files/0.log" 
Dec 02 20:06:40 crc kubenswrapper[4878]: I1202 20:06:40.138497 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-reloader/0.log" Dec 02 20:06:40 crc kubenswrapper[4878]: I1202 20:06:40.166370 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-reloader/0.log" Dec 02 20:06:40 crc kubenswrapper[4878]: I1202 20:06:40.168649 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-metrics/0.log" Dec 02 20:06:40 crc kubenswrapper[4878]: I1202 20:06:40.358120 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-metrics/0.log" Dec 02 20:06:40 crc kubenswrapper[4878]: I1202 20:06:40.365828 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-reloader/0.log" Dec 02 20:06:40 crc kubenswrapper[4878]: I1202 20:06:40.393787 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-frr-files/0.log" Dec 02 20:06:40 crc kubenswrapper[4878]: I1202 20:06:40.417629 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-metrics/0.log" Dec 02 20:06:40 crc kubenswrapper[4878]: I1202 20:06:40.556848 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-frr-files/0.log" Dec 02 20:06:40 crc kubenswrapper[4878]: I1202 20:06:40.558306 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-reloader/0.log" Dec 02 20:06:40 crc kubenswrapper[4878]: I1202 20:06:40.597860 4878 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/cp-metrics/0.log" Dec 02 20:06:40 crc kubenswrapper[4878]: I1202 20:06:40.617336 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/controller/0.log" Dec 02 20:06:40 crc kubenswrapper[4878]: I1202 20:06:40.771811 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/frr-metrics/0.log" Dec 02 20:06:40 crc kubenswrapper[4878]: I1202 20:06:40.791781 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/kube-rbac-proxy/0.log" Dec 02 20:06:40 crc kubenswrapper[4878]: I1202 20:06:40.819421 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/kube-rbac-proxy-frr/0.log" Dec 02 20:06:40 crc kubenswrapper[4878]: I1202 20:06:40.974309 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/reloader/0.log" Dec 02 20:06:41 crc kubenswrapper[4878]: I1202 20:06:41.044095 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-p7f4t_bb4be508-3a9c-48d3-a995-124bf91a4128/frr-k8s-webhook-server/0.log" Dec 02 20:06:41 crc kubenswrapper[4878]: I1202 20:06:41.228116 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b867f79f6-k2zm6_cda42159-8d1b-460a-b92e-a02db29c88e9/manager/0.log" Dec 02 20:06:41 crc kubenswrapper[4878]: I1202 20:06:41.477924 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-fd8d8f689-k4m6l_239083bd-7b88-46e8-b5e6-b1fdb9abc032/webhook-server/0.log" Dec 02 20:06:41 crc kubenswrapper[4878]: I1202 20:06:41.534032 
4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4q4nt_73bda1d1-d063-44f4-8b13-20af22c61540/kube-rbac-proxy/0.log" Dec 02 20:06:42 crc kubenswrapper[4878]: I1202 20:06:42.307609 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4q4nt_73bda1d1-d063-44f4-8b13-20af22c61540/speaker/0.log" Dec 02 20:06:42 crc kubenswrapper[4878]: I1202 20:06:42.641103 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2qjj9_d7581b57-41bb-4843-89cb-00b1966ccd8e/frr/0.log" Dec 02 20:06:53 crc kubenswrapper[4878]: I1202 20:06:53.742363 4878 patch_prober.go:28] interesting pod/machine-config-daemon-npvcg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:06:53 crc kubenswrapper[4878]: I1202 20:06:53.743063 4878 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:06:53 crc kubenswrapper[4878]: I1202 20:06:53.743137 4878 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" Dec 02 20:06:53 crc kubenswrapper[4878]: I1202 20:06:53.744391 4878 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798"} pod="openshift-machine-config-operator/machine-config-daemon-npvcg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 20:06:53 crc kubenswrapper[4878]: I1202 
20:06:53.744472 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" containerName="machine-config-daemon" containerID="cri-o://826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" gracePeriod=600 Dec 02 20:06:53 crc kubenswrapper[4878]: E1202 20:06:53.880774 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:06:54 crc kubenswrapper[4878]: I1202 20:06:54.445579 4878 generic.go:334] "Generic (PLEG): container finished" podID="723bfeea-9234-4d2a-8492-747dc974d044" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" exitCode=0 Dec 02 20:06:54 crc kubenswrapper[4878]: I1202 20:06:54.445719 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerDied","Data":"826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798"} Dec 02 20:06:54 crc kubenswrapper[4878]: I1202 20:06:54.445953 4878 scope.go:117] "RemoveContainer" containerID="f659bcb967b3ab0df4defdcfba9bdeebb874762a37661bdd5221e1047bf9dd43" Dec 02 20:06:54 crc kubenswrapper[4878]: I1202 20:06:54.446863 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:06:54 crc kubenswrapper[4878]: E1202 20:06:54.447328 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:06:55 crc kubenswrapper[4878]: I1202 20:06:55.518326 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn_bf6b244f-0cc7-4f5d-9522-9fe4e65897a6/util/0.log" Dec 02 20:06:55 crc kubenswrapper[4878]: I1202 20:06:55.759720 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn_bf6b244f-0cc7-4f5d-9522-9fe4e65897a6/util/0.log" Dec 02 20:06:55 crc kubenswrapper[4878]: I1202 20:06:55.764153 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn_bf6b244f-0cc7-4f5d-9522-9fe4e65897a6/pull/0.log" Dec 02 20:06:55 crc kubenswrapper[4878]: I1202 20:06:55.807978 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn_bf6b244f-0cc7-4f5d-9522-9fe4e65897a6/pull/0.log" Dec 02 20:06:56 crc kubenswrapper[4878]: I1202 20:06:56.012190 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn_bf6b244f-0cc7-4f5d-9522-9fe4e65897a6/util/0.log" Dec 02 20:06:56 crc kubenswrapper[4878]: I1202 20:06:56.013050 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn_bf6b244f-0cc7-4f5d-9522-9fe4e65897a6/extract/0.log" Dec 02 20:06:56 crc kubenswrapper[4878]: I1202 20:06:56.064548 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8rknhn_bf6b244f-0cc7-4f5d-9522-9fe4e65897a6/pull/0.log" Dec 02 20:06:56 crc kubenswrapper[4878]: I1202 20:06:56.193108 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs_2a985616-194f-418e-ba7c-ee6fe105df8c/util/0.log" Dec 02 20:06:56 crc kubenswrapper[4878]: I1202 20:06:56.374116 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs_2a985616-194f-418e-ba7c-ee6fe105df8c/pull/0.log" Dec 02 20:06:56 crc kubenswrapper[4878]: I1202 20:06:56.375463 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs_2a985616-194f-418e-ba7c-ee6fe105df8c/pull/0.log" Dec 02 20:06:56 crc kubenswrapper[4878]: I1202 20:06:56.408381 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs_2a985616-194f-418e-ba7c-ee6fe105df8c/util/0.log" Dec 02 20:06:56 crc kubenswrapper[4878]: I1202 20:06:56.561678 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs_2a985616-194f-418e-ba7c-ee6fe105df8c/pull/0.log" Dec 02 20:06:56 crc kubenswrapper[4878]: I1202 20:06:56.566103 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs_2a985616-194f-418e-ba7c-ee6fe105df8c/util/0.log" Dec 02 20:06:56 crc kubenswrapper[4878]: I1202 20:06:56.582164 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftxpjs_2a985616-194f-418e-ba7c-ee6fe105df8c/extract/0.log" Dec 02 
20:06:56 crc kubenswrapper[4878]: I1202 20:06:56.749327 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb_28d45a6b-6f64-487d-bac3-60ce7a1321f7/util/0.log" Dec 02 20:06:56 crc kubenswrapper[4878]: I1202 20:06:56.920937 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb_28d45a6b-6f64-487d-bac3-60ce7a1321f7/pull/0.log" Dec 02 20:06:56 crc kubenswrapper[4878]: I1202 20:06:56.936222 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb_28d45a6b-6f64-487d-bac3-60ce7a1321f7/util/0.log" Dec 02 20:06:56 crc kubenswrapper[4878]: I1202 20:06:56.953566 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb_28d45a6b-6f64-487d-bac3-60ce7a1321f7/pull/0.log" Dec 02 20:06:57 crc kubenswrapper[4878]: I1202 20:06:57.198086 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb_28d45a6b-6f64-487d-bac3-60ce7a1321f7/extract/0.log" Dec 02 20:06:57 crc kubenswrapper[4878]: I1202 20:06:57.199561 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb_28d45a6b-6f64-487d-bac3-60ce7a1321f7/util/0.log" Dec 02 20:06:57 crc kubenswrapper[4878]: I1202 20:06:57.206131 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210w47jb_28d45a6b-6f64-487d-bac3-60ce7a1321f7/pull/0.log" Dec 02 20:06:57 crc kubenswrapper[4878]: I1202 20:06:57.388886 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r_abd35f96-61cf-48b3-b66d-c3d54414bb74/util/0.log" Dec 02 20:06:57 crc kubenswrapper[4878]: I1202 20:06:57.574135 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r_abd35f96-61cf-48b3-b66d-c3d54414bb74/util/0.log" Dec 02 20:06:57 crc kubenswrapper[4878]: I1202 20:06:57.576837 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r_abd35f96-61cf-48b3-b66d-c3d54414bb74/pull/0.log" Dec 02 20:06:57 crc kubenswrapper[4878]: I1202 20:06:57.582332 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r_abd35f96-61cf-48b3-b66d-c3d54414bb74/pull/0.log" Dec 02 20:06:57 crc kubenswrapper[4878]: I1202 20:06:57.847892 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r_abd35f96-61cf-48b3-b66d-c3d54414bb74/util/0.log" Dec 02 20:06:57 crc kubenswrapper[4878]: I1202 20:06:57.849405 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r_abd35f96-61cf-48b3-b66d-c3d54414bb74/extract/0.log" Dec 02 20:06:57 crc kubenswrapper[4878]: I1202 20:06:57.886514 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f8pz4r_abd35f96-61cf-48b3-b66d-c3d54414bb74/pull/0.log" Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.096022 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt_caa9ae90-89de-4467-a6e1-1043d45ed9e8/util/0.log" Dec 02 
20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.281651 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt_caa9ae90-89de-4467-a6e1-1043d45ed9e8/pull/0.log" Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.285688 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt_caa9ae90-89de-4467-a6e1-1043d45ed9e8/pull/0.log" Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.292990 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt_caa9ae90-89de-4467-a6e1-1043d45ed9e8/util/0.log" Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.459349 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt_caa9ae90-89de-4467-a6e1-1043d45ed9e8/extract/0.log" Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.478135 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt_caa9ae90-89de-4467-a6e1-1043d45ed9e8/util/0.log" Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.519208 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83vm9bt_caa9ae90-89de-4467-a6e1-1043d45ed9e8/pull/0.log" Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.587941 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ddsmw"] Dec 02 20:06:58 crc kubenswrapper[4878]: E1202 20:06:58.588483 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a49f90b3-ff52-445e-820c-37988aa9ed9b" containerName="container-00" Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 
20:06:58.588496 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="a49f90b3-ff52-445e-820c-37988aa9ed9b" containerName="container-00" Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.588947 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="a49f90b3-ff52-445e-820c-37988aa9ed9b" containerName="container-00" Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.590848 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ddsmw" Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.599250 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ddsmw"] Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.702247 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdpdg\" (UniqueName: \"kubernetes.io/projected/19d6d212-d514-4b22-9609-acaf4a676ac2-kube-api-access-fdpdg\") pod \"community-operators-ddsmw\" (UID: \"19d6d212-d514-4b22-9609-acaf4a676ac2\") " pod="openshift-marketplace/community-operators-ddsmw" Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.702620 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19d6d212-d514-4b22-9609-acaf4a676ac2-catalog-content\") pod \"community-operators-ddsmw\" (UID: \"19d6d212-d514-4b22-9609-acaf4a676ac2\") " pod="openshift-marketplace/community-operators-ddsmw" Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.702806 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19d6d212-d514-4b22-9609-acaf4a676ac2-utilities\") pod \"community-operators-ddsmw\" (UID: \"19d6d212-d514-4b22-9609-acaf4a676ac2\") " pod="openshift-marketplace/community-operators-ddsmw" Dec 02 20:06:58 crc kubenswrapper[4878]: 
I1202 20:06:58.784477 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxh8k_169b29b3-0fc5-499b-b05c-83469da6c269/extract-utilities/0.log" Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.804936 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdpdg\" (UniqueName: \"kubernetes.io/projected/19d6d212-d514-4b22-9609-acaf4a676ac2-kube-api-access-fdpdg\") pod \"community-operators-ddsmw\" (UID: \"19d6d212-d514-4b22-9609-acaf4a676ac2\") " pod="openshift-marketplace/community-operators-ddsmw" Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.805284 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19d6d212-d514-4b22-9609-acaf4a676ac2-catalog-content\") pod \"community-operators-ddsmw\" (UID: \"19d6d212-d514-4b22-9609-acaf4a676ac2\") " pod="openshift-marketplace/community-operators-ddsmw" Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.805530 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19d6d212-d514-4b22-9609-acaf4a676ac2-utilities\") pod \"community-operators-ddsmw\" (UID: \"19d6d212-d514-4b22-9609-acaf4a676ac2\") " pod="openshift-marketplace/community-operators-ddsmw" Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.806189 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19d6d212-d514-4b22-9609-acaf4a676ac2-catalog-content\") pod \"community-operators-ddsmw\" (UID: \"19d6d212-d514-4b22-9609-acaf4a676ac2\") " pod="openshift-marketplace/community-operators-ddsmw" Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.806195 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/19d6d212-d514-4b22-9609-acaf4a676ac2-utilities\") pod \"community-operators-ddsmw\" (UID: \"19d6d212-d514-4b22-9609-acaf4a676ac2\") " pod="openshift-marketplace/community-operators-ddsmw" Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.826973 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdpdg\" (UniqueName: \"kubernetes.io/projected/19d6d212-d514-4b22-9609-acaf4a676ac2-kube-api-access-fdpdg\") pod \"community-operators-ddsmw\" (UID: \"19d6d212-d514-4b22-9609-acaf4a676ac2\") " pod="openshift-marketplace/community-operators-ddsmw" Dec 02 20:06:58 crc kubenswrapper[4878]: I1202 20:06:58.962300 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ddsmw" Dec 02 20:06:59 crc kubenswrapper[4878]: I1202 20:06:59.024452 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxh8k_169b29b3-0fc5-499b-b05c-83469da6c269/extract-utilities/0.log" Dec 02 20:06:59 crc kubenswrapper[4878]: I1202 20:06:59.028824 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxh8k_169b29b3-0fc5-499b-b05c-83469da6c269/extract-content/0.log" Dec 02 20:06:59 crc kubenswrapper[4878]: I1202 20:06:59.046044 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxh8k_169b29b3-0fc5-499b-b05c-83469da6c269/extract-content/0.log" Dec 02 20:06:59 crc kubenswrapper[4878]: I1202 20:06:59.395593 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxh8k_169b29b3-0fc5-499b-b05c-83469da6c269/extract-utilities/0.log" Dec 02 20:06:59 crc kubenswrapper[4878]: I1202 20:06:59.649789 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-46kr4_2b1d265e-ec12-4052-83ec-5db3bd68b034/extract-utilities/0.log" Dec 02 
20:06:59 crc kubenswrapper[4878]: I1202 20:06:59.762913 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxh8k_169b29b3-0fc5-499b-b05c-83469da6c269/extract-content/0.log" Dec 02 20:06:59 crc kubenswrapper[4878]: I1202 20:06:59.928164 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ddsmw"] Dec 02 20:06:59 crc kubenswrapper[4878]: I1202 20:06:59.957118 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-46kr4_2b1d265e-ec12-4052-83ec-5db3bd68b034/extract-utilities/0.log" Dec 02 20:06:59 crc kubenswrapper[4878]: I1202 20:06:59.976879 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-46kr4_2b1d265e-ec12-4052-83ec-5db3bd68b034/extract-content/0.log" Dec 02 20:07:00 crc kubenswrapper[4878]: I1202 20:07:00.003693 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-46kr4_2b1d265e-ec12-4052-83ec-5db3bd68b034/extract-content/0.log" Dec 02 20:07:00 crc kubenswrapper[4878]: I1202 20:07:00.179830 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-46kr4_2b1d265e-ec12-4052-83ec-5db3bd68b034/extract-utilities/0.log" Dec 02 20:07:00 crc kubenswrapper[4878]: I1202 20:07:00.254494 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-46kr4_2b1d265e-ec12-4052-83ec-5db3bd68b034/extract-content/0.log" Dec 02 20:07:00 crc kubenswrapper[4878]: I1202 20:07:00.444333 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8xkxg_11c10f5a-0137-467d-a749-1bce1c6210ed/marketplace-operator/0.log" Dec 02 20:07:00 crc kubenswrapper[4878]: I1202 20:07:00.456555 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-qxh8k_169b29b3-0fc5-499b-b05c-83469da6c269/registry-server/0.log" Dec 02 20:07:00 crc kubenswrapper[4878]: I1202 20:07:00.487622 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pq9p9_c55a4df1-0e5a-43aa-9de4-e216aeb40407/extract-utilities/0.log" Dec 02 20:07:00 crc kubenswrapper[4878]: I1202 20:07:00.521318 4878 generic.go:334] "Generic (PLEG): container finished" podID="19d6d212-d514-4b22-9609-acaf4a676ac2" containerID="97709135730bf97c4178191419ad49c1e710a79416bb0666edf4e3a65157dbb2" exitCode=0 Dec 02 20:07:00 crc kubenswrapper[4878]: I1202 20:07:00.521364 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ddsmw" event={"ID":"19d6d212-d514-4b22-9609-acaf4a676ac2","Type":"ContainerDied","Data":"97709135730bf97c4178191419ad49c1e710a79416bb0666edf4e3a65157dbb2"} Dec 02 20:07:00 crc kubenswrapper[4878]: I1202 20:07:00.521389 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ddsmw" event={"ID":"19d6d212-d514-4b22-9609-acaf4a676ac2","Type":"ContainerStarted","Data":"58ffd5d4eaa59b0a04eb48abefdba202857a40dd5ddcdd2c32f2899a020af5f1"} Dec 02 20:07:00 crc kubenswrapper[4878]: I1202 20:07:00.524645 4878 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 20:07:00 crc kubenswrapper[4878]: I1202 20:07:00.655510 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pq9p9_c55a4df1-0e5a-43aa-9de4-e216aeb40407/extract-content/0.log" Dec 02 20:07:00 crc kubenswrapper[4878]: I1202 20:07:00.661308 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pq9p9_c55a4df1-0e5a-43aa-9de4-e216aeb40407/extract-utilities/0.log" Dec 02 20:07:00 crc kubenswrapper[4878]: I1202 20:07:00.714442 4878 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pq9p9_c55a4df1-0e5a-43aa-9de4-e216aeb40407/extract-content/0.log" Dec 02 20:07:00 crc kubenswrapper[4878]: I1202 20:07:00.954888 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pq9p9_c55a4df1-0e5a-43aa-9de4-e216aeb40407/extract-utilities/0.log" Dec 02 20:07:00 crc kubenswrapper[4878]: I1202 20:07:00.971960 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pq9p9_c55a4df1-0e5a-43aa-9de4-e216aeb40407/extract-content/0.log" Dec 02 20:07:01 crc kubenswrapper[4878]: I1202 20:07:01.148513 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j4f2d_651270f6-1566-4359-b11c-561ee744e88f/extract-utilities/0.log" Dec 02 20:07:01 crc kubenswrapper[4878]: I1202 20:07:01.269184 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pq9p9_c55a4df1-0e5a-43aa-9de4-e216aeb40407/registry-server/0.log" Dec 02 20:07:01 crc kubenswrapper[4878]: I1202 20:07:01.301718 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-46kr4_2b1d265e-ec12-4052-83ec-5db3bd68b034/registry-server/0.log" Dec 02 20:07:01 crc kubenswrapper[4878]: I1202 20:07:01.365526 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j4f2d_651270f6-1566-4359-b11c-561ee744e88f/extract-content/0.log" Dec 02 20:07:01 crc kubenswrapper[4878]: I1202 20:07:01.365759 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j4f2d_651270f6-1566-4359-b11c-561ee744e88f/extract-content/0.log" Dec 02 20:07:01 crc kubenswrapper[4878]: I1202 20:07:01.414424 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-j4f2d_651270f6-1566-4359-b11c-561ee744e88f/extract-utilities/0.log" Dec 02 20:07:01 crc kubenswrapper[4878]: I1202 20:07:01.535734 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ddsmw" event={"ID":"19d6d212-d514-4b22-9609-acaf4a676ac2","Type":"ContainerStarted","Data":"738a63a95309ca96176e2942c7e84c6cd024b82c9c64cc10db7573f6b3f9684a"} Dec 02 20:07:01 crc kubenswrapper[4878]: I1202 20:07:01.576864 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j4f2d_651270f6-1566-4359-b11c-561ee744e88f/extract-utilities/0.log" Dec 02 20:07:01 crc kubenswrapper[4878]: I1202 20:07:01.612999 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j4f2d_651270f6-1566-4359-b11c-561ee744e88f/extract-content/0.log" Dec 02 20:07:02 crc kubenswrapper[4878]: I1202 20:07:02.401666 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-j4f2d_651270f6-1566-4359-b11c-561ee744e88f/registry-server/0.log" Dec 02 20:07:02 crc kubenswrapper[4878]: I1202 20:07:02.546859 4878 generic.go:334] "Generic (PLEG): container finished" podID="19d6d212-d514-4b22-9609-acaf4a676ac2" containerID="738a63a95309ca96176e2942c7e84c6cd024b82c9c64cc10db7573f6b3f9684a" exitCode=0 Dec 02 20:07:02 crc kubenswrapper[4878]: I1202 20:07:02.546902 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ddsmw" event={"ID":"19d6d212-d514-4b22-9609-acaf4a676ac2","Type":"ContainerDied","Data":"738a63a95309ca96176e2942c7e84c6cd024b82c9c64cc10db7573f6b3f9684a"} Dec 02 20:07:03 crc kubenswrapper[4878]: I1202 20:07:03.560116 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ddsmw" 
event={"ID":"19d6d212-d514-4b22-9609-acaf4a676ac2","Type":"ContainerStarted","Data":"a85131c7eedd13a76d9db17fd96ba050d1ea1d193b9d81c0420293f4069f4268"} Dec 02 20:07:03 crc kubenswrapper[4878]: I1202 20:07:03.592192 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ddsmw" podStartSLOduration=3.196127732 podStartE2EDuration="5.592147163s" podCreationTimestamp="2025-12-02 20:06:58 +0000 UTC" firstStartedPulling="2025-12-02 20:07:00.523435616 +0000 UTC m=+6730.213054497" lastFinishedPulling="2025-12-02 20:07:02.919455047 +0000 UTC m=+6732.609073928" observedRunningTime="2025-12-02 20:07:03.57891923 +0000 UTC m=+6733.268538121" watchObservedRunningTime="2025-12-02 20:07:03.592147163 +0000 UTC m=+6733.281766044" Dec 02 20:07:05 crc kubenswrapper[4878]: I1202 20:07:05.938562 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:07:05 crc kubenswrapper[4878]: E1202 20:07:05.939927 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:07:08 crc kubenswrapper[4878]: I1202 20:07:08.977265 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ddsmw" Dec 02 20:07:08 crc kubenswrapper[4878]: I1202 20:07:08.979983 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ddsmw" Dec 02 20:07:09 crc kubenswrapper[4878]: I1202 20:07:09.036659 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-ddsmw" Dec 02 20:07:09 crc kubenswrapper[4878]: I1202 20:07:09.702653 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ddsmw" Dec 02 20:07:09 crc kubenswrapper[4878]: I1202 20:07:09.828723 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ddsmw"] Dec 02 20:07:11 crc kubenswrapper[4878]: I1202 20:07:11.661889 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ddsmw" podUID="19d6d212-d514-4b22-9609-acaf4a676ac2" containerName="registry-server" containerID="cri-o://a85131c7eedd13a76d9db17fd96ba050d1ea1d193b9d81c0420293f4069f4268" gracePeriod=2 Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.222557 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ddsmw" Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.334657 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19d6d212-d514-4b22-9609-acaf4a676ac2-utilities\") pod \"19d6d212-d514-4b22-9609-acaf4a676ac2\" (UID: \"19d6d212-d514-4b22-9609-acaf4a676ac2\") " Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.335079 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19d6d212-d514-4b22-9609-acaf4a676ac2-catalog-content\") pod \"19d6d212-d514-4b22-9609-acaf4a676ac2\" (UID: \"19d6d212-d514-4b22-9609-acaf4a676ac2\") " Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.335435 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdpdg\" (UniqueName: \"kubernetes.io/projected/19d6d212-d514-4b22-9609-acaf4a676ac2-kube-api-access-fdpdg\") pod 
\"19d6d212-d514-4b22-9609-acaf4a676ac2\" (UID: \"19d6d212-d514-4b22-9609-acaf4a676ac2\") " Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.335603 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19d6d212-d514-4b22-9609-acaf4a676ac2-utilities" (OuterVolumeSpecName: "utilities") pod "19d6d212-d514-4b22-9609-acaf4a676ac2" (UID: "19d6d212-d514-4b22-9609-acaf4a676ac2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.336568 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19d6d212-d514-4b22-9609-acaf4a676ac2-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.342251 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19d6d212-d514-4b22-9609-acaf4a676ac2-kube-api-access-fdpdg" (OuterVolumeSpecName: "kube-api-access-fdpdg") pod "19d6d212-d514-4b22-9609-acaf4a676ac2" (UID: "19d6d212-d514-4b22-9609-acaf4a676ac2"). InnerVolumeSpecName "kube-api-access-fdpdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.401284 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19d6d212-d514-4b22-9609-acaf4a676ac2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19d6d212-d514-4b22-9609-acaf4a676ac2" (UID: "19d6d212-d514-4b22-9609-acaf4a676ac2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.439037 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19d6d212-d514-4b22-9609-acaf4a676ac2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.439073 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdpdg\" (UniqueName: \"kubernetes.io/projected/19d6d212-d514-4b22-9609-acaf4a676ac2-kube-api-access-fdpdg\") on node \"crc\" DevicePath \"\"" Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.677097 4878 generic.go:334] "Generic (PLEG): container finished" podID="19d6d212-d514-4b22-9609-acaf4a676ac2" containerID="a85131c7eedd13a76d9db17fd96ba050d1ea1d193b9d81c0420293f4069f4268" exitCode=0 Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.677176 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ddsmw" event={"ID":"19d6d212-d514-4b22-9609-acaf4a676ac2","Type":"ContainerDied","Data":"a85131c7eedd13a76d9db17fd96ba050d1ea1d193b9d81c0420293f4069f4268"} Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.677214 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ddsmw" Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.678760 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ddsmw" event={"ID":"19d6d212-d514-4b22-9609-acaf4a676ac2","Type":"ContainerDied","Data":"58ffd5d4eaa59b0a04eb48abefdba202857a40dd5ddcdd2c32f2899a020af5f1"} Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.678879 4878 scope.go:117] "RemoveContainer" containerID="a85131c7eedd13a76d9db17fd96ba050d1ea1d193b9d81c0420293f4069f4268" Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.709616 4878 scope.go:117] "RemoveContainer" containerID="738a63a95309ca96176e2942c7e84c6cd024b82c9c64cc10db7573f6b3f9684a" Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.756086 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ddsmw"] Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.765058 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ddsmw"] Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.776404 4878 scope.go:117] "RemoveContainer" containerID="97709135730bf97c4178191419ad49c1e710a79416bb0666edf4e3a65157dbb2" Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.813732 4878 scope.go:117] "RemoveContainer" containerID="a85131c7eedd13a76d9db17fd96ba050d1ea1d193b9d81c0420293f4069f4268" Dec 02 20:07:12 crc kubenswrapper[4878]: E1202 20:07:12.814207 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a85131c7eedd13a76d9db17fd96ba050d1ea1d193b9d81c0420293f4069f4268\": container with ID starting with a85131c7eedd13a76d9db17fd96ba050d1ea1d193b9d81c0420293f4069f4268 not found: ID does not exist" containerID="a85131c7eedd13a76d9db17fd96ba050d1ea1d193b9d81c0420293f4069f4268" Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.814350 4878 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a85131c7eedd13a76d9db17fd96ba050d1ea1d193b9d81c0420293f4069f4268"} err="failed to get container status \"a85131c7eedd13a76d9db17fd96ba050d1ea1d193b9d81c0420293f4069f4268\": rpc error: code = NotFound desc = could not find container \"a85131c7eedd13a76d9db17fd96ba050d1ea1d193b9d81c0420293f4069f4268\": container with ID starting with a85131c7eedd13a76d9db17fd96ba050d1ea1d193b9d81c0420293f4069f4268 not found: ID does not exist" Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.814388 4878 scope.go:117] "RemoveContainer" containerID="738a63a95309ca96176e2942c7e84c6cd024b82c9c64cc10db7573f6b3f9684a" Dec 02 20:07:12 crc kubenswrapper[4878]: E1202 20:07:12.814686 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"738a63a95309ca96176e2942c7e84c6cd024b82c9c64cc10db7573f6b3f9684a\": container with ID starting with 738a63a95309ca96176e2942c7e84c6cd024b82c9c64cc10db7573f6b3f9684a not found: ID does not exist" containerID="738a63a95309ca96176e2942c7e84c6cd024b82c9c64cc10db7573f6b3f9684a" Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.814720 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"738a63a95309ca96176e2942c7e84c6cd024b82c9c64cc10db7573f6b3f9684a"} err="failed to get container status \"738a63a95309ca96176e2942c7e84c6cd024b82c9c64cc10db7573f6b3f9684a\": rpc error: code = NotFound desc = could not find container \"738a63a95309ca96176e2942c7e84c6cd024b82c9c64cc10db7573f6b3f9684a\": container with ID starting with 738a63a95309ca96176e2942c7e84c6cd024b82c9c64cc10db7573f6b3f9684a not found: ID does not exist" Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.814742 4878 scope.go:117] "RemoveContainer" containerID="97709135730bf97c4178191419ad49c1e710a79416bb0666edf4e3a65157dbb2" Dec 02 20:07:12 crc kubenswrapper[4878]: E1202 
20:07:12.815123 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97709135730bf97c4178191419ad49c1e710a79416bb0666edf4e3a65157dbb2\": container with ID starting with 97709135730bf97c4178191419ad49c1e710a79416bb0666edf4e3a65157dbb2 not found: ID does not exist" containerID="97709135730bf97c4178191419ad49c1e710a79416bb0666edf4e3a65157dbb2" Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.815172 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97709135730bf97c4178191419ad49c1e710a79416bb0666edf4e3a65157dbb2"} err="failed to get container status \"97709135730bf97c4178191419ad49c1e710a79416bb0666edf4e3a65157dbb2\": rpc error: code = NotFound desc = could not find container \"97709135730bf97c4178191419ad49c1e710a79416bb0666edf4e3a65157dbb2\": container with ID starting with 97709135730bf97c4178191419ad49c1e710a79416bb0666edf4e3a65157dbb2 not found: ID does not exist" Dec 02 20:07:12 crc kubenswrapper[4878]: I1202 20:07:12.955461 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19d6d212-d514-4b22-9609-acaf4a676ac2" path="/var/lib/kubelet/pods/19d6d212-d514-4b22-9609-acaf4a676ac2/volumes" Dec 02 20:07:15 crc kubenswrapper[4878]: I1202 20:07:15.043363 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-ffbv7_302b5d81-5163-4052-a986-6fbdda49e9cf/prometheus-operator/0.log" Dec 02 20:07:15 crc kubenswrapper[4878]: I1202 20:07:15.209660 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-55bb47f485-749dx_baa1d367-077f-4aa3-8dca-5a56cff08838/prometheus-operator-admission-webhook/0.log" Dec 02 20:07:15 crc kubenswrapper[4878]: I1202 20:07:15.286367 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-55bb47f485-h9lpl_95fa4f57-a446-402b-9de4-5ff0d8109802/prometheus-operator-admission-webhook/0.log" Dec 02 20:07:15 crc kubenswrapper[4878]: I1202 20:07:15.437322 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-ctrvg_fee86c2f-ee2c-49c4-a96c-f59e7ef28524/operator/0.log" Dec 02 20:07:15 crc kubenswrapper[4878]: I1202 20:07:15.545626 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7d5fb4cbfb-zhq9q_528e5b70-2773-48c2-8382-d4e2ec45933d/observability-ui-dashboards/0.log" Dec 02 20:07:15 crc kubenswrapper[4878]: I1202 20:07:15.638400 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-s7xc9_161358a3-71af-4def-b6a0-0ba9b5f2a7b3/perses-operator/0.log" Dec 02 20:07:18 crc kubenswrapper[4878]: I1202 20:07:18.938294 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:07:18 crc kubenswrapper[4878]: E1202 20:07:18.939015 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:07:28 crc kubenswrapper[4878]: I1202 20:07:28.975794 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-846f878689-bhh7m_6ace3da2-70e9-4d80-a8ad-5a8e1bb062df/kube-rbac-proxy/0.log" Dec 02 20:07:29 crc kubenswrapper[4878]: I1202 20:07:29.028770 4878 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-846f878689-bhh7m_6ace3da2-70e9-4d80-a8ad-5a8e1bb062df/manager/0.log" Dec 02 20:07:31 crc kubenswrapper[4878]: I1202 20:07:31.937509 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:07:31 crc kubenswrapper[4878]: E1202 20:07:31.938412 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:07:42 crc kubenswrapper[4878]: I1202 20:07:42.938838 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:07:42 crc kubenswrapper[4878]: E1202 20:07:42.939818 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:07:53 crc kubenswrapper[4878]: I1202 20:07:53.937878 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:07:53 crc kubenswrapper[4878]: E1202 20:07:53.938980 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:08:06 crc kubenswrapper[4878]: I1202 20:08:06.938516 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:08:06 crc kubenswrapper[4878]: E1202 20:08:06.940396 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:08:18 crc kubenswrapper[4878]: I1202 20:08:18.938187 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:08:18 crc kubenswrapper[4878]: E1202 20:08:18.939297 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:08:32 crc kubenswrapper[4878]: I1202 20:08:32.940852 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:08:32 crc kubenswrapper[4878]: E1202 20:08:32.943151 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:08:47 crc kubenswrapper[4878]: I1202 20:08:47.938594 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:08:47 crc kubenswrapper[4878]: E1202 20:08:47.939735 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:09:01 crc kubenswrapper[4878]: I1202 20:09:01.938369 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:09:01 crc kubenswrapper[4878]: E1202 20:09:01.939318 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:09:07 crc kubenswrapper[4878]: I1202 20:09:07.347336 4878 scope.go:117] "RemoveContainer" containerID="e14b69342559563b96fb45fa41b6f8b0357962f2b7c8c6558ef8fa174432fec8" Dec 02 20:09:13 crc kubenswrapper[4878]: I1202 20:09:13.286341 4878 generic.go:334] "Generic (PLEG): container finished" podID="1c170e0c-9765-42e1-b77c-6c881ef202fa" 
containerID="f5a8bb270a202990193c9664b336ea6b1623fe01a4fccfbb87d5e9e62ad3437b" exitCode=0 Dec 02 20:09:13 crc kubenswrapper[4878]: I1202 20:09:13.286432 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8sbrm/must-gather-jktfm" event={"ID":"1c170e0c-9765-42e1-b77c-6c881ef202fa","Type":"ContainerDied","Data":"f5a8bb270a202990193c9664b336ea6b1623fe01a4fccfbb87d5e9e62ad3437b"} Dec 02 20:09:13 crc kubenswrapper[4878]: I1202 20:09:13.288310 4878 scope.go:117] "RemoveContainer" containerID="f5a8bb270a202990193c9664b336ea6b1623fe01a4fccfbb87d5e9e62ad3437b" Dec 02 20:09:13 crc kubenswrapper[4878]: I1202 20:09:13.935295 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8sbrm_must-gather-jktfm_1c170e0c-9765-42e1-b77c-6c881ef202fa/gather/0.log" Dec 02 20:09:15 crc kubenswrapper[4878]: I1202 20:09:15.937927 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:09:15 crc kubenswrapper[4878]: E1202 20:09:15.939095 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:09:25 crc kubenswrapper[4878]: I1202 20:09:25.883340 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8sbrm/must-gather-jktfm"] Dec 02 20:09:25 crc kubenswrapper[4878]: I1202 20:09:25.884795 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-8sbrm/must-gather-jktfm" podUID="1c170e0c-9765-42e1-b77c-6c881ef202fa" containerName="copy" containerID="cri-o://a357e1c624f7d1e857b65cefcd06cf00bae1b5dedab7c8507a18578fa887e059" 
gracePeriod=2 Dec 02 20:09:25 crc kubenswrapper[4878]: I1202 20:09:25.906405 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8sbrm/must-gather-jktfm"] Dec 02 20:09:26 crc kubenswrapper[4878]: I1202 20:09:26.417179 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8sbrm_must-gather-jktfm_1c170e0c-9765-42e1-b77c-6c881ef202fa/copy/0.log" Dec 02 20:09:26 crc kubenswrapper[4878]: I1202 20:09:26.417876 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8sbrm/must-gather-jktfm" Dec 02 20:09:26 crc kubenswrapper[4878]: I1202 20:09:26.438803 4878 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8sbrm_must-gather-jktfm_1c170e0c-9765-42e1-b77c-6c881ef202fa/copy/0.log" Dec 02 20:09:26 crc kubenswrapper[4878]: I1202 20:09:26.439529 4878 generic.go:334] "Generic (PLEG): container finished" podID="1c170e0c-9765-42e1-b77c-6c881ef202fa" containerID="a357e1c624f7d1e857b65cefcd06cf00bae1b5dedab7c8507a18578fa887e059" exitCode=143 Dec 02 20:09:26 crc kubenswrapper[4878]: I1202 20:09:26.439594 4878 scope.go:117] "RemoveContainer" containerID="a357e1c624f7d1e857b65cefcd06cf00bae1b5dedab7c8507a18578fa887e059" Dec 02 20:09:26 crc kubenswrapper[4878]: I1202 20:09:26.439623 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8sbrm/must-gather-jktfm" Dec 02 20:09:26 crc kubenswrapper[4878]: I1202 20:09:26.486256 4878 scope.go:117] "RemoveContainer" containerID="f5a8bb270a202990193c9664b336ea6b1623fe01a4fccfbb87d5e9e62ad3437b" Dec 02 20:09:26 crc kubenswrapper[4878]: I1202 20:09:26.522193 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1c170e0c-9765-42e1-b77c-6c881ef202fa-must-gather-output\") pod \"1c170e0c-9765-42e1-b77c-6c881ef202fa\" (UID: \"1c170e0c-9765-42e1-b77c-6c881ef202fa\") " Dec 02 20:09:26 crc kubenswrapper[4878]: I1202 20:09:26.522385 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jj42\" (UniqueName: \"kubernetes.io/projected/1c170e0c-9765-42e1-b77c-6c881ef202fa-kube-api-access-9jj42\") pod \"1c170e0c-9765-42e1-b77c-6c881ef202fa\" (UID: \"1c170e0c-9765-42e1-b77c-6c881ef202fa\") " Dec 02 20:09:26 crc kubenswrapper[4878]: I1202 20:09:26.546991 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c170e0c-9765-42e1-b77c-6c881ef202fa-kube-api-access-9jj42" (OuterVolumeSpecName: "kube-api-access-9jj42") pod "1c170e0c-9765-42e1-b77c-6c881ef202fa" (UID: "1c170e0c-9765-42e1-b77c-6c881ef202fa"). InnerVolumeSpecName "kube-api-access-9jj42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:09:26 crc kubenswrapper[4878]: I1202 20:09:26.579473 4878 scope.go:117] "RemoveContainer" containerID="a357e1c624f7d1e857b65cefcd06cf00bae1b5dedab7c8507a18578fa887e059" Dec 02 20:09:26 crc kubenswrapper[4878]: E1202 20:09:26.579996 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a357e1c624f7d1e857b65cefcd06cf00bae1b5dedab7c8507a18578fa887e059\": container with ID starting with a357e1c624f7d1e857b65cefcd06cf00bae1b5dedab7c8507a18578fa887e059 not found: ID does not exist" containerID="a357e1c624f7d1e857b65cefcd06cf00bae1b5dedab7c8507a18578fa887e059" Dec 02 20:09:26 crc kubenswrapper[4878]: I1202 20:09:26.580050 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a357e1c624f7d1e857b65cefcd06cf00bae1b5dedab7c8507a18578fa887e059"} err="failed to get container status \"a357e1c624f7d1e857b65cefcd06cf00bae1b5dedab7c8507a18578fa887e059\": rpc error: code = NotFound desc = could not find container \"a357e1c624f7d1e857b65cefcd06cf00bae1b5dedab7c8507a18578fa887e059\": container with ID starting with a357e1c624f7d1e857b65cefcd06cf00bae1b5dedab7c8507a18578fa887e059 not found: ID does not exist" Dec 02 20:09:26 crc kubenswrapper[4878]: I1202 20:09:26.580079 4878 scope.go:117] "RemoveContainer" containerID="f5a8bb270a202990193c9664b336ea6b1623fe01a4fccfbb87d5e9e62ad3437b" Dec 02 20:09:26 crc kubenswrapper[4878]: E1202 20:09:26.583150 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5a8bb270a202990193c9664b336ea6b1623fe01a4fccfbb87d5e9e62ad3437b\": container with ID starting with f5a8bb270a202990193c9664b336ea6b1623fe01a4fccfbb87d5e9e62ad3437b not found: ID does not exist" containerID="f5a8bb270a202990193c9664b336ea6b1623fe01a4fccfbb87d5e9e62ad3437b" Dec 02 20:09:26 crc kubenswrapper[4878]: I1202 20:09:26.583230 
4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5a8bb270a202990193c9664b336ea6b1623fe01a4fccfbb87d5e9e62ad3437b"} err="failed to get container status \"f5a8bb270a202990193c9664b336ea6b1623fe01a4fccfbb87d5e9e62ad3437b\": rpc error: code = NotFound desc = could not find container \"f5a8bb270a202990193c9664b336ea6b1623fe01a4fccfbb87d5e9e62ad3437b\": container with ID starting with f5a8bb270a202990193c9664b336ea6b1623fe01a4fccfbb87d5e9e62ad3437b not found: ID does not exist" Dec 02 20:09:26 crc kubenswrapper[4878]: I1202 20:09:26.627823 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jj42\" (UniqueName: \"kubernetes.io/projected/1c170e0c-9765-42e1-b77c-6c881ef202fa-kube-api-access-9jj42\") on node \"crc\" DevicePath \"\"" Dec 02 20:09:26 crc kubenswrapper[4878]: I1202 20:09:26.731002 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c170e0c-9765-42e1-b77c-6c881ef202fa-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1c170e0c-9765-42e1-b77c-6c881ef202fa" (UID: "1c170e0c-9765-42e1-b77c-6c881ef202fa"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:09:26 crc kubenswrapper[4878]: I1202 20:09:26.731624 4878 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1c170e0c-9765-42e1-b77c-6c881ef202fa-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 02 20:09:26 crc kubenswrapper[4878]: I1202 20:09:26.950375 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c170e0c-9765-42e1-b77c-6c881ef202fa" path="/var/lib/kubelet/pods/1c170e0c-9765-42e1-b77c-6c881ef202fa/volumes" Dec 02 20:09:29 crc kubenswrapper[4878]: I1202 20:09:29.937898 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:09:29 crc kubenswrapper[4878]: E1202 20:09:29.939123 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:09:43 crc kubenswrapper[4878]: I1202 20:09:43.937963 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:09:43 crc kubenswrapper[4878]: E1202 20:09:43.939707 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:09:56 crc kubenswrapper[4878]: I1202 20:09:56.937875 4878 
scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:09:56 crc kubenswrapper[4878]: E1202 20:09:56.938778 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:10:07 crc kubenswrapper[4878]: I1202 20:10:07.424269 4878 scope.go:117] "RemoveContainer" containerID="ee1c282ca644bfd8eb01c1f50238086b1e4b7e7009ee14657278192b9764389b" Dec 02 20:10:09 crc kubenswrapper[4878]: I1202 20:10:09.938016 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:10:09 crc kubenswrapper[4878]: E1202 20:10:09.938886 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:10:22 crc kubenswrapper[4878]: I1202 20:10:22.937819 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:10:22 crc kubenswrapper[4878]: E1202 20:10:22.938875 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:10:33 crc kubenswrapper[4878]: I1202 20:10:33.939380 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:10:33 crc kubenswrapper[4878]: E1202 20:10:33.941398 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:10:44 crc kubenswrapper[4878]: I1202 20:10:44.937994 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:10:44 crc kubenswrapper[4878]: E1202 20:10:44.938941 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:10:58 crc kubenswrapper[4878]: I1202 20:10:58.938474 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:10:58 crc kubenswrapper[4878]: E1202 20:10:58.939410 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:11:12 crc kubenswrapper[4878]: I1202 20:11:12.938128 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:11:12 crc kubenswrapper[4878]: E1202 20:11:12.939040 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:11:26 crc kubenswrapper[4878]: I1202 20:11:26.938617 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:11:26 crc kubenswrapper[4878]: E1202 20:11:26.939214 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:11:41 crc kubenswrapper[4878]: I1202 20:11:41.939184 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:11:41 crc kubenswrapper[4878]: E1202 20:11:41.940025 4878 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-npvcg_openshift-machine-config-operator(723bfeea-9234-4d2a-8492-747dc974d044)\"" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" podUID="723bfeea-9234-4d2a-8492-747dc974d044" Dec 02 20:11:55 crc kubenswrapper[4878]: I1202 20:11:55.938888 4878 scope.go:117] "RemoveContainer" containerID="826ea548f3d6fb3e086aa95ce3673d975243eb35c45c0902cc57841a4f851798" Dec 02 20:11:56 crc kubenswrapper[4878]: I1202 20:11:56.287000 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-npvcg" event={"ID":"723bfeea-9234-4d2a-8492-747dc974d044","Type":"ContainerStarted","Data":"1f6412b3a7f3346f79c01ecdab2a8a5b0e6fc55a37c6fbfbf66d8fa5b575373f"} Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 20:12:10.332057 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-klwpd"] Dec 02 20:12:10 crc kubenswrapper[4878]: E1202 20:12:10.334054 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d6d212-d514-4b22-9609-acaf4a676ac2" containerName="extract-utilities" Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 20:12:10.334081 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d6d212-d514-4b22-9609-acaf4a676ac2" containerName="extract-utilities" Dec 02 20:12:10 crc kubenswrapper[4878]: E1202 20:12:10.334122 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d6d212-d514-4b22-9609-acaf4a676ac2" containerName="registry-server" Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 20:12:10.334135 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d6d212-d514-4b22-9609-acaf4a676ac2" containerName="registry-server" Dec 02 20:12:10 crc kubenswrapper[4878]: E1202 20:12:10.334172 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c170e0c-9765-42e1-b77c-6c881ef202fa" containerName="gather" Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 
20:12:10.334212 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c170e0c-9765-42e1-b77c-6c881ef202fa" containerName="gather" Dec 02 20:12:10 crc kubenswrapper[4878]: E1202 20:12:10.334270 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d6d212-d514-4b22-9609-acaf4a676ac2" containerName="extract-content" Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 20:12:10.334327 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d6d212-d514-4b22-9609-acaf4a676ac2" containerName="extract-content" Dec 02 20:12:10 crc kubenswrapper[4878]: E1202 20:12:10.334361 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c170e0c-9765-42e1-b77c-6c881ef202fa" containerName="copy" Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 20:12:10.334376 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c170e0c-9765-42e1-b77c-6c881ef202fa" containerName="copy" Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 20:12:10.334984 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c170e0c-9765-42e1-b77c-6c881ef202fa" containerName="copy" Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 20:12:10.335019 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d6d212-d514-4b22-9609-acaf4a676ac2" containerName="registry-server" Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 20:12:10.335043 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c170e0c-9765-42e1-b77c-6c881ef202fa" containerName="gather" Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 20:12:10.338361 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-klwpd" Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 20:12:10.375623 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-klwpd"] Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 20:12:10.492214 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8789541-c59a-4f13-8e8a-850d56bb7ec3-utilities\") pod \"redhat-operators-klwpd\" (UID: \"b8789541-c59a-4f13-8e8a-850d56bb7ec3\") " pod="openshift-marketplace/redhat-operators-klwpd" Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 20:12:10.493329 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfrdc\" (UniqueName: \"kubernetes.io/projected/b8789541-c59a-4f13-8e8a-850d56bb7ec3-kube-api-access-gfrdc\") pod \"redhat-operators-klwpd\" (UID: \"b8789541-c59a-4f13-8e8a-850d56bb7ec3\") " pod="openshift-marketplace/redhat-operators-klwpd" Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 20:12:10.493421 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8789541-c59a-4f13-8e8a-850d56bb7ec3-catalog-content\") pod \"redhat-operators-klwpd\" (UID: \"b8789541-c59a-4f13-8e8a-850d56bb7ec3\") " pod="openshift-marketplace/redhat-operators-klwpd" Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 20:12:10.596185 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8789541-c59a-4f13-8e8a-850d56bb7ec3-utilities\") pod \"redhat-operators-klwpd\" (UID: \"b8789541-c59a-4f13-8e8a-850d56bb7ec3\") " pod="openshift-marketplace/redhat-operators-klwpd" Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 20:12:10.596374 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gfrdc\" (UniqueName: \"kubernetes.io/projected/b8789541-c59a-4f13-8e8a-850d56bb7ec3-kube-api-access-gfrdc\") pod \"redhat-operators-klwpd\" (UID: \"b8789541-c59a-4f13-8e8a-850d56bb7ec3\") " pod="openshift-marketplace/redhat-operators-klwpd" Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 20:12:10.596430 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8789541-c59a-4f13-8e8a-850d56bb7ec3-catalog-content\") pod \"redhat-operators-klwpd\" (UID: \"b8789541-c59a-4f13-8e8a-850d56bb7ec3\") " pod="openshift-marketplace/redhat-operators-klwpd" Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 20:12:10.596714 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8789541-c59a-4f13-8e8a-850d56bb7ec3-utilities\") pod \"redhat-operators-klwpd\" (UID: \"b8789541-c59a-4f13-8e8a-850d56bb7ec3\") " pod="openshift-marketplace/redhat-operators-klwpd" Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 20:12:10.596972 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8789541-c59a-4f13-8e8a-850d56bb7ec3-catalog-content\") pod \"redhat-operators-klwpd\" (UID: \"b8789541-c59a-4f13-8e8a-850d56bb7ec3\") " pod="openshift-marketplace/redhat-operators-klwpd" Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 20:12:10.617977 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfrdc\" (UniqueName: \"kubernetes.io/projected/b8789541-c59a-4f13-8e8a-850d56bb7ec3-kube-api-access-gfrdc\") pod \"redhat-operators-klwpd\" (UID: \"b8789541-c59a-4f13-8e8a-850d56bb7ec3\") " pod="openshift-marketplace/redhat-operators-klwpd" Dec 02 20:12:10 crc kubenswrapper[4878]: I1202 20:12:10.662251 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-klwpd" Dec 02 20:12:11 crc kubenswrapper[4878]: I1202 20:12:11.150211 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-klwpd"] Dec 02 20:12:11 crc kubenswrapper[4878]: I1202 20:12:11.480045 4878 generic.go:334] "Generic (PLEG): container finished" podID="b8789541-c59a-4f13-8e8a-850d56bb7ec3" containerID="78bce6b21122742235dc1453def7d3044d46b946b4eb3c90ae7f213d1244507d" exitCode=0 Dec 02 20:12:11 crc kubenswrapper[4878]: I1202 20:12:11.480300 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-klwpd" event={"ID":"b8789541-c59a-4f13-8e8a-850d56bb7ec3","Type":"ContainerDied","Data":"78bce6b21122742235dc1453def7d3044d46b946b4eb3c90ae7f213d1244507d"} Dec 02 20:12:11 crc kubenswrapper[4878]: I1202 20:12:11.480325 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-klwpd" event={"ID":"b8789541-c59a-4f13-8e8a-850d56bb7ec3","Type":"ContainerStarted","Data":"b20c58f9a1d86a0b9ac56b3806808d54d44b5e0a3ea523c5b761f34ab920704e"} Dec 02 20:12:11 crc kubenswrapper[4878]: I1202 20:12:11.482162 4878 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 20:12:12 crc kubenswrapper[4878]: I1202 20:12:12.493997 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-klwpd" event={"ID":"b8789541-c59a-4f13-8e8a-850d56bb7ec3","Type":"ContainerStarted","Data":"efdafdf860978499487e5081c1f2337e81a0142ae2d65af5cea86e3da8ca23c4"} Dec 02 20:12:15 crc kubenswrapper[4878]: I1202 20:12:15.543543 4878 generic.go:334] "Generic (PLEG): container finished" podID="b8789541-c59a-4f13-8e8a-850d56bb7ec3" containerID="efdafdf860978499487e5081c1f2337e81a0142ae2d65af5cea86e3da8ca23c4" exitCode=0 Dec 02 20:12:15 crc kubenswrapper[4878]: I1202 20:12:15.543619 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-klwpd" event={"ID":"b8789541-c59a-4f13-8e8a-850d56bb7ec3","Type":"ContainerDied","Data":"efdafdf860978499487e5081c1f2337e81a0142ae2d65af5cea86e3da8ca23c4"} Dec 02 20:12:16 crc kubenswrapper[4878]: I1202 20:12:16.562347 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-klwpd" event={"ID":"b8789541-c59a-4f13-8e8a-850d56bb7ec3","Type":"ContainerStarted","Data":"f343cd50aea4c9088129497232904e4017c1eea7ede45d58db0ef88b9e979b60"} Dec 02 20:12:16 crc kubenswrapper[4878]: I1202 20:12:16.594059 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-klwpd" podStartSLOduration=2.061178837 podStartE2EDuration="6.594034404s" podCreationTimestamp="2025-12-02 20:12:10 +0000 UTC" firstStartedPulling="2025-12-02 20:12:11.481903201 +0000 UTC m=+7041.171522082" lastFinishedPulling="2025-12-02 20:12:16.014758728 +0000 UTC m=+7045.704377649" observedRunningTime="2025-12-02 20:12:16.585653504 +0000 UTC m=+7046.275272415" watchObservedRunningTime="2025-12-02 20:12:16.594034404 +0000 UTC m=+7046.283653305" Dec 02 20:12:16 crc kubenswrapper[4878]: I1202 20:12:16.710712 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sqq6t"] Dec 02 20:12:16 crc kubenswrapper[4878]: I1202 20:12:16.716115 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqq6t" Dec 02 20:12:16 crc kubenswrapper[4878]: I1202 20:12:16.732328 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqq6t"] Dec 02 20:12:16 crc kubenswrapper[4878]: I1202 20:12:16.768900 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df4fd4e-5133-4e0f-b2eb-eded08897659-utilities\") pod \"redhat-marketplace-sqq6t\" (UID: \"6df4fd4e-5133-4e0f-b2eb-eded08897659\") " pod="openshift-marketplace/redhat-marketplace-sqq6t" Dec 02 20:12:16 crc kubenswrapper[4878]: I1202 20:12:16.769096 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df4fd4e-5133-4e0f-b2eb-eded08897659-catalog-content\") pod \"redhat-marketplace-sqq6t\" (UID: \"6df4fd4e-5133-4e0f-b2eb-eded08897659\") " pod="openshift-marketplace/redhat-marketplace-sqq6t" Dec 02 20:12:16 crc kubenswrapper[4878]: I1202 20:12:16.769338 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72h6l\" (UniqueName: \"kubernetes.io/projected/6df4fd4e-5133-4e0f-b2eb-eded08897659-kube-api-access-72h6l\") pod \"redhat-marketplace-sqq6t\" (UID: \"6df4fd4e-5133-4e0f-b2eb-eded08897659\") " pod="openshift-marketplace/redhat-marketplace-sqq6t" Dec 02 20:12:16 crc kubenswrapper[4878]: I1202 20:12:16.870946 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72h6l\" (UniqueName: \"kubernetes.io/projected/6df4fd4e-5133-4e0f-b2eb-eded08897659-kube-api-access-72h6l\") pod \"redhat-marketplace-sqq6t\" (UID: \"6df4fd4e-5133-4e0f-b2eb-eded08897659\") " pod="openshift-marketplace/redhat-marketplace-sqq6t" Dec 02 20:12:16 crc kubenswrapper[4878]: I1202 20:12:16.871444 4878 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df4fd4e-5133-4e0f-b2eb-eded08897659-utilities\") pod \"redhat-marketplace-sqq6t\" (UID: \"6df4fd4e-5133-4e0f-b2eb-eded08897659\") " pod="openshift-marketplace/redhat-marketplace-sqq6t" Dec 02 20:12:16 crc kubenswrapper[4878]: I1202 20:12:16.871507 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df4fd4e-5133-4e0f-b2eb-eded08897659-catalog-content\") pod \"redhat-marketplace-sqq6t\" (UID: \"6df4fd4e-5133-4e0f-b2eb-eded08897659\") " pod="openshift-marketplace/redhat-marketplace-sqq6t" Dec 02 20:12:16 crc kubenswrapper[4878]: I1202 20:12:16.872004 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df4fd4e-5133-4e0f-b2eb-eded08897659-catalog-content\") pod \"redhat-marketplace-sqq6t\" (UID: \"6df4fd4e-5133-4e0f-b2eb-eded08897659\") " pod="openshift-marketplace/redhat-marketplace-sqq6t" Dec 02 20:12:16 crc kubenswrapper[4878]: I1202 20:12:16.872012 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df4fd4e-5133-4e0f-b2eb-eded08897659-utilities\") pod \"redhat-marketplace-sqq6t\" (UID: \"6df4fd4e-5133-4e0f-b2eb-eded08897659\") " pod="openshift-marketplace/redhat-marketplace-sqq6t" Dec 02 20:12:16 crc kubenswrapper[4878]: I1202 20:12:16.893139 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72h6l\" (UniqueName: \"kubernetes.io/projected/6df4fd4e-5133-4e0f-b2eb-eded08897659-kube-api-access-72h6l\") pod \"redhat-marketplace-sqq6t\" (UID: \"6df4fd4e-5133-4e0f-b2eb-eded08897659\") " pod="openshift-marketplace/redhat-marketplace-sqq6t" Dec 02 20:12:17 crc kubenswrapper[4878]: I1202 20:12:17.054522 4878 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqq6t" Dec 02 20:12:17 crc kubenswrapper[4878]: W1202 20:12:17.552646 4878 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6df4fd4e_5133_4e0f_b2eb_eded08897659.slice/crio-f74028a811830eefa0fd2fb28118ffe77df7cc77717734f46d48edbf470bcdbe WatchSource:0}: Error finding container f74028a811830eefa0fd2fb28118ffe77df7cc77717734f46d48edbf470bcdbe: Status 404 returned error can't find the container with id f74028a811830eefa0fd2fb28118ffe77df7cc77717734f46d48edbf470bcdbe Dec 02 20:12:17 crc kubenswrapper[4878]: I1202 20:12:17.556430 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqq6t"] Dec 02 20:12:17 crc kubenswrapper[4878]: I1202 20:12:17.574909 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqq6t" event={"ID":"6df4fd4e-5133-4e0f-b2eb-eded08897659","Type":"ContainerStarted","Data":"f74028a811830eefa0fd2fb28118ffe77df7cc77717734f46d48edbf470bcdbe"} Dec 02 20:12:18 crc kubenswrapper[4878]: I1202 20:12:18.594604 4878 generic.go:334] "Generic (PLEG): container finished" podID="6df4fd4e-5133-4e0f-b2eb-eded08897659" containerID="cac9b27a594e8ae4822cd3d0068790357e8fddfaf8d870d6c6099b656ad582e3" exitCode=0 Dec 02 20:12:18 crc kubenswrapper[4878]: I1202 20:12:18.595186 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqq6t" event={"ID":"6df4fd4e-5133-4e0f-b2eb-eded08897659","Type":"ContainerDied","Data":"cac9b27a594e8ae4822cd3d0068790357e8fddfaf8d870d6c6099b656ad582e3"} Dec 02 20:12:19 crc kubenswrapper[4878]: I1202 20:12:19.660674 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqq6t" 
event={"ID":"6df4fd4e-5133-4e0f-b2eb-eded08897659","Type":"ContainerStarted","Data":"65081207cba74363e8614898a7da222448e23e8aba33622b3380b504c12b05cb"} Dec 02 20:12:20 crc kubenswrapper[4878]: I1202 20:12:20.663133 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-klwpd" Dec 02 20:12:20 crc kubenswrapper[4878]: I1202 20:12:20.663196 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-klwpd" Dec 02 20:12:20 crc kubenswrapper[4878]: I1202 20:12:20.674597 4878 generic.go:334] "Generic (PLEG): container finished" podID="6df4fd4e-5133-4e0f-b2eb-eded08897659" containerID="65081207cba74363e8614898a7da222448e23e8aba33622b3380b504c12b05cb" exitCode=0 Dec 02 20:12:20 crc kubenswrapper[4878]: I1202 20:12:20.674660 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqq6t" event={"ID":"6df4fd4e-5133-4e0f-b2eb-eded08897659","Type":"ContainerDied","Data":"65081207cba74363e8614898a7da222448e23e8aba33622b3380b504c12b05cb"} Dec 02 20:12:21 crc kubenswrapper[4878]: I1202 20:12:21.686568 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqq6t" event={"ID":"6df4fd4e-5133-4e0f-b2eb-eded08897659","Type":"ContainerStarted","Data":"daa809733d65ccbe16543b72b4ea670cfd9b82397b2610365bac7850abaf78f6"} Dec 02 20:12:21 crc kubenswrapper[4878]: I1202 20:12:21.724144 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sqq6t" podStartSLOduration=3.198371347 podStartE2EDuration="5.724120884s" podCreationTimestamp="2025-12-02 20:12:16 +0000 UTC" firstStartedPulling="2025-12-02 20:12:18.597286645 +0000 UTC m=+7048.286905526" lastFinishedPulling="2025-12-02 20:12:21.123036192 +0000 UTC m=+7050.812655063" observedRunningTime="2025-12-02 20:12:21.71557318 +0000 UTC m=+7051.405192091" 
watchObservedRunningTime="2025-12-02 20:12:21.724120884 +0000 UTC m=+7051.413739775" Dec 02 20:12:21 crc kubenswrapper[4878]: I1202 20:12:21.728589 4878 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-klwpd" podUID="b8789541-c59a-4f13-8e8a-850d56bb7ec3" containerName="registry-server" probeResult="failure" output=< Dec 02 20:12:21 crc kubenswrapper[4878]: timeout: failed to connect service ":50051" within 1s Dec 02 20:12:21 crc kubenswrapper[4878]: > Dec 02 20:12:27 crc kubenswrapper[4878]: I1202 20:12:27.055008 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sqq6t" Dec 02 20:12:27 crc kubenswrapper[4878]: I1202 20:12:27.056651 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sqq6t" Dec 02 20:12:27 crc kubenswrapper[4878]: I1202 20:12:27.108924 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sqq6t" Dec 02 20:12:27 crc kubenswrapper[4878]: I1202 20:12:27.837180 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sqq6t" Dec 02 20:12:28 crc kubenswrapper[4878]: I1202 20:12:28.025392 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqq6t"] Dec 02 20:12:29 crc kubenswrapper[4878]: I1202 20:12:29.792002 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sqq6t" podUID="6df4fd4e-5133-4e0f-b2eb-eded08897659" containerName="registry-server" containerID="cri-o://daa809733d65ccbe16543b72b4ea670cfd9b82397b2610365bac7850abaf78f6" gracePeriod=2 Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.350326 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqq6t" Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.442411 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df4fd4e-5133-4e0f-b2eb-eded08897659-catalog-content\") pod \"6df4fd4e-5133-4e0f-b2eb-eded08897659\" (UID: \"6df4fd4e-5133-4e0f-b2eb-eded08897659\") " Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.442539 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72h6l\" (UniqueName: \"kubernetes.io/projected/6df4fd4e-5133-4e0f-b2eb-eded08897659-kube-api-access-72h6l\") pod \"6df4fd4e-5133-4e0f-b2eb-eded08897659\" (UID: \"6df4fd4e-5133-4e0f-b2eb-eded08897659\") " Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.442690 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df4fd4e-5133-4e0f-b2eb-eded08897659-utilities\") pod \"6df4fd4e-5133-4e0f-b2eb-eded08897659\" (UID: \"6df4fd4e-5133-4e0f-b2eb-eded08897659\") " Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.443694 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6df4fd4e-5133-4e0f-b2eb-eded08897659-utilities" (OuterVolumeSpecName: "utilities") pod "6df4fd4e-5133-4e0f-b2eb-eded08897659" (UID: "6df4fd4e-5133-4e0f-b2eb-eded08897659"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.450351 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6df4fd4e-5133-4e0f-b2eb-eded08897659-kube-api-access-72h6l" (OuterVolumeSpecName: "kube-api-access-72h6l") pod "6df4fd4e-5133-4e0f-b2eb-eded08897659" (UID: "6df4fd4e-5133-4e0f-b2eb-eded08897659"). InnerVolumeSpecName "kube-api-access-72h6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.465506 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6df4fd4e-5133-4e0f-b2eb-eded08897659-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6df4fd4e-5133-4e0f-b2eb-eded08897659" (UID: "6df4fd4e-5133-4e0f-b2eb-eded08897659"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.545981 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df4fd4e-5133-4e0f-b2eb-eded08897659-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.546047 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df4fd4e-5133-4e0f-b2eb-eded08897659-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.546080 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72h6l\" (UniqueName: \"kubernetes.io/projected/6df4fd4e-5133-4e0f-b2eb-eded08897659-kube-api-access-72h6l\") on node \"crc\" DevicePath \"\"" Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.722138 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-klwpd" Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.810101 4878 generic.go:334] "Generic (PLEG): container finished" podID="6df4fd4e-5133-4e0f-b2eb-eded08897659" containerID="daa809733d65ccbe16543b72b4ea670cfd9b82397b2610365bac7850abaf78f6" exitCode=0 Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.810192 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqq6t" Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.810203 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqq6t" event={"ID":"6df4fd4e-5133-4e0f-b2eb-eded08897659","Type":"ContainerDied","Data":"daa809733d65ccbe16543b72b4ea670cfd9b82397b2610365bac7850abaf78f6"} Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.810631 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqq6t" event={"ID":"6df4fd4e-5133-4e0f-b2eb-eded08897659","Type":"ContainerDied","Data":"f74028a811830eefa0fd2fb28118ffe77df7cc77717734f46d48edbf470bcdbe"} Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.810668 4878 scope.go:117] "RemoveContainer" containerID="daa809733d65ccbe16543b72b4ea670cfd9b82397b2610365bac7850abaf78f6" Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.823936 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-klwpd" Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.838272 4878 scope.go:117] "RemoveContainer" containerID="65081207cba74363e8614898a7da222448e23e8aba33622b3380b504c12b05cb" Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.883618 4878 scope.go:117] "RemoveContainer" containerID="cac9b27a594e8ae4822cd3d0068790357e8fddfaf8d870d6c6099b656ad582e3" Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.888424 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqq6t"] Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.900779 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqq6t"] Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.950932 4878 scope.go:117] "RemoveContainer" containerID="daa809733d65ccbe16543b72b4ea670cfd9b82397b2610365bac7850abaf78f6" Dec 02 20:12:30 crc 
kubenswrapper[4878]: E1202 20:12:30.951566 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daa809733d65ccbe16543b72b4ea670cfd9b82397b2610365bac7850abaf78f6\": container with ID starting with daa809733d65ccbe16543b72b4ea670cfd9b82397b2610365bac7850abaf78f6 not found: ID does not exist" containerID="daa809733d65ccbe16543b72b4ea670cfd9b82397b2610365bac7850abaf78f6" Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.951607 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daa809733d65ccbe16543b72b4ea670cfd9b82397b2610365bac7850abaf78f6"} err="failed to get container status \"daa809733d65ccbe16543b72b4ea670cfd9b82397b2610365bac7850abaf78f6\": rpc error: code = NotFound desc = could not find container \"daa809733d65ccbe16543b72b4ea670cfd9b82397b2610365bac7850abaf78f6\": container with ID starting with daa809733d65ccbe16543b72b4ea670cfd9b82397b2610365bac7850abaf78f6 not found: ID does not exist" Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.951628 4878 scope.go:117] "RemoveContainer" containerID="65081207cba74363e8614898a7da222448e23e8aba33622b3380b504c12b05cb" Dec 02 20:12:30 crc kubenswrapper[4878]: E1202 20:12:30.951983 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65081207cba74363e8614898a7da222448e23e8aba33622b3380b504c12b05cb\": container with ID starting with 65081207cba74363e8614898a7da222448e23e8aba33622b3380b504c12b05cb not found: ID does not exist" containerID="65081207cba74363e8614898a7da222448e23e8aba33622b3380b504c12b05cb" Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.952002 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65081207cba74363e8614898a7da222448e23e8aba33622b3380b504c12b05cb"} err="failed to get container status 
\"65081207cba74363e8614898a7da222448e23e8aba33622b3380b504c12b05cb\": rpc error: code = NotFound desc = could not find container \"65081207cba74363e8614898a7da222448e23e8aba33622b3380b504c12b05cb\": container with ID starting with 65081207cba74363e8614898a7da222448e23e8aba33622b3380b504c12b05cb not found: ID does not exist" Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.952013 4878 scope.go:117] "RemoveContainer" containerID="cac9b27a594e8ae4822cd3d0068790357e8fddfaf8d870d6c6099b656ad582e3" Dec 02 20:12:30 crc kubenswrapper[4878]: E1202 20:12:30.952266 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cac9b27a594e8ae4822cd3d0068790357e8fddfaf8d870d6c6099b656ad582e3\": container with ID starting with cac9b27a594e8ae4822cd3d0068790357e8fddfaf8d870d6c6099b656ad582e3 not found: ID does not exist" containerID="cac9b27a594e8ae4822cd3d0068790357e8fddfaf8d870d6c6099b656ad582e3" Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.952284 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cac9b27a594e8ae4822cd3d0068790357e8fddfaf8d870d6c6099b656ad582e3"} err="failed to get container status \"cac9b27a594e8ae4822cd3d0068790357e8fddfaf8d870d6c6099b656ad582e3\": rpc error: code = NotFound desc = could not find container \"cac9b27a594e8ae4822cd3d0068790357e8fddfaf8d870d6c6099b656ad582e3\": container with ID starting with cac9b27a594e8ae4822cd3d0068790357e8fddfaf8d870d6c6099b656ad582e3 not found: ID does not exist" Dec 02 20:12:30 crc kubenswrapper[4878]: I1202 20:12:30.953832 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6df4fd4e-5133-4e0f-b2eb-eded08897659" path="/var/lib/kubelet/pods/6df4fd4e-5133-4e0f-b2eb-eded08897659/volumes" Dec 02 20:12:32 crc kubenswrapper[4878]: I1202 20:12:32.609202 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-klwpd"] Dec 02 20:12:32 
crc kubenswrapper[4878]: I1202 20:12:32.609813 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-klwpd" podUID="b8789541-c59a-4f13-8e8a-850d56bb7ec3" containerName="registry-server" containerID="cri-o://f343cd50aea4c9088129497232904e4017c1eea7ede45d58db0ef88b9e979b60" gracePeriod=2 Dec 02 20:12:32 crc kubenswrapper[4878]: I1202 20:12:32.850005 4878 generic.go:334] "Generic (PLEG): container finished" podID="b8789541-c59a-4f13-8e8a-850d56bb7ec3" containerID="f343cd50aea4c9088129497232904e4017c1eea7ede45d58db0ef88b9e979b60" exitCode=0 Dec 02 20:12:32 crc kubenswrapper[4878]: I1202 20:12:32.850311 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-klwpd" event={"ID":"b8789541-c59a-4f13-8e8a-850d56bb7ec3","Type":"ContainerDied","Data":"f343cd50aea4c9088129497232904e4017c1eea7ede45d58db0ef88b9e979b60"} Dec 02 20:12:33 crc kubenswrapper[4878]: I1202 20:12:33.149260 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-klwpd" Dec 02 20:12:33 crc kubenswrapper[4878]: I1202 20:12:33.222432 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8789541-c59a-4f13-8e8a-850d56bb7ec3-catalog-content\") pod \"b8789541-c59a-4f13-8e8a-850d56bb7ec3\" (UID: \"b8789541-c59a-4f13-8e8a-850d56bb7ec3\") " Dec 02 20:12:33 crc kubenswrapper[4878]: I1202 20:12:33.222639 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfrdc\" (UniqueName: \"kubernetes.io/projected/b8789541-c59a-4f13-8e8a-850d56bb7ec3-kube-api-access-gfrdc\") pod \"b8789541-c59a-4f13-8e8a-850d56bb7ec3\" (UID: \"b8789541-c59a-4f13-8e8a-850d56bb7ec3\") " Dec 02 20:12:33 crc kubenswrapper[4878]: I1202 20:12:33.222891 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8789541-c59a-4f13-8e8a-850d56bb7ec3-utilities\") pod \"b8789541-c59a-4f13-8e8a-850d56bb7ec3\" (UID: \"b8789541-c59a-4f13-8e8a-850d56bb7ec3\") " Dec 02 20:12:33 crc kubenswrapper[4878]: I1202 20:12:33.223878 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8789541-c59a-4f13-8e8a-850d56bb7ec3-utilities" (OuterVolumeSpecName: "utilities") pod "b8789541-c59a-4f13-8e8a-850d56bb7ec3" (UID: "b8789541-c59a-4f13-8e8a-850d56bb7ec3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:12:33 crc kubenswrapper[4878]: I1202 20:12:33.228973 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8789541-c59a-4f13-8e8a-850d56bb7ec3-kube-api-access-gfrdc" (OuterVolumeSpecName: "kube-api-access-gfrdc") pod "b8789541-c59a-4f13-8e8a-850d56bb7ec3" (UID: "b8789541-c59a-4f13-8e8a-850d56bb7ec3"). InnerVolumeSpecName "kube-api-access-gfrdc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:12:33 crc kubenswrapper[4878]: I1202 20:12:33.324193 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8789541-c59a-4f13-8e8a-850d56bb7ec3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8789541-c59a-4f13-8e8a-850d56bb7ec3" (UID: "b8789541-c59a-4f13-8e8a-850d56bb7ec3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:12:33 crc kubenswrapper[4878]: I1202 20:12:33.326023 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8789541-c59a-4f13-8e8a-850d56bb7ec3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:12:33 crc kubenswrapper[4878]: I1202 20:12:33.326045 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfrdc\" (UniqueName: \"kubernetes.io/projected/b8789541-c59a-4f13-8e8a-850d56bb7ec3-kube-api-access-gfrdc\") on node \"crc\" DevicePath \"\"" Dec 02 20:12:33 crc kubenswrapper[4878]: I1202 20:12:33.326058 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8789541-c59a-4f13-8e8a-850d56bb7ec3-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:12:33 crc kubenswrapper[4878]: I1202 20:12:33.866061 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-klwpd" event={"ID":"b8789541-c59a-4f13-8e8a-850d56bb7ec3","Type":"ContainerDied","Data":"b20c58f9a1d86a0b9ac56b3806808d54d44b5e0a3ea523c5b761f34ab920704e"} Dec 02 20:12:33 crc kubenswrapper[4878]: I1202 20:12:33.866133 4878 scope.go:117] "RemoveContainer" containerID="f343cd50aea4c9088129497232904e4017c1eea7ede45d58db0ef88b9e979b60" Dec 02 20:12:33 crc kubenswrapper[4878]: I1202 20:12:33.866159 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-klwpd" Dec 02 20:12:33 crc kubenswrapper[4878]: I1202 20:12:33.906827 4878 scope.go:117] "RemoveContainer" containerID="efdafdf860978499487e5081c1f2337e81a0142ae2d65af5cea86e3da8ca23c4" Dec 02 20:12:33 crc kubenswrapper[4878]: I1202 20:12:33.913698 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-klwpd"] Dec 02 20:12:33 crc kubenswrapper[4878]: I1202 20:12:33.923564 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-klwpd"] Dec 02 20:12:33 crc kubenswrapper[4878]: I1202 20:12:33.931621 4878 scope.go:117] "RemoveContainer" containerID="78bce6b21122742235dc1453def7d3044d46b946b4eb3c90ae7f213d1244507d" Dec 02 20:12:35 crc kubenswrapper[4878]: I1202 20:12:34.955270 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8789541-c59a-4f13-8e8a-850d56bb7ec3" path="/var/lib/kubelet/pods/b8789541-c59a-4f13-8e8a-850d56bb7ec3/volumes" Dec 02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.027224 4878 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p55xn"] Dec 02 20:12:57 crc kubenswrapper[4878]: E1202 20:12:57.028045 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8789541-c59a-4f13-8e8a-850d56bb7ec3" containerName="registry-server" Dec 02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.028056 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8789541-c59a-4f13-8e8a-850d56bb7ec3" containerName="registry-server" Dec 02 20:12:57 crc kubenswrapper[4878]: E1202 20:12:57.028083 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8789541-c59a-4f13-8e8a-850d56bb7ec3" containerName="extract-content" Dec 02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.028088 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8789541-c59a-4f13-8e8a-850d56bb7ec3" containerName="extract-content" Dec 02 
20:12:57 crc kubenswrapper[4878]: E1202 20:12:57.028103 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6df4fd4e-5133-4e0f-b2eb-eded08897659" containerName="extract-content" Dec 02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.028109 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df4fd4e-5133-4e0f-b2eb-eded08897659" containerName="extract-content" Dec 02 20:12:57 crc kubenswrapper[4878]: E1202 20:12:57.028128 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8789541-c59a-4f13-8e8a-850d56bb7ec3" containerName="extract-utilities" Dec 02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.028134 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8789541-c59a-4f13-8e8a-850d56bb7ec3" containerName="extract-utilities" Dec 02 20:12:57 crc kubenswrapper[4878]: E1202 20:12:57.028149 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6df4fd4e-5133-4e0f-b2eb-eded08897659" containerName="extract-utilities" Dec 02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.028154 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df4fd4e-5133-4e0f-b2eb-eded08897659" containerName="extract-utilities" Dec 02 20:12:57 crc kubenswrapper[4878]: E1202 20:12:57.028169 4878 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6df4fd4e-5133-4e0f-b2eb-eded08897659" containerName="registry-server" Dec 02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.028174 4878 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df4fd4e-5133-4e0f-b2eb-eded08897659" containerName="registry-server" Dec 02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.028423 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8789541-c59a-4f13-8e8a-850d56bb7ec3" containerName="registry-server" Dec 02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.028443 4878 memory_manager.go:354] "RemoveStaleState removing state" podUID="6df4fd4e-5133-4e0f-b2eb-eded08897659" containerName="registry-server" Dec 
02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.030532 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p55xn" Dec 02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.040408 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p55xn"] Dec 02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.054823 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7-catalog-content\") pod \"certified-operators-p55xn\" (UID: \"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7\") " pod="openshift-marketplace/certified-operators-p55xn" Dec 02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.054947 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7-utilities\") pod \"certified-operators-p55xn\" (UID: \"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7\") " pod="openshift-marketplace/certified-operators-p55xn" Dec 02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.055496 4878 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h282\" (UniqueName: \"kubernetes.io/projected/f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7-kube-api-access-7h282\") pod \"certified-operators-p55xn\" (UID: \"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7\") " pod="openshift-marketplace/certified-operators-p55xn" Dec 02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.176261 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7-catalog-content\") pod \"certified-operators-p55xn\" (UID: \"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7\") " 
pod="openshift-marketplace/certified-operators-p55xn" Dec 02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.176683 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7-catalog-content\") pod \"certified-operators-p55xn\" (UID: \"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7\") " pod="openshift-marketplace/certified-operators-p55xn" Dec 02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.177157 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7-utilities\") pod \"certified-operators-p55xn\" (UID: \"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7\") " pod="openshift-marketplace/certified-operators-p55xn" Dec 02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.176833 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7-utilities\") pod \"certified-operators-p55xn\" (UID: \"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7\") " pod="openshift-marketplace/certified-operators-p55xn" Dec 02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.177470 4878 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h282\" (UniqueName: \"kubernetes.io/projected/f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7-kube-api-access-7h282\") pod \"certified-operators-p55xn\" (UID: \"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7\") " pod="openshift-marketplace/certified-operators-p55xn" Dec 02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.204132 4878 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h282\" (UniqueName: \"kubernetes.io/projected/f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7-kube-api-access-7h282\") pod \"certified-operators-p55xn\" (UID: \"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7\") " 
pod="openshift-marketplace/certified-operators-p55xn" Dec 02 20:12:57 crc kubenswrapper[4878]: I1202 20:12:57.367491 4878 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p55xn" Dec 02 20:12:58 crc kubenswrapper[4878]: I1202 20:12:58.523632 4878 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p55xn"] Dec 02 20:12:59 crc kubenswrapper[4878]: I1202 20:12:59.251900 4878 generic.go:334] "Generic (PLEG): container finished" podID="f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7" containerID="0e215742a13496f40fdde41026b51c571d54ded8199b7a74731739be3c2ad361" exitCode=0 Dec 02 20:12:59 crc kubenswrapper[4878]: I1202 20:12:59.252393 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p55xn" event={"ID":"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7","Type":"ContainerDied","Data":"0e215742a13496f40fdde41026b51c571d54ded8199b7a74731739be3c2ad361"} Dec 02 20:12:59 crc kubenswrapper[4878]: I1202 20:12:59.253066 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p55xn" event={"ID":"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7","Type":"ContainerStarted","Data":"8ebfec2ee12f63bc71d44c4412100a4ceff5df69882c3741725e1db28b300836"} Dec 02 20:13:00 crc kubenswrapper[4878]: I1202 20:13:00.268824 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p55xn" event={"ID":"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7","Type":"ContainerStarted","Data":"148e8a977f35fc6e618701153bb57b0e9f77c775c01f7347259ee44713c69fc1"} Dec 02 20:13:01 crc kubenswrapper[4878]: I1202 20:13:01.285028 4878 generic.go:334] "Generic (PLEG): container finished" podID="f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7" containerID="148e8a977f35fc6e618701153bb57b0e9f77c775c01f7347259ee44713c69fc1" exitCode=0 Dec 02 20:13:01 crc kubenswrapper[4878]: I1202 20:13:01.285130 4878 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-p55xn" event={"ID":"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7","Type":"ContainerDied","Data":"148e8a977f35fc6e618701153bb57b0e9f77c775c01f7347259ee44713c69fc1"} Dec 02 20:13:02 crc kubenswrapper[4878]: I1202 20:13:02.298523 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p55xn" event={"ID":"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7","Type":"ContainerStarted","Data":"1571054d5c65cf97b0d942e0899e915dd3588432c17dce7ed100fd336c49f6ca"} Dec 02 20:13:02 crc kubenswrapper[4878]: I1202 20:13:02.358066 4878 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p55xn" podStartSLOduration=3.844836946 podStartE2EDuration="6.358033263s" podCreationTimestamp="2025-12-02 20:12:56 +0000 UTC" firstStartedPulling="2025-12-02 20:12:59.254031602 +0000 UTC m=+7088.943650513" lastFinishedPulling="2025-12-02 20:13:01.767227939 +0000 UTC m=+7091.456846830" observedRunningTime="2025-12-02 20:13:02.32668601 +0000 UTC m=+7092.016304901" watchObservedRunningTime="2025-12-02 20:13:02.358033263 +0000 UTC m=+7092.047652184" Dec 02 20:13:07 crc kubenswrapper[4878]: I1202 20:13:07.368110 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p55xn" Dec 02 20:13:07 crc kubenswrapper[4878]: I1202 20:13:07.368643 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p55xn" Dec 02 20:13:07 crc kubenswrapper[4878]: I1202 20:13:07.426039 4878 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p55xn" Dec 02 20:13:07 crc kubenswrapper[4878]: I1202 20:13:07.492316 4878 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p55xn" Dec 02 20:13:07 crc kubenswrapper[4878]: I1202 
20:13:07.670834 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p55xn"] Dec 02 20:13:09 crc kubenswrapper[4878]: I1202 20:13:09.409291 4878 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p55xn" podUID="f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7" containerName="registry-server" containerID="cri-o://1571054d5c65cf97b0d942e0899e915dd3588432c17dce7ed100fd336c49f6ca" gracePeriod=2 Dec 02 20:13:09 crc kubenswrapper[4878]: I1202 20:13:09.972788 4878 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p55xn" Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.151906 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7-catalog-content\") pod \"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7\" (UID: \"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7\") " Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.152170 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h282\" (UniqueName: \"kubernetes.io/projected/f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7-kube-api-access-7h282\") pod \"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7\" (UID: \"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7\") " Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.152289 4878 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7-utilities\") pod \"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7\" (UID: \"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7\") " Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.153638 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7-utilities" (OuterVolumeSpecName: 
"utilities") pod "f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7" (UID: "f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.160551 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7-kube-api-access-7h282" (OuterVolumeSpecName: "kube-api-access-7h282") pod "f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7" (UID: "f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7"). InnerVolumeSpecName "kube-api-access-7h282". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.210388 4878 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7" (UID: "f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.255440 4878 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.255489 4878 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.255531 4878 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h282\" (UniqueName: \"kubernetes.io/projected/f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7-kube-api-access-7h282\") on node \"crc\" DevicePath \"\"" Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.425584 4878 generic.go:334] "Generic (PLEG): container finished" podID="f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7" containerID="1571054d5c65cf97b0d942e0899e915dd3588432c17dce7ed100fd336c49f6ca" exitCode=0 Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.425685 4878 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p55xn" Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.425681 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p55xn" event={"ID":"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7","Type":"ContainerDied","Data":"1571054d5c65cf97b0d942e0899e915dd3588432c17dce7ed100fd336c49f6ca"} Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.427506 4878 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p55xn" event={"ID":"f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7","Type":"ContainerDied","Data":"8ebfec2ee12f63bc71d44c4412100a4ceff5df69882c3741725e1db28b300836"} Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.427544 4878 scope.go:117] "RemoveContainer" containerID="1571054d5c65cf97b0d942e0899e915dd3588432c17dce7ed100fd336c49f6ca" Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.460213 4878 scope.go:117] "RemoveContainer" containerID="148e8a977f35fc6e618701153bb57b0e9f77c775c01f7347259ee44713c69fc1" Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.490844 4878 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p55xn"] Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.505003 4878 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p55xn"] Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.506484 4878 scope.go:117] "RemoveContainer" containerID="0e215742a13496f40fdde41026b51c571d54ded8199b7a74731739be3c2ad361" Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.584944 4878 scope.go:117] "RemoveContainer" containerID="1571054d5c65cf97b0d942e0899e915dd3588432c17dce7ed100fd336c49f6ca" Dec 02 20:13:10 crc kubenswrapper[4878]: E1202 20:13:10.585737 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1571054d5c65cf97b0d942e0899e915dd3588432c17dce7ed100fd336c49f6ca\": container with ID starting with 1571054d5c65cf97b0d942e0899e915dd3588432c17dce7ed100fd336c49f6ca not found: ID does not exist" containerID="1571054d5c65cf97b0d942e0899e915dd3588432c17dce7ed100fd336c49f6ca" Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.585790 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1571054d5c65cf97b0d942e0899e915dd3588432c17dce7ed100fd336c49f6ca"} err="failed to get container status \"1571054d5c65cf97b0d942e0899e915dd3588432c17dce7ed100fd336c49f6ca\": rpc error: code = NotFound desc = could not find container \"1571054d5c65cf97b0d942e0899e915dd3588432c17dce7ed100fd336c49f6ca\": container with ID starting with 1571054d5c65cf97b0d942e0899e915dd3588432c17dce7ed100fd336c49f6ca not found: ID does not exist" Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.585817 4878 scope.go:117] "RemoveContainer" containerID="148e8a977f35fc6e618701153bb57b0e9f77c775c01f7347259ee44713c69fc1" Dec 02 20:13:10 crc kubenswrapper[4878]: E1202 20:13:10.586111 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"148e8a977f35fc6e618701153bb57b0e9f77c775c01f7347259ee44713c69fc1\": container with ID starting with 148e8a977f35fc6e618701153bb57b0e9f77c775c01f7347259ee44713c69fc1 not found: ID does not exist" containerID="148e8a977f35fc6e618701153bb57b0e9f77c775c01f7347259ee44713c69fc1" Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.586140 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"148e8a977f35fc6e618701153bb57b0e9f77c775c01f7347259ee44713c69fc1"} err="failed to get container status \"148e8a977f35fc6e618701153bb57b0e9f77c775c01f7347259ee44713c69fc1\": rpc error: code = NotFound desc = could not find container \"148e8a977f35fc6e618701153bb57b0e9f77c775c01f7347259ee44713c69fc1\": container with ID 
starting with 148e8a977f35fc6e618701153bb57b0e9f77c775c01f7347259ee44713c69fc1 not found: ID does not exist" Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.586157 4878 scope.go:117] "RemoveContainer" containerID="0e215742a13496f40fdde41026b51c571d54ded8199b7a74731739be3c2ad361" Dec 02 20:13:10 crc kubenswrapper[4878]: E1202 20:13:10.586570 4878 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e215742a13496f40fdde41026b51c571d54ded8199b7a74731739be3c2ad361\": container with ID starting with 0e215742a13496f40fdde41026b51c571d54ded8199b7a74731739be3c2ad361 not found: ID does not exist" containerID="0e215742a13496f40fdde41026b51c571d54ded8199b7a74731739be3c2ad361" Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.586598 4878 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e215742a13496f40fdde41026b51c571d54ded8199b7a74731739be3c2ad361"} err="failed to get container status \"0e215742a13496f40fdde41026b51c571d54ded8199b7a74731739be3c2ad361\": rpc error: code = NotFound desc = could not find container \"0e215742a13496f40fdde41026b51c571d54ded8199b7a74731739be3c2ad361\": container with ID starting with 0e215742a13496f40fdde41026b51c571d54ded8199b7a74731739be3c2ad361 not found: ID does not exist" Dec 02 20:13:10 crc kubenswrapper[4878]: I1202 20:13:10.955394 4878 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7" path="/var/lib/kubelet/pods/f37f51c6-fc71-4f1c-ad2c-2cde713ccaf7/volumes"